WO2025003910A1 - Humeral marker for mixed reality surgical navigation - Google Patents
- Publication number
- WO2025003910A1 (PCT/IB2024/056216)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- virtual
- tracking
- humerus
- points
- processing system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods
- A61B17/14—Surgical saws
- A61B17/15—Guides therefor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/25—User interfaces for surgical systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/90—Identification means for patients or instruments, e.g. tags
- A61B90/94—Identification means for patients or instruments, e.g. tags coded with symbols, e.g. text
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/90—Identification means for patients or instruments, e.g. tags
- A61B90/94—Identification means for patients or instruments, e.g. tags coded with symbols, e.g. text
- A61B90/96—Identification means for patients or instruments, e.g. tags coded with symbols, e.g. text using barcodes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods
- A61B2017/00017—Electrical control of surgical instruments
- A61B2017/00203—Electrical control of surgical instruments with speech control or speech recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods
- A61B2017/00017—Electrical control of surgical instruments
- A61B2017/00207—Electrical control of surgical instruments with hand gesture control or hand gesture recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods
- A61B2017/00017—Electrical control of surgical instruments
- A61B2017/00216—Electrical control of surgical instruments with eye tracking or head position tracking control
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods
- A61B2017/00681—Aspects not otherwise provided for
- A61B2017/00725—Calibration or performance testing
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2068—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/365—Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/372—Details of monitor hardware
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3983—Reference marker arrangements for use with image guided surgery
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/50—Supports for surgical instruments, e.g. articulated arms
- A61B2090/502—Headgear, e.g. helmet, spectacles
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
Definitions
- Shoulder arthroplasty is a form of orthopedic surgery in which one or more prostheses are implanted on a patient’s scapula and humerus.
- the humerus prosthesis has a ball-shaped surface that mates with a socket-shaped surface of the scapular implant.
- the scapular prosthesis has a ball-shaped surface that mates with a socket-shaped surface of a humeral implant.
- a surgeon may resect the patient’s humeral head. Resecting the humeral head along an appropriate plane may be a significant factor in the success of the surgery.
- a tracking marker is affixed at a bicipital groove of a patient’s humerus.
- the tracking marker includes elements that enable a tracking system to determine a position and orientation of the tracking marker. Because the tracking marker is affixed to the patient’s humerus, by determining the position and orientation of the tracking marker, a processing system may determine the position and orientation of the patient’s humerus in a physical coordinate system. The processing system may perform a humeral registration process to generate registration data that defines a relationship between a virtual coordinate system and the physical coordinate system. Aspects of a surgical plan, such as a cut plane, may be defined in the virtual coordinate system.
- the processing system may cause an MR device to use the registration data to present virtual guidance.
- the processing system may cause the MR device to display planned plane elements and current plane elements.
- the planned plane elements indicate locations on a planned cutting plane through a bone.
- the planned plane elements are not contiguous with each other.
- the current plane elements indicate locations on a current cutting plane of a sawblade of a saw.
- the current plane elements are not contiguous with each other.
- the processing system may cause the MR device to concurrently display the planned plane elements at their determined positions and the current plane elements at their determined positions.
- At least one of the current plane element or planned plane element of the pair has a visual property (e.g., color, texture, etc.) based on the position of the current plane element of the pair relative to the planned cutting plane. Displaying the current plane elements and the planned plane elements in this way may guide a user (e.g., a clinician) to cut the bone at an appropriate position and angle, without unduly obscuring the user's vision of the bone.
- the virtual guidance can also include a virtual guidance element.
- the processing system can cause the MR device to display a virtual guidance element.
- the virtual guidance element can include a divided ring element including an enclosed area circle bisected by a line.
- the virtual guidance element can also include an active element with an inner edge and an outer edge. The distance between a center of the line of the divided ring element and the inner edge of the active element is indicative of a distance between a resection level of a current cutting plane of the sawblade into the bone and the resection level of the planned cutting plane through the bone.
- the angle of the inner edge of the active element relative to the line is indicative of a superior/inferior angle between the current cutting plane of the sawblade and the planned cutting plane.
- a length of the line perpendicular to the inner edge of the active element from the center of the inner edge of the active element to the outer edge of the active element is indicative of an anterior/posterior angle between the current cutting plane of the sawblade and the planned cutting plane. Displaying the virtual guidance element may guide the clinician to cut the bone at an appropriate position and angle.
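The disclosure does not give formulas for these three indicators, but as a rough illustration, the following sketch (Python; the function name and the choice of anatomical axes are hypothetical, not from the patent) shows one plausible way a processing system could derive them from the current and planned plane poses. A display layer could then map depth_gap to the gap between the line center and the active element's inner edge, and the two angles to the inner-edge tilt and the perpendicular line length.

```python
import numpy as np

def guidance_indicators(n_cur, p_cur, n_plan, p_plan, si_axis, ap_axis):
    """One plausible derivation (illustrative only) of the three scalars the
    divided-ring guidance element visualizes.

    n_cur, p_cur: unit normal and a reference point of the sawblade's current plane.
    n_plan, p_plan: unit normal and a reference point of the planned cutting plane.
    si_axis, ap_axis: unit vectors along the superior/inferior and
        anterior/posterior anatomical directions (an assumed convention).
    """
    # Resection-level offset: signed distance of the current plane's reference
    # point from the planned plane, measured along the planned normal (mm).
    depth_gap = float(np.dot(p_cur - p_plan, n_plan))
    # Decompose the tilt between the two plane normals into a rotation about
    # the A/P axis (seen as superior/inferior angulation) and a rotation about
    # the S/I axis (seen as anterior/posterior angulation).
    axis = np.cross(n_plan, n_cur)  # rotation axis scaled by sin(tilt)
    si_angle = np.degrees(np.arcsin(np.clip(np.dot(axis, ap_axis), -1.0, 1.0)))
    ap_angle = np.degrees(np.arcsin(np.clip(np.dot(axis, si_axis), -1.0, 1.0)))
    return depth_gap, si_angle, ap_angle
```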
- the processing system may determine, based on signals from sensors of a tracking system, first points corresponding to a first tracking marker attached to a body of a saw.
- the processing system may determine, based on the signals, second points corresponding to a second tracking marker of a tracking structure while a sawblade of the saw is positioned in a recess defined by a support body of the tracking structure.
- the processing system may generate, based on the first points and the second points, position identification data that specify a position of a lower edge of the sawblade relative to the first tracking marker.
- this disclosure describes a tracking structure comprising: an attachment body shaped for attachment at a bicipital groove of a humerus of a patient, the attachment body defining a slot having dimensions sufficient for palpation of the bicipital groove using a handheld digitizer; and a tracking marker connected to the attachment body, the tracking marker having a polyhedral shape that includes a plurality of faces that have different optical patterns.
- this disclosure describes a method comprising: receiving, by a processing system, first signals from one or more sensors of a tracking system; determining, by the processing system, based on the first signals, first points corresponding to a first tracking marker of a tracking structure, the tracking structure comprising the first tracking marker and an attachment body positioned at a bicipital groove of a humerus of a patient; receiving, by the processing system, second signals from the one or more sensors of the tracking system; determining, by the processing system, based on the second signals, second points corresponding to a second tracking marker of a digitizer while a tip of the digitizer palpates the bicipital groove via a slot defined in the attachment body of the tracking structure; determining, by the processing system, based on the second points and a predetermined spatial relationship between the second tracking marker and the tip of the digitizer, third points corresponding to the bicipital groove, wherein positions of the first, second and third points are defined in a physical coordinate system; and generating, by the processing system, based on the third points and a virtual model of the humerus having points defined in a virtual coordinate system, registration data defining a relationship between the physical coordinate system and the virtual coordinate system.
- this disclosure describes a method comprising: receiving, by a processing system, first signals from one or more sensors of a tracking system; determining, by the processing system, based on the first signals, first points corresponding to a first tracking marker attached to a body of a saw; and determining, by the processing system, based on the first signals, second points corresponding to a second tracking marker of a tracking structure while a sawblade of the saw is positioned in a recess defined by a support body of the tracking structure; and generating, by the processing system, based on the first points and the second points, position identification data that specify a position of a lower edge of the sawblade relative to the first tracking marker.
- this disclosure describes a system comprising: a saw comprising a sawblade; a first tracking marker attached to the saw; and a tracking structure that comprises a support body and a second tracking marker, wherein the support body defines a recess to accommodate the sawblade; a processing system comprising one or more processors that are implemented in circuitry and configured to: receive first signals from one or more sensors of a tracking system; determine, based on the first signals, first points corresponding to the first tracking marker attached to a body of the saw; and determine, based on the first signals, second points corresponding to the second tracking marker while the sawblade of the saw is positioned in the recess defined by the support body of the tracking structure; and generate, based on the first points and the second points, position identification data that specify a position of a lower edge of the sawblade relative to the first tracking marker.
- this disclosure describes a method comprising: receiving, by a processing system, first signals from one or more sensors of a tracking system; determining, by the processing system, based on the first signals, positions for a plurality of planned plane elements; determining, by the processing system, based on the first signals, positions for a plurality of current plane elements, wherein: each of the planned plane elements indicates a location on a planned cutting plane through a bone of a patient, the planned plane elements are not contiguous with each other; each of the current plane elements corresponds to one of the planned plane elements and indicates a location on a current cutting plane of a sawblade of a saw, and the current plane elements are not contiguous with each other; causing, by the processing system, a mixed reality (MR) device to concurrently display the planned plane elements at the determined positions for the planned plane elements and the current plane elements at the determined positions for the current plane elements, wherein, for each pair of corresponding current plane elements and planned plane elements, at least one of the current plane element or the planned plane element of the pair has a visual property based on a position of the current plane element of the pair relative to the planned cutting plane.
- this disclosure describes a system comprising: a mixed reality (MR) device; and a processing system that includes one or more processors implemented in circuitry, the processing system configured to: receive first signals from one or more sensors of a tracking system; determine, based on the first signals, positions for a plurality of planned plane elements; determine, based on the first signals, positions for a plurality of current plane elements, wherein: each of the planned plane elements indicates a location on a planned cutting plane through a bone of a patient, the planned plane elements are not contiguous with each other; each of the current plane elements corresponds to one of the planned plane elements and indicates a location on a current cutting plane of a sawblade of a saw, and the current plane elements are not contiguous with each other; and cause the mixed reality (MR) device to concurrently display the planned plane elements at the determined positions for the planned plane elements and the current plane elements at the determined positions for the current plane elements, wherein, for each pair of corresponding current plane elements and planned plane elements, at least one of the current plane element or the planned plane element of the pair has a visual property based on a position of the current plane element of the pair relative to the planned cutting plane.
- this disclosure describes a computer-implemented method comprising: determining, by a processing system that includes one or more processors implemented in circuitry, a current cutting plane of a sawblade of a saw; and outputting, by the processing system, for display by a mixed reality (MR) device, a virtual guidance element that includes: a divided ring element that includes an enclosed area circle bisected by a line; and an active element having an inner edge and an outer edge, wherein: a distance between a center of the line and a center of the inner edge of the active element is indicative of a distance between a resection level of the current cutting plane of the sawblade into a bone and a resection level of a planned cutting plane through the bone, an angle of the inner edge of the active element relative to the line is indicative of a superior/inferior angle of the current cutting plane of the sawblade and the planned cutting plane, and a length of a line perpendicular to the inner edge of the active element from the center of the inner edge of the active element to the outer edge of the active element is indicative of an anterior/posterior angle of the current cutting plane of the sawblade and the planned cutting plane.
- this disclosure describes a system comprising one or more processors implemented in circuitry, wherein one or more processors are configured to: determine a current cutting plane of a sawblade of a saw; and output for display by a mixed reality (MR) device, a virtual guidance element that includes: a divided ring element that includes an enclosed area circle bisected by a line; and an active element having an inner edge and an outer edge, wherein: a distance between a center of the line and a center of the inner edge of the active element is indicative of a distance between a resection level of the current cutting plane of the sawblade into a bone and a resection level of a planned cutting plane through the bone, an angle of the inner edge of the active element relative to the line is indicative of a superior/inferior angle of the current cutting plane of the sawblade and the planned cutting plane, and a length of a line perpendicular to the inner edge of the active element from the center of the inner edge of the active element to the outer edge of the active element is indicative of an anterior/posterior angle of the current cutting plane of the sawblade and the planned cutting plane.
- FIG. 1 is a conceptual diagram illustrating an example system in which one or more techniques of this disclosure may be performed.
- FIG. 2 is a flowchart illustrating an example process for mixed reality (MR)-based navigation for surgical tasks associated with a humerus of a patient according to techniques of this disclosure.
- FIG. 3 is a block diagram illustrating an example computing system in accordance with one or more techniques of this disclosure.
- FIG. 4 is a schematic representation of a mixed reality (MR) device in accordance with one or more techniques of this disclosure.
- FIG. 5A and FIG. 5B are schematic representations of an example attachment body of a humeral tracking structure according to techniques of this disclosure.
- FIG. 6A and FIG. 6B are schematic representations of an example attachment body of a humeral tracking structure according to techniques of this disclosure.
- FIG. 7A and FIG. 7B are schematic representations of an example attachment body of a humeral tracking structure according to techniques of this disclosure.
- FIG. 7C and FIG. 7D are schematic representations of a first alternative version of the attachment body, according to techniques of this disclosure.
- FIG. 7E and FIG. 7F are schematic representations of a second alternative version of the attachment body, according to techniques of this disclosure.
- FIG. 7G and FIG. 7H are schematic representations of a third alternative version of the attachment body, according to techniques of this disclosure.
- FIG. 7I and FIG. 7J are schematic representations of a fourth alternative version of the attachment body, according to techniques of this disclosure.
- FIG. 7K, FIG. 7L, and FIG. 7M are schematic representations of a fifth alternative version of the attachment body, according to techniques of this disclosure.
- FIG. 8 is a conceptual diagram illustrating an example view of registering a humeral head according to techniques of this disclosure.
- FIG. 9 is a conceptual diagram illustrating an example view of registering a humeral metaphysis according to techniques of this disclosure.
- FIG. 10 is a conceptual diagram illustrating an example view of registering a bicipital groove according to techniques of this disclosure.
- FIG. 11 is a conceptual diagram illustrating validation of registration of a humeral head according to techniques of this disclosure.
- FIG. 12 is a conceptual diagram illustrating validation of registration of a humeral head according to techniques of this disclosure.
- FIG. 13 is a conceptual diagram illustrating validation of registration of a humeral metaphysis according to techniques of this disclosure.
- FIG. 14 is a flowchart illustrating an example registration operation according to techniques of this disclosure.
- FIG. 15 is a conceptual diagram illustrating an example of identifying a position of a saw blade according to techniques of this disclosure.
- FIG. 16 is a conceptual diagram illustrating an example of identifying a position of a saw blade according to techniques of this disclosure.
- FIG. 17A is a conceptual diagram illustrating a profile view of an example tracking structure according to techniques of this disclosure.
- FIG. 17B is a conceptual diagram illustrating a profile view of an example tracking structure according to techniques of this disclosure.
- FIG. 18 is a conceptual diagram illustrating an example of confirming an orientation of a sawblade according to techniques of this disclosure.
- FIG. 19A is a conceptual diagram illustrating an example of identifying a position of a sawblade according to techniques of this disclosure.
- FIG. 19B is a conceptual diagram illustrating an example tracking structure for identifying the position of a sawblade according to techniques of this disclosure.
- FIG. 19C is a conceptual diagram illustrating a bottom view of the tracking structure of FIG. 19B.
- FIG. 19D is a conceptual diagram illustrating an alternative tracking structure for identifying the position of a sawblade according to techniques of the disclosure.
- FIG. 19E is a conceptual diagram illustrating a second alternative tracking structure for identifying the position of a sawblade according to techniques of the disclosure.
- FIG. 20 is a flowchart illustrating an example operation for identifying a position of a sawblade for MR-based guidance according to techniques of this disclosure.
- FIG. 21 is a conceptual diagram illustrating example virtual elements for guiding resection of a humeral head according to techniques of this disclosure.
- FIG. 22 is a flowchart illustrating an example operation for providing cut guidance according to techniques of this disclosure.
- FIG. 23 is a conceptual diagram illustrating an example ring-shaped virtual element for guiding resection of a humeral head according to techniques of this disclosure.
- FIG. 24 is a conceptual diagram illustrating example ring-shaped virtual elements for guiding resection of a humeral head according to techniques of this disclosure.
- FIG. 25 is a conceptual diagram illustrating example planned plane virtual elements and current plane virtual elements for guiding resection of a humeral head according to techniques of this disclosure.
- FIG. 26 is a conceptual diagram illustrating example virtual elements for guiding resection of a humeral head according to techniques of this disclosure.
- FIG. 27 is a conceptual diagram illustrating first example elevation and angular virtual elements for guiding resection of a humeral head according to techniques of this disclosure.
- FIG. 28 is a conceptual diagram illustrating second example elevation and angular virtual elements for guiding resection of a humeral head according to techniques of this disclosure.
- FIG. 29 is a conceptual diagram illustrating example elevation and angle virtual elements for guiding resection of a humeral head according to techniques of this disclosure.
- FIG. 30 is a conceptual diagram illustrating an example elevation element and bubble-level virtual element for guiding resection of a humeral head according to techniques of this disclosure.
- FIG. 31 is a conceptual diagram illustrating an example of confirming accuracy of a humeral resection according to techniques of this disclosure.
- FIGS. 32A - 32E are conceptual diagrams illustrating example guidance displayed by a virtual guidance element for positioning a sawblade at a correct position and orientation, in accordance with techniques of this disclosure.
- FIG. 33 is a conceptual diagram illustrating an example MR visualization that includes a virtual guidance element according to techniques of this disclosure.
- FIG. 34 is a conceptual diagram illustrating an example MR visualization that includes an outline of a bone according to techniques of this disclosure.
- FIG. 35A and FIG. 35B are schematic representations of an example cutting guide, according to the techniques of this disclosure.
- FIG. 35C and FIG. 35D are schematic representations of a first alternative version of the cutting guide, according to techniques of this disclosure.
- FIG. 35E and FIG. 35F are schematic representations of a second alternative version of the cutting guide, according to techniques of this disclosure.
- FIG. 35G is a schematic representation of a third alternative version of the cutting guide, according to the techniques of this disclosure.
- FIG. 36 is a flowchart illustrating an example operation for providing cut guidance according to techniques of this disclosure.
- FIG. 37 is a flowchart illustrating an example operation for providing cut guidance in conjunction with a guide instrument, according to techniques of this disclosure.
- FIG. 38 is a flowchart illustrating an example operation for providing cut guidance in conjunction with a guide instrument including an optical marker, according to the techniques of this disclosure.
- a surgeon may use an oscillating saw to resect the head of a patient’s humerus. Resecting the head of the patient’s humerus allows the surgeon to insert a stem of a humeral implant into the intramedullary canal of the humerus. It is important for the surgeon to resect the humeral head at the correct position and angle. Resecting the humeral head at the wrong position or the wrong angle may lead to restricted patient range of motion, bone fractures, failures of the humeral implant, poor seating of the humeral implant, or other complications.
- a physical humeral cut guide is a physical object that fits over the humeral head.
- the physical humeral cut guide defines a slot through which the surgeon can insert the sawblade of an oscillating saw.
- the slot defined by the physical humeral cut guide is aligned with a planned position and angle of entry for resecting the humeral head.
- This disclosure describes devices and techniques associated with the use of mixed reality (MR) to aid in humeral head resection.
- the term MR may be taken to encompass augmented reality (AR).
- this disclosure describes a tracking structure that helps a tracking system determine a position of the humerus.
- This disclosure also describes a technique for identifying a position of a sawblade of an oscillating saw for purposes of providing guidance to a user (e.g., surgeon) while the user is using the oscillating saw to resect the humeral head.
- this disclosure describes techniques for providing virtual guidance to the user to guide resection of the humeral head.
- the techniques of this disclosure may eliminate the drawbacks associated with physical humeral cut guides.
- the techniques of this disclosure may improve accuracy and ease of use of virtual guidance.
- FIG. 1 is a conceptual diagram illustrating an example system 100 in which one or more techniques of this disclosure may be performed.
- system 100 includes a processing system 102, an MR device 104, a humeral tracking structure 106, and a tracking system 116.
- Humeral tracking structure 106 includes an attachment body 108 and a tracking marker 110. Humeral tracking structure 106 is attached to a humerus 112 using fixation members 114 that pass through apertures defined in attachment body 108. Fixation members 114 may include pins, screws, wires, or other items to attach attachment body 108 to humerus 112.
- tracking marker 110 is an optical marker having predefined optical patterns on different faces of a cube.
- tracking marker 110 may be a cube having different predefined optical patterns on each face other than a face connected to attachment body 108.
- tracking marker 110 has 2-dimensional optical barcodes on different faces.
- the faces of humeral tracking structure 106 have different predefined optical patterns, such as numbers.
- tracking marker 110 has different polyhedral shapes.
- tracking marker 110 may be a dodecahedron, pyramid, or another shape.
- tracking marker 110 may be an ultrasonic emitter, an electromagnetic marker, a passive optical marker that reflects light, an active optical marker that emits light, and so on.
- tracking marker 110 comprises a set of objects (e.g., balls, cubes, etc.) having predefined sizes and arranged in a predefined spatial configuration.
- MR device 104 may use various visualization techniques to display MR visualizations to the user.
- an MR visualization may comprise one or more virtual objects that are viewable by a user at the same time as real-world objects. Thus, what the user sees may be a mixture of real and virtual objects.
- MR device 104 may include a see-through display that allows the user to directly see real objects.
- MR device 104 may display an MR visualization comprising images that combine virtual objects and video of the real-world environment. Thus, in such examples, the user of MR device 104 does not directly see the real-world environment.
- MR device 104 may comprise various types of devices for presenting MR visualizations.
- MR device 104 may be a Microsoft HOLOLENS™ headset, such as the HOLOLENS 2 headset, available from Microsoft Corporation, of Redmond, Washington, USA, or a similar device, such as, for example, an MR visualization device that includes waveguides.
- the HOLOLENS™ device can be used to present 3D virtual objects via holographic lenses, or waveguides, while permitting a user to view actual objects in a real-world scene, i.e., in a real-world environment, through the holographic lenses.
- MR device 104 may be a holographic projector, a head-mounted smartphone, a special-purpose MR visualization device, or another type of device for presenting MR visualizations.
- MR device 104 includes a head-mounted unit that communicates with a separate device (e.g., a smartphone, personal computer, tablet computer, etc.) that performs at least some of the processing functionality of MR device 104.
- all functionality of MR device 104 is performed by hardware residing in a head-mounted unit. Discussion in this disclosure of actions performed by processing system 102 may be performed by processors in MR device 104, one or more computing devices separate from MR device 104, or a combination of the one or more computing devices and MR device 104.
- Tracking system 116 may include sensors 118 for tracking the positions of real-world physical objects.
- tracking system 116 may include depth sensors, Red-Green-Blue (RGB) cameras, infrared sensors, and/or other types of sensors.
- tracking system 116 is incorporated into MR device 104. In other examples, tracking system 116 is separate from MR device 104.
- Processing system 102 may comprise one or more processing units located in one or more computing devices.
- the computing devices may include MR device 104, server computers, personal computers, smartphones, tablet computers, laptop computers, and other types of computing devices.
- processing system 102 includes one or more computing devices in addition to MR device 104, the computing devices may communicate with MR device 104 via one or more wired or wireless communication links.
- a user may wear MR device 104 during a surgery, such as a shoulder arthroplasty or other type of surgery.
- MR device 104 may display guidance to the surgeon to help the user determine how to resect the humeral head of humerus 112. Resecting the humeral head may be part of a process to prepare humerus 112 for implantation of a humeral prosthesis.
- the guidance may take different forms. For instance, in some examples, the guidance may include a virtual cut plane that shows the user how to position the blade of an oscillating saw during resection of the humeral head.
- the guidance may include a plurality of planned plane elements and a plurality of current plane elements.
- Each of the planned plane elements indicates a location on a planned cutting plane through a bone of a patient.
- Each of the current plane elements corresponds to one of the planned plane elements and indicates a location on a current cutting plane of the sawblade.
- the planned plane elements are not contiguous with each other, and the current plane elements are not contiguous with each other. This example is described in greater detail with respect to FIG. 21.
- Processing system 102 performs a registration process to ensure that MR device 104 is able to display the guidance at the correct location relative to humerus 112.
- Virtual elements, such as virtual guidance for resection of the humeral head, may be defined and generated using a coordinate system, such as an (x, y, z) coordinate system.
- the coordinate system in which virtual elements are defined is referred to as a virtual coordinate system.
- a virtual model of humerus 112 may be generated based on preoperative images of the patient. Points on the virtual model may be defined in the virtual coordinate system.
- a cut plane for resection of the humeral head may be defined in the virtual coordinate system so that the user can see the virtual cut plane relative to the virtual model of humerus 112.
- processing system 102 determines the locations of physical objects, including MR device 104 and humerus 112, in terms of a different coordinate system, which is referred to as a physical coordinate system.
- the physical coordinate system may be an (x, y, z) coordinate system, or another type of coordinate system.
- a point on or within tracking marker 110 may be the origin of the physical coordinate system. For instance, a corner or a centroid of tracking marker 110 may be the origin of the physical coordinate system.
- the axes of the physical coordinate system may be aligned with edges of tracking marker 110.
- the registration process generates registration data that define a relationship between the physical coordinate system and the virtual coordinate system.
- Processing system 102 may use the registration data to determine how to position and orient virtual elements within the physical coordinate system so that MR device 104 is able to display the virtual elements, and so that the virtual elements appear to the user to be at the correct locations.
- processing system 102 may use registration data to determine that the virtual coordinate (x_v1, y_v1, z_v1) corresponds to the physical coordinate (x_p1, y_p1, z_p1).
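As a concrete illustration (a minimal sketch, not the patent's implementation), if the registration data were stored as a 4x4 homogeneous rigid transform, mapping a virtual coordinate to its physical counterpart would be a single matrix product:

```python
import numpy as np

def apply_registration(T_virtual_to_physical: np.ndarray, p_virtual: np.ndarray) -> np.ndarray:
    """Map a point from the virtual coordinate system into the physical one.

    T_virtual_to_physical: 4x4 homogeneous rigid transform (rotation + translation),
        a hypothetical representation of the registration data.
    p_virtual: 3-vector (x_v1, y_v1, z_v1).
    """
    p_h = np.append(p_virtual, 1.0)           # homogeneous coordinates
    return (T_virtual_to_physical @ p_h)[:3]  # (x_p1, y_p1, z_p1)

# Example: a 90-degree rotation about z plus a translation.
T = np.eye(4)
T[:3, :3] = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
T[:3, 3] = np.array([10.0, -5.0, 2.0])
print(apply_registration(T, np.array([1.0, 2.0, 3.0])))  # -> [ 8. -4.  5.]
```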
- processing system 102 may need to accurately determine the current location of humerus 112. Humerus 112 may move during surgery. Thus, it cannot be assumed that any point on humerus 112 will retain the same coordinates in the physical coordinate system.
- Tracking marker 110 helps processing system 102 determine the current location of humerus 112.
- Humeral tracking structure 106 is fixed to humerus 112 such that tracking marker 110 of humeral tracking structure 106 cannot move independently of humerus 112.
- Sensors 118 of tracking system 116 may sense the position of tracking marker 110 in three dimensions. Processing system 102 may thus use signals from sensors 118 to determine the position of tracking marker 110 in the physical coordinate system.
- determining the position of tracking marker 110 in the physical coordinate system is not sufficient to determine the position of humerus 112 in the physical coordinate system. This is because of variations in the shapes of humeri and slight variations in how tracking structures may be fixed to humeri.
- a process is performed to determine the position of the humerus relative to tracking marker 110.
- the digitizer may be a handheld stylus-shaped object to which a tracking marker is attached. For instance, the tracking marker may be attached to an end of the digitizer opposite a tip of the digitizer.
- Palpating humerus 112 is a process of touching the tip of the digitizer to humerus 112.
- palpating may refer to the act or process of touching the tip of the digitizer to an anatomical object, such as humerus 112.
- There is a fixed spatial relationship between the tracking marker of the digitizer and the tip of the digitizer. For example, the distance between the tracking marker of the digitizer and the tip of the digitizer may be exactly 10 centimeters.
- processing system 102 is able to use the fixed spatial relationship between the tracking marker of the digitizer and the tip of the digitizer, along with the 3-dimensional position of the tracking marker of the digitizer, to determine positions on the palpated portions of humerus 112 in the physical coordinate system.
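A minimal sketch of that computation, assuming the tracking system reports the digitizer marker's pose as a 4x4 matrix and the marker-to-tip offset is known from the digitizer's geometry (names and the offset value are illustrative):

```python
import numpy as np

def digitizer_tip_position(T_marker: np.ndarray, tip_offset: np.ndarray) -> np.ndarray:
    """Locate the digitizer tip in the physical coordinate system.

    T_marker: 4x4 pose of the digitizer's tracking marker in the physical
        coordinate system, as reported by the tracking system.
    tip_offset: fixed 3-vector from the marker origin to the tip, expressed
        in the marker's own frame (the predetermined spatial relationship).
    """
    return T_marker[:3, :3] @ tip_offset + T_marker[:3, 3]

# e.g., a tip 10 cm down the stylus axis from the marker (illustrative value):
tip_in_marker_frame = np.array([0.0, 0.0, -0.10])  # meters
```

Each palpated point recorded this way becomes one of the points on humerus 112 used for registration.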
- processing system 102 has data describing the 3-dimensional position of tracking marker 110 and concurrent 3-dimensional positions of the palpated portions of humerus 112 in the physical coordinate system.
- Processing system 102 may complete the registration process by determining a transform that maps the 3-dimensional positions of the palpated portions of humerus 112 to corresponding portions of the virtual model of humerus 112.
- Attachment body 108 of humeral tracking structure 106 may be attached at a bicipital groove of humerus 112.
- the bicipital groove of humerus 112 is a groove on humerus 112 that separates the greater tubercle from the lesser tubercle.
- the bicipital groove of humerus 112 is exposed as part of the process of preparing humerus 112 for implantation of a humeral prosthesis.
- Attaching attachment body 108 at the bicipital groove of humerus 112 may be advantageous because doing so may help to ensure that tracking marker 110 is out of the way of a sawblade during resection of the humeral head. Additionally, because attachment body 108 may sit at least partially within the bicipital groove, the walls of the bicipital groove may make it harder for attachment body 108 to slip during attachment of attachment body 108 to humerus 112.
- positioning attachment body 108 of humeral tracking structure 106 at the bicipital groove may prevent the user from palpating the bicipital groove.
- attachment body 108 may define a slot having dimensions sufficient for palpation of the bicipital groove using a digitizer.
- the slot may have dimensions sufficient for palpation of the bicipital groove if the dimensions are large enough for a tip of the digitizer to contact one or more locations on the bone tissue of the bicipital groove.
- the slot may have a width ranging from 4 millimeters (mm) to 10mm. In some examples, the slot may have a length ranging from 50mm to 70mm. In other examples, the slot may have other dimensions. In some examples, the user palpates other areas of humerus 112, such as the humeral head and metaphysis, during the registration process.
- processing system 102 may receive first signals (e.g., video signals) from one or more sensors 118 of tracking system 116. Processing system 102 may determine, based on the first signals, first points corresponding to a tracking marker 110 of humeral tracking structure 106. Humeral tracking structure 106 includes tracking marker 110 and attachment body 108, which is positioned at a bicipital groove of humerus 112 of a patient. Processing system 102 may also receive second signals (e.g., later video signals) from sensors 118 of tracking system 116.
- Processing system 102 may determine, based on the second signals, second points corresponding to a tracking marker of a digitizer while a tip of the digitizer palpates the bicipital groove via a slot defined in attachment body 108 of humeral tracking structure 106. Processing system 102 may determine, based on the second points and a predetermined spatial relationship between the second tracking marker and the tip of the digitizer, third points corresponding to the bicipital groove. Positions of the first, second and third points may be defined in a physical coordinate system. Processing system 102 may then generate, based on the third points and a virtual model of the humerus having points defined in a virtual coordinate system, registration data defining a relationship between the physical coordinate system and the virtual coordinate system.
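The disclosure does not name a specific fitting method for generating the registration data. One common choice when point correspondences are known is a least-squares rigid fit (the Kabsch algorithm), sketched below with hypothetical names; the ICP algorithm discussed later can refine the result when correspondences are unknown.

```python
import numpy as np

def rigid_fit(physical_pts: np.ndarray, virtual_pts: np.ndarray):
    """Least-squares rigid transform (Kabsch) mapping virtual-model points
    onto the corresponding palpated physical points. Both arrays are Nx3,
    with row i of each array corresponding to the same anatomical location."""
    c_v = virtual_pts.mean(axis=0)
    c_p = physical_pts.mean(axis=0)
    H = (virtual_pts - c_v).T @ (physical_pts - c_p)   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))             # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_p - R @ c_v
    return R, t  # registration data: p_physical ≈ R @ p_virtual + t
```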
- processing system 102 may output, for display via MR device 104, a virtual bone model and/or a virtual planned cutting plane adjacent to or superimposed on humerus 112.
- the virtual bone model and/or virtual planned cutting plane may be rotated and spatially oriented based on the registration data to be rotated and oriented in the same way as humerus 112.
- the user may confirm the registration was successful based on the display. That is, if registration is successful and humeral tracking structure 106 is attached to the correct position on humerus 112, the rotation and spatial orientation of the virtual bone model as shown by MR device 104 are the same as the real rotation and spatial orientation of humerus 112.
- if humeral tracking structure 106 is not correctly positioned on humerus 112, the virtual bone model as shown by MR device 104 may appear to be rotated or oriented differently from humerus 112.
- the virtual planned cutting plane may be at an unreasonable angle relative to humerus 112.
- Correct positioning of humeral tracking structure 106 on humerus 112 may be challenging in some cases, especially cases of trauma or severe bone erosion. Accordingly, displaying the virtual bone model and/or the virtual planned cutting plane in this way may help assure the user that humeral tracking structure 106 was attached to humerus 112 correctly.
- the user may use an oscillating saw 120 to resect the humeral head of humerus 112.
- Oscillating saw 120 has a sawblade 122.
- a tracking marker 124 is attached to a body of oscillating saw 120.
- the body of saw 120 may be a main section of saw 120 or another physical component of saw 120.
- Processing system 102 may use signals from tracking system 116 to determine a position of tracking marker 124 in the physical coordinate system.
- Tracking marker 124 may be at a fixed distance from the body of oscillating saw 120.
- processing system 102 may be able to track the position of oscillating saw 120 while the user is using oscillating saw 120 to resect the humeral head of humerus 112.
- processing system 102 may be able to cause MR device 104 to display real-time guidance regarding the position of oscillating saw 120 while the user is using the oscillating saw to resect the humeral head.
- processing system 102 may receive signals from one or more sensors 118 of tracking system 116. Processing system 102 may determine, based on the signals, first points corresponding to tracking marker 124 attached to a body of saw 120. Additionally, processing system 102 may determine, based on the signals, second points corresponding to a second tracking marker of a tracking structure (illustrated elsewhere in this disclosure) while sawblade 122 of saw 120 is positioned in a recess defined by a support body of the tracking structure. The first points and the second points are defined in a physical coordinate system.
- Processing system 102 may generate, based on the first points and the second points, position identification data that specify a position of a lower edge of sawblade 122 relative to tracking marker 124. Processing system 102 may use the position identification data when generating guidance for display by MR device 104.
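A hedged sketch of the underlying geometry: while the blade rests in the recess, both marker poses are known in the physical coordinate system, so a recess point contacted by the lower edge (known relative to the structure's marker from the structure's design) can be re-expressed in the saw marker's frame and stored as the position identification data. All names are illustrative.

```python
import numpy as np

def calibrate_blade_edge(T_saw_marker: np.ndarray,
                         T_structure_marker: np.ndarray,
                         edge_in_structure: np.ndarray) -> np.ndarray:
    """Return the sawblade's lower-edge point expressed in the saw-marker
    frame, captured while the blade rests in the tracking structure's recess.

    T_saw_marker, T_structure_marker: 4x4 marker poses in the physical frame.
    edge_in_structure: 3-vector from the structure's marker to the recess
        point contacted by the lower edge (from the structure's geometry).
    """
    edge_physical = (T_structure_marker @ np.append(edge_in_structure, 1.0))[:3]
    return (np.linalg.inv(T_saw_marker) @ np.append(edge_physical, 1.0))[:3]

# During cutting, the lower edge in physical space is then:
#   edge_physical = (T_saw_marker_now @ np.append(edge_in_saw, 1.0))[:3]
```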
- processing system 102 determines positions for planned plane elements and current plane elements.
- the planned plane elements indicate locations on a planned cutting plane through a bone, such as humerus 112 or another type of bone.
- the planned plane elements are not contiguous with each other.
- the current plane elements indicate locations on a current cutting plane of a sawblade of a saw.
- Processing system 102 may cause MR device 104 to concurrently display the planned plane elements at their determined positions and the current plane elements at their determined positions.
- At least one of the current plane element or planned plane element of the pair has a visual property (e.g., color, texture, outlining pattern, etc.) based on the position of the current plane element of the pair relative to the planned cutting plane.
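The disclosure leaves the exact visual mapping open. A minimal sketch of one possibility, using the signed distance of each current plane element to the planned cutting plane (tolerance values are illustrative, not from the patent):

```python
import numpy as np

def element_color(current_pt: np.ndarray,
                  plane_point: np.ndarray,
                  plane_normal: np.ndarray,
                  tol_mm: float = 1.0) -> str:
    """Pick a display color for a current-plane element from its signed
    distance to the planned cutting plane (coordinates in millimeters)."""
    n = plane_normal / np.linalg.norm(plane_normal)
    d = float(np.dot(current_pt - plane_point, n))  # signed point-plane distance
    if abs(d) <= tol_mm:
        return "green"   # element lies on (or near enough to) the planned plane
    return "yellow" if abs(d) <= 3.0 else "red"
```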
- FIG. 2 is a flowchart illustrating an example process for MR-based navigation for surgical tasks associated with a humerus of a patient according to techniques of this disclosure.
- processing system 102 may perform a registration process (200).
- the registration process may involve determining a position of tracking marker 110 of humeral tracking structure 106, determining positions on humerus 112 based on positions of a digitizer that palpates humerus 112, and generating registration data that map positions in a virtual coordinate system to positions in a physical coordinate system.
- processing system 102 may perform a sawblade position identification process (202).
- Processing system 102 may then perform a surgical navigation process (204). During the surgical navigation process, processing system 102 may cause MR device 104 to display virtual elements that guide the user to resect the humeral head of humerus 112. After the resection of the humeral head is complete, processing system 102 may perform an accuracy check to ensure that the humeral head was resected at the correct location (206).
- FIG. 3 is a block diagram illustrating an example computing system in accordance with one or more techniques of this disclosure.
- a computing system 300 includes processors 302, memory 304, a communication interface 306, and a display 308.
- computing system 300 may include more, fewer, or different components.
- the components of computing system 300 may be in one or more computing devices.
- processors 302 may be in a single computing device or distributed among multiple computing devices of computing system 300
- memory 304 may be in a single computing device or distributed among multiple computing devices of computing system 300, and so on.
- Processors 302 may be implemented in circuitry and include one or more microprocessors, digital signal processors (DSPs), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), hardware, or any combinations thereof.
- processors 302 may be implemented as fixed-function circuits, programmable circuits, or a combination thereof.
- Fixed-function circuits refer to circuits that provide particular functionality and are preset with respect to the operations that can be performed.
- Programmable circuits refer to circuits that can be programmed to perform various tasks and provide flexible functionality in the operations that can be performed. For instance, programmable circuits may execute software or firmware that cause the programmable circuits to operate in the manner defined by instructions of the software or firmware.
- Fixed-function circuits may execute software instructions (e.g., to receive parameters or output parameters), but the types of operations that the fixed-function circuits perform are generally immutable.
- one or more of the units may be distinct circuit blocks (fixed-function or programmable), and in some examples, the one or more units may be integrated circuits.
- Processors 302 may include arithmetic logic units (ALUs), elementary function units (EFUs), digital circuits, analog circuits, and/or programmable cores, formed from programmable circuits.
- memory 304 may store the object code of the software that processors 302 receive and execute, or another memory within processors 302 (not shown) may store such instructions.
- Examples of the software include software designed for surgical planning.
- Processors 302 may perform the actions ascribed in this disclosure to processing system 102.
- Memory 304 may store various types of data used by processors 302.
- Memory 304 may include any of a variety of memory devices, such as dynamic random access memory (DRAM), including synchronous DRAM (SDRAM), magnetoresistive RAM (MRAM), resistive RAM (RRAM), or other types of memory devices.
- Examples of display 308 include a liquid crystal display (LCD), a plasma display, an organic light emitting diode (OLED) display, or another type of display device.
- Communication interface 306 allows computing system 300 to output data and instructions to, and receive data and instructions from, MR device 104, a medical imaging system, or other devices via one or more communication links or networks.
- Communication interface 306 may be hardware circuitry that enables computing system 300 to communicate (e.g., wirelessly or using wires) with other computing systems and devices, such as MR device 104.
- Example networks may include various types of communication networks including one or more wide-area networks, such as the Internet, local area networks, and so on. In some examples, the network may include wired and/or wireless communication links.
- memory 304 stores registration data 310 and plan data 312. Additionally, in the example of FIG. 3, memory 304 stores a registration system 316, a planning system 318, and a virtual guidance system 320. In other examples, memory 304 may store more, fewer, or different types of data or units. Moreover, the data and units illustrated in the example of FIG. 3 are provided for purposes of explanation and may not represent how data is actually stored or how software is actually implemented. Registration system 316, planning system 318, and virtual guidance system 320 may comprise instructions that are executable by processors 302.
- Computing system 102 may receive, from tracking system 116, tracking input (e.g., signals) of a scene that includes humerus 112.
- Registration system 316 may generate registration data 310 that registers humeral tracking structure 106 with a coordinate system.
- Registration data 310 may define transforms between a virtual coordinate system and the physical coordinate system.
- registration system 316 may obtain a first point cloud and a second point cloud.
- the first point cloud includes points on one or more virtual objects, such as a virtual object representing a surface of a bone.
- the second point cloud may include points on real-world objects, such as tracking marker 110 and tracking marker 124.
- the points in the first point cloud may be expressed in terms of coordinates in a virtual coordinate system and the points in the second point cloud may be expressed in terms of coordinates in a physical coordinate system. Because virtual objects may be designed with positions that are relative to one another but not relative to any real-world objects, the virtual and physical coordinate systems may be different.
- Registration system 316 may generate the second point cloud from the tracking data using a Simultaneous Localization and Mapping (SLAM) algorithm.
- Registration system 316 may perform one of various implementations of SLAM algorithms, such as a SLAM algorithm having a particle filter implementation, an extended Kalman filter implementation, a covariance intersection implementation, a GraphSLAM implementation, an ORB SLAM implementation, or another implementation.
- registration system 316 applies an outlier removal process to remove outlying points in the first and/or second point clouds.
- the outlying points may be points lying beyond a certain standard deviation threshold from other points in the point clouds. Applying outlier removal may improve the accuracy of the registration process.
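- As a non-limiting illustration, the standard-deviation outlier filter described above might be implemented as in the following Python sketch; the neighbor count k and the ratio std_ratio are assumed tuning values, not parameters taken from this disclosure:

```python
import numpy as np

def remove_outliers(points: np.ndarray, k: int = 8, std_ratio: float = 2.0) -> np.ndarray:
    """Drop points whose mean distance to their k nearest neighbors exceeds
    the population mean of that statistic by std_ratio standard deviations."""
    # Brute-force pairwise distances; adequate for palpation-sized clouds.
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    dists.sort(axis=1)
    mean_knn = dists[:, 1:k + 1].mean(axis=1)  # column 0 is the self-distance
    threshold = mean_knn.mean() + std_ratio * mean_knn.std()
    return points[mean_knn <= threshold]
```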
- registration system 316 may generate a preliminary spatial relationship between points in the first point cloud and points in the second point cloud.
- registration system 316 may perform an iterative closest point (ICP) algorithm to determine the preliminary spatial relationship between the points in the first point cloud and the points in the second point cloud.
- the ICP algorithm may determine a preliminary spatial relationship between points on a bone in the physical coordinate system and corresponding points in a model of the bone in the virtual coordinate system.
- the iterative closest point algorithm finds a combination of translational and rotational parameters that minimize the sum of distances between corresponding points in the first and second point clouds.
- the iterative closest point algorithm determines a combination of translational and rotational parameters that minimizes ΔA + ΔB + ΔC, where ΔA is the distance between A and A’, ΔB is the distance between B and B’, and ΔC is the distance between C and C’.
- registration system 316 may perform the following steps:
- the first point cloud includes points corresponding to landmarks on one or more virtual objects and the second point cloud may include points corresponding to landmarks on real-world objects (e.g., tracking marker 110, tracking marker 124).
- registration system 316 may determine rotation and translation parameters that describe a spatial relationship between the original positions of the points in the first point cloud and the final positions of the points in the first point cloud.
- the determined rotation and translation parameters can therefore express a mapping between the first point cloud and the second point cloud.
- Registration data 310 may include the determined rotation and translation parameters. In this way, registration system 316 may generate registration data 310.
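- As an illustration of this registration step, a minimal ICP sketch in Python is shown below; it pairs each virtual point with its nearest physical point, fits a rotation and translation by the standard SVD (Kabsch) solution, and iterates. This is a generic textbook formulation offered for explanation, not the specific implementation of registration system 316:

```python
import numpy as np

def best_fit_transform(src, dst):
    """SVD (Kabsch) solution for the rotation R and translation t minimizing
    the summed squared distances between paired points src[i] -> dst[i]."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    U, _, Vt = np.linalg.svd((src - c_src).T @ (dst - c_dst))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against an improper (reflected) fit
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, c_dst - R @ c_src

def icp(virtual_pts, physical_pts, iterations=30):
    """Iterate nearest-neighbor pairing and refitting; the accumulated R, t
    map virtual-coordinate points into the physical coordinate system."""
    src = virtual_pts.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iterations):
        # nearest physical point for each (transformed) virtual point
        nn = np.linalg.norm(src[:, None] - physical_pts[None, :], axis=-1).argmin(axis=1)
        R, t = best_fit_transform(src, physical_pts[nn])
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total  # candidate rotation/translation parameters for registration data 310
```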
- Plan data 312 may include information describing a plan for a user to follow with respect to a patient.
- plan data 312 may include surgical planning data that describe a process to prepare for and conduct a surgery on the patient.
- plan data 312 may include data defining a cutting plane for resecting the humeral head of humerus 112.
- plan data 312 may include other information describing a process to prepare for and conduct the surgery on the patient.
- plan data 312 may include information defining a planned reaming axis and position for reaming the patient’s scapula, information defining a planned axis for inserting a surgical pin in a humerus for extracting a bone fragment to use as a bone graft, and other details of the surgery.
- plan data 312 also includes medical images, e.g., x-ray images, computed tomography images or models, and so on.
- Planning system 318 may enable a user to view plan data 312.
- planning system 318 may cause a display device (e.g., MR visualization device 104, display 308, etc.) to output one or more graphical user interfaces that enable the user to see models of anatomic structures, prostheses, bone grafts, and so on.
- planning system 318 may generate some or all of plan data 312 in response to input from the user. For example, planning system 318 may generate, based on user input, data defining a cutting plane for resecting the humeral head of humerus 112.
- virtual guidance system 320 may cause MR device 104 to output virtual objects for display.
- virtual guidance system 320 may determine positions for planned plane elements and current plane elements.
- the planned plane elements indicate locations on a planned cutting plane through a bone.
- the planned plane elements are not contiguous with each other.
- the current plane elements indicate locations on a current cutting plane of a sawblade of a saw.
- the current plane elements are not contiguous with each other.
- Virtual guidance system 320 may cause MR device 104 to concurrently display the planned plane elements at their determined positions and the current plane elements at their determined positions.
- At least one of the current or planned plane element of the pair has a visual property (e.g., color, texture, etc.) based on the position for the current plane element of the pair relative to the planned cutting plane.
- FIG. 4 is a schematic representation of a MR device in accordance with one or more techniques of this disclosure.
- MR device 104 can include a variety of electronic components found in a computing system, including one or more processors 414 (e.g., microprocessors or other types of processing units) and memory 416 that may be mounted on or within a frame 418.
- processors 302 may include processors 414 and/or memory 304 may include memory 416.
- MR device 104 may include a transparent screen 420 that is positioned at eye level when MR device 104 is worn by a user.
- screen 420 can include one or more liquid crystal displays (LCDs) or other types of display screens on which images are perceptible to a user who is wearing or otherwise using MR device 104 via screen 420.
- Other display examples include organic light emitting diode (OLED) displays.
- MR device 104 can operate to project 3D images onto the user’s retinas using techniques known in the art.
- screen 420 includes see-through holographic lenses, sometimes referred to as waveguides, that permit a user to see real-world objects through (e.g., beyond) the lenses and also see holographic imagery projected into the lenses and onto the user’s retinas by displays, such as liquid crystal on silicon (LCoS) display devices, which are sometimes referred to as light engines or projectors, operating as an example of a holographic projection system 838 within MR device 104.
- MR device 104 may include one or more see-through holographic lenses to present virtual images to a user.
- MR device 104 can operate to project 3D images onto the user’s retinas via screen 420, e.g., formed by holographic lenses.
- MR device 104 may be configured to present a 3D virtual image to a user within a real-world view observed through screen 420, e.g., such that the virtual image appears to form part of the real-world environment.
- MR device 104 may be a Microsoft HOLOLENS™ headset, available from Microsoft Corporation, of Redmond, Washington, USA, or a similar device, such as, for example, a similar MR visualization device that includes waveguides.
- the HOLOLENS™ device can be used to present 3D virtual objects via holographic lenses, or waveguides, while permitting a user to view actual objects in a real-world scene, i.e., in a real-world environment, through the holographic lenses.
- MR device 104 may have other forms and form factors.
- For instance, in some examples, MR device 104 may be a handheld smartphone or tablet.
- MR device 104 is supported by an armature that allows the user to move MR device 104 into and out of a position for viewing a patient’s anatomy without the user wearing MR device 104.
- MR device 104 can also generate a user interface (UI) 422 that is visible to the user, e.g., as holographic imagery projected into see-through holographic lenses as described above.
- UI 422 can include a variety of selectable widgets 424 that allow the user to interact with a MR system.
- Imagery presented by MR device 104 may include, for example, one or more 2D or 3D virtual objects.
- MR device 104 also can include a speaker or other sensory devices 426 that may be positioned adjacent the user’s ears. Sensory devices 426 can convey audible information or other perceptible information (e.g., vibrations) to assist the user of MR device 104.
- MR device 104 can also include a transceiver 428 to connect MR device 104 to a network or a computing cloud, such as via a wired communication protocol or a wireless protocol, e.g., Wi-Fi, Bluetooth, etc.
- MR device 104 also includes a variety of sensors to collect sensor data, such as one or more optical sensors 430 and one or more depth sensors 432, mounted to, on, or within frame 418.
- optical sensor(s) 430 are operable to scan the geometry of the physical environment in which a user of MR device 104 is located (e.g., an operating room) and collect two-dimensional (2D) optical image data (either monochrome or color).
- Depth sensor(s) 432 are operable to provide 3D image data, such as by employing time of flight, stereo, or other known or future-developed techniques for determining depth and thereby generating image data in three dimensions.
- Other sensors can include motion sensors 433 (e.g., inertial measurement unit (IMU) sensors, accelerometers, etc.) to assist with tracking movement.
- Processing system 102 may receive tracking data from sensors of MR device 104.
- the tracking data may include data from optical sensors 430, depth sensors 432, motion sensors 433, and so on.
- Processing system 102 may process the tracking data so that geometric, environmental, textural, or other types of landmarks (e.g., corners, edges or other lines, walls, floors, objects) in the user’s environment or “scene” can be defined and movements within the scene can be detected.
- the various types of tracking data can be combined or fused so that the user of MR device 104 can perceive virtual objects that can be positioned, fixed, and/or moved within the scene.
- processing system 102 may process the tracking data so that the user can position a 3D virtual object on an observed physical object in the scene and/or orient the 3D virtual object with other virtual objects displayed in the scene.
- processing system 102 may process the tracking data so that the user can position and fix a virtual representation of the surgical plan (or other widget, image or information) onto a surface, such as a wall of the operating room.
- computing system 102 may use the tracking data to recognize surgical instruments and determine the positions of those surgical instruments.
- MR device 104 may include one or more processors 414 and memory 416, e.g., within frame 418 of MR device 104.
- one or more external computing resources 436 process and store information, such as sensor data, instead of or in addition to in-frame processor(s) 414 and memory 416.
- external computing resources 436 may include processing circuitry, memory, and/or other computing resources of computing system 102 (FIG. 1). In this way, data processing and storage may be performed by one or more processors 414 and memory 416 within MR device 104 and/or some of the processing and storage requirements may be offloaded from MR device 104.
- one or more processors that control the operation of MR device 104 may be within MR device 104, e.g., as processor(s) 414.
- at least one of the processors that controls the operation of MR device 104 may be external to MR device 104, e.g., as part of external computing resources 436.
- operation of MR device 104 may, in some examples, be controlled in part by a combination of one or more processors 414 within the visualization device and one or more processors external to MR device 104.
- processing of tracking data can be performed by processor(s) 414 in conjunction with memory 416 or memory 304.
- processor(s) 414 and memory 416 mounted to frame 418 may provide sufficient computing resources to process the tracking data collected by optical sensor(s) 430, depth sensor(s) 432 and motion sensors 433.
- the tracking data can be processed using a Simultaneous Localization and Mapping (SLAM) algorithm, or other algorithms for processing and mapping 2D and 3D image data and tracking the position of MR device 104 in the 3D scene.
- image tracking may be performed using sensor processing and tracking functionality provided by the Microsoft HOLOLENS™ system, e.g., by one or more sensors and processors 414 within a MR device 104 substantially conforming to the Microsoft HOLOLENS™ device or a similar mixed reality (MR) visualization device.
- system 100 can also include user-operated control device(s) 434 that allow the user to operate MR device 104, use MR device 104 in spectator mode (either as master or observer), interact with UI 422, and/or otherwise provide commands or requests to processor(s) 414 or other systems connected to a network.
- control device(s) 434 can include a microphone, a touch pad, a control panel, a motion sensor, or other types of control input devices with which the user can interact.
- FIG. 5A and FIG. 5B are schematic representations of an example attachment body 500 of a humeral tracking structure according to techniques of this disclosure.
- FIG. 5A shows a non-contact side of attachment body 500.
- the non-contact side of attachment body 500 is a side of attachment body 500 that does not contact humerus 112 when attachment body 500 is attached to humerus 112.
- FIG. 5B shows a lateral view of attachment body 500.
- the lateral view shown in FIG. 5B is rotated 90° from the view shown in FIG. 5A.
- Attachment body 500 defines fixation apertures 502A, 502B, 502C, 502D, and 502E (collectively, “fixation apertures 502”).
- Fixation members may be passed through fixation apertures 502 to affix attachment body 500 to humerus 112.
- the fixation members could intrude into cancellous bone, e.g., cancellous bone of humerus 112.
- a humeral prosthesis includes a portion, e.g., a stem or peg, configured to be placed within the cancellous bone for press-fit fixation.
- the fixation members could impede the implantation process if the fixation members intrude into the cancellous bone of the humerus.
- the fixation members may be sized such that the fixation members do not intrude into cancellous bone of humerus 112.
- the fixation members may include depth stop elements to prevent intrusion into cancellous bone of humerus 112.
- the fixation elements are screws
- the user may use a screwdriver to tighten the screws to affix attachment body 500 to humerus 112.
- Fixation aperture 502A and fixation aperture 502B lead to fixation aperture 502C.
- the two fixation apertures 502A, 502B leading to the single fixation aperture 502C may enable attachment body 500 to be used for either a left or a right humerus.
- Fixation aperture 502D leads to fixation aperture 502E.
- Attachment body 500 defines a slot 504 having dimensions sufficient for palpation of the bicipital groove using a handheld digitizer.
- FIG. 6A and FIG. 6B are schematic representations of an example attachment body 600 of a humeral tracking structure according to techniques of this disclosure.
- FIG. 6A shows a non-contact side of attachment body 600.
- the non-contact side of attachment body 600 is a side of attachment body 600 that does not contact humerus 112 when attachment body 600 is attached to humerus 112.
- FIG. 6B shows a lateral view of attachment body 600.
- the lateral view shown in FIG. 6B is rotated 90° from the view shown in FIG. 6A.
- Attachment body 600 defines fixation apertures 602A, 602B, 602C, 602D, and 602E (collectively, “fixation apertures 602”).
- Fixation members may be passed through fixation apertures 602 to affix attachment body 600 to humerus 112.
- the fixation members may be sized such that the fixation members do not intrude into cancellous bone of humerus 112.
- the fixation elements are screws
- the user may use a screwdriver to tighten the screws to affix attachment body 600 to humerus 112.
- Fixation aperture 602A and fixation aperture 602B lead to a single fixation aperture (not shown) on the contact side of attachment body 600.
- the two fixation apertures 602A, 602B leading to the single fixation aperture may enable attachment body 600 to be used for either a left or a right humerus.
- Fixation aperture 602D leads to fixation aperture 602E.
- Attachment body 600 defines a slot 604 having dimensions sufficient for palpation of the bicipital groove using a handheld digitizer.
- a distal end of slot 604 is open-ended. That is, attachment body 600 includes a first prong 606A on a first side of slot 604 and a second prong 606B on a second side of slot 604.
- first prong 606A and second prong 606B define respective apertures (fixation apertures 602C and 602D) sized to accommodate fixation members for attachment of attachment body 600 to humerus 112.
- FIG. 7A and FIG. 7B are schematic representations of an example attachment body 700 of a humeral tracking structure according to techniques of this disclosure.
- Attachment body 700 is similar to attachment body 600, except that attachment body 700 has greater curvature.
- FIG. 7C and FIG. 7D are schematic representations of a first alternative version of attachment body 700.
- prongs 706A and 706B are angled relative to one another.
- FIG. 7E and FIG. 7F are schematic representations of a second alternative version of attachment body 700.
- prongs 706A and 706B are parallel and meet at a curved structure
- FIG. 7G and FIG. 7H are schematic representations of a third alternative version of attachment body 700
- FIG. 7I and FIG. 7J are schematic representations of a fourth alternative version of attachment body 700, according to techniques of this disclosure.
- FIGS. 7K, 7L, and 7M are schematic representations of a fifth alternative version of attachment body 700, according to techniques of this disclosure. As shown in the example of FIG. 7K, depth-stop screws 702 may be used to attach attachment body 700 to humerus 112. In FIGS. 7K, 7L, and 7M, prongs 706A and 706B have a textured area 708 that may improve a user’s ability to handle attachment body 700.
- FIG. 8 is a conceptual diagram illustrating an example view of registering a humeral head 802 according to techniques of this disclosure.
- a user may use a digitizer 800 to palpate humeral head 802 of humerus 112.
- Digitizer 800 includes a tracking marker 804.
- MR device 104 may display virtual instructions 806 to the user instructing the user which part of humerus 112 to palpate.
- a ring-shaped virtual element 810 indicates an amount of progress toward completion of palpating humeral head 802. Palpation of humeral head 802 may be complete when registration system 316 determines at least a sufficient quantity of points on humeral head 802. In some examples, tubercles of the humerus are not palpated during registration.
- FIG. 9 is a conceptual diagram illustrating an example view of registering a humeral metaphysis 900 according to techniques of this disclosure.
- a user may use digitizer 800 to palpate humeral metaphysis 900 of humerus 112.
- MR device 104 may display virtual instructions 906 to the user instructing the user which part of humerus 112 to palpate.
- a ring-shaped virtual element 910 indicates an amount of progress toward completion of palpating humeral metaphysis 900.
- Palpation of humeral metaphysis 900 may be complete when registration system 316 determines at least a sufficient quantity of points on humeral metaphysis 900.
- FIG. 10 is a conceptual diagram illustrating an example view of registering a bicipital groove 1000 according to techniques of this disclosure.
- a user may use digitizer 800 to palpate bicipital groove 1000 of humerus 112 via a slot defined in an attachment body 108 of humeral tracking structure 106.
- MR device 104 may display virtual instructions 1006 to the user instructing the user which part of humerus 112 to palpate.
- a ring-shaped virtual element 1010 indicates an amount of progress toward completion of palpating bicipital groove 1000.
- Palpation of bicipital groove 1000 may be complete when registration system 316 determines at least a sufficient quantity of points on bicipital groove 1000. In some examples, palpation of bicipital groove 1000 does not include an upper end of bicipital groove 1000.
- FIG. 11 is a conceptual diagram illustrating validation of registration of humeral head 802 according to techniques of this disclosure.
- MR device 104 displays a virtual element 1100 at a location on humeral head 802.
- the user is prompted to position digitizer 800 at a location on humeral head 802 indicated by virtual element 1100.
- Processing system 102 tracks the positions of digitizer 800 and humerus 112 based on the positions of tracking marker 804 and tracking marker 110, respectively.
- processing system 102 may determine a distance from a tip of digitizer 800 to the location on humeral head 802 indicated by virtual element 1100.
- MR device 104 displays a virtual element 1102 that indicates the distance from the tip of digitizer 800 to the location on humeral head 802 indicated by virtual element 1100. If the distance is non-zero when the tip of digitizer 800 is in contact with the location on humeral head 802 indicated by virtual element 1100, the registration data may be inaccurate. Accordingly, in that case, processing system 102 may perform one or more actions to refine registration data 310 or generate new registration data. For example, if the distance is non-zero when the tip of digitizer 800 is in contact with the location on humeral head 802, processing system 102 may restart the registration process. If the registration process is not successful, MR device 104 does not display virtual guidance.
- FIG. 12 is a conceptual diagram illustrating validation of registration of humeral head 802 according to techniques of this disclosure.
- MR device 104 may display virtual elements at multiple points on humeral head 802 to validate registration. For instance, in addition to displaying virtual element 1100 at a superior point of humeral head 802, MR device 104 may display virtual element 1200 at a most-medial point of humeral head 802. Similar to FIG. 11, MR device 104 may display a virtual element 1202 that indicates a distance of the tip of digitizer 800 to the point indicated by virtual element 1200. In some examples, MR device 104 changes the color of virtual elements 1100, 1200 based on the determined distance of the tip of digitizer 800 to the indicated point. For instance, the virtual element may be blue if the distance is non-zero and green if the distance is zero.
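- A minimal Python sketch of this distance readout and color cue might look as follows; the millimeter tolerance is an assumption (a measured distance is rarely exactly zero in practice), and the color names are illustrative:

```python
import numpy as np

def validation_feedback(tip_pos, target_pos, tol_mm=1.0):
    """Return the tip-to-target distance and a color cue for the virtual
    element, with both positions in the same physical coordinate system."""
    distance = float(np.linalg.norm(np.asarray(tip_pos) - np.asarray(target_pos)))
    color = "green" if distance <= tol_mm else "blue"
    return distance, color
```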
- FIG. 13 is a conceptual diagram illustrating validation of registration of a humeral metaphysis according to techniques of this disclosure. Similar to FIG. 11 and FIG. 12, MR device 104 may display a virtual element 1302 at a location on the humeral metaphysis 1300 of humerus 112. MR device 104 may display a virtual element 1304 that indicates a distance of a tip of digitizer 800 to the location on humeral metaphysis 1300.
- FIG. 14 is a flowchart illustrating an example registration operation according to techniques of this disclosure.
- processing system 102 may receive first signals from one or more sensors 118 of tracking system 116 (1400).
- Sensors 118 may include video and/or depth cameras and the signals may include video signals and depth image signals.
- tracking system 116 is included in MR device 104.
- Registration system 316 may determine, based on the first signals, first points corresponding to tracking marker 110 of humeral tracking structure 106 (1402).
- Humeral tracking structure 106 includes tracking marker 110 and an attachment body 108 positioned at a bicipital groove of humerus 112 of a patient.
- attachment body 108 defines two or more apertures (e.g., apertures 502, 602) sized to accommodate fixation members for attachment of attachment body 108 to humerus 112.
- attachment body 108 is manufactured to have a shape that is specific to humerus 112 of the patient.
- attachment body 108 is not specific to any patient but may be generic for all patients, or attachment body 108 may have a limited range of two or more sizes.
- attachment body 108 and tracking marker 110 are one physical unit, or attachment body 108 and tracking marker 110 are separate units assembled together.
- the first signals may comprise images of optical patterns on two or more faces of tracking marker 110. Each of the faces of tracking marker 110 may have a different optical pattern.
- Registration system 316 may determine positions of vertices of tracking marker 110 based on the images of the optical patterns. For instance, registration system 316 may use a SLAM algorithm to determine the positions of the vertices of tracking marker 110.
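- One generic way to recover marker vertices from imaged optical patterns is a perspective-n-point fit, sketched below with OpenCV; the 50 mm face size and corner ordering are assumptions for illustration, and the disclosure’s actual approach (e.g., SLAM-based) may differ:

```python
import cv2
import numpy as np

# Assumed 50 mm square marker face, corners in the face's own frame;
# the order must match the order reported by the pattern detector.
FACE_CORNERS_3D = np.array([[-25.0, -25.0, 0.0], [25.0, -25.0, 0.0],
                            [25.0, 25.0, 0.0], [-25.0, 25.0, 0.0]])

def face_vertices_in_camera(image_corners_2d, camera_matrix, dist_coeffs):
    """Fit the face pose from its four detected pattern corners and return
    the face vertices expressed in the camera (physical) frame."""
    ok, rvec, tvec = cv2.solvePnP(FACE_CORNERS_3D,
                                  np.asarray(image_corners_2d, dtype=np.float64),
                                  camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)                # rotation vector -> matrix
    return FACE_CORNERS_3D @ R.T + tvec.reshape(3)
```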
- registration system 316 may cause MR device 104 to display instructions to position humeral tracking structure 106 at the bicipital groove of humerus 112 and to insert fixation elements 114 through apertures of attachment body 108 into humerus 112 to attach humeral tracking structure 106 to humerus 112.
- processing system 102 may receive second signals from sensors 118 of tracking system 116 (1404).
- Sensors 118 may include video and/or depth cameras and the second signals may include video signals and depth image signals from a time later than the first signals.
- Registration system 316 may determine, based on the second signals, second points corresponding to tracking marker 804 of digitizer 800 while a tip of digitizer 800 palpates the bicipital groove via a slot (e.g., slot 504, 604) defined in attachment body 108 of humeral tracking structure 106 (1406).
- the second signals may comprise images of optical patterns on two or more faces of tracking marker 804. Each of the faces of tracking marker 804 may have a different optical pattern.
- Registration system 316 may determine positions of vertices of tracking marker 804 based on the images of the optical patterns. For instance, registration system 316 may use a SLAM algorithm to determine the positions of the vertices of tracking marker 804.
- Registration system 316 may determine, based on the second points and a predetermined spatial relationship between tracking marker 804 and the tip of digitizer 800, third points corresponding to the bicipital groove (1408). Positions of the first, second, and third points are defined in a physical coordinate system. In some examples, registration system 316 may cause MR device 104 to display instructions 1006 to palpate the bicipital groove with digitizer 800, e.g., as shown in FIG. 10.
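- The mapping from tracked marker pose to palpated point might be sketched as follows; the tip offset vector is a hypothetical calibration value standing in for the predetermined spatial relationship between tracking marker 804 and the tip of digitizer 800:

```python
import numpy as np

# Hypothetical tip offset (mm) in the frame of tracking marker 804,
# standing in for the predetermined marker-to-tip spatial relationship.
TIP_OFFSET = np.array([0.0, 0.0, -120.0])

def palpated_point(R_marker, t_marker):
    """Carry the marker-frame tip offset through the tracked pose of
    tracking marker 804 to obtain the palpated point (a 'third point')
    in the physical coordinate system."""
    return R_marker @ TIP_OFFSET + t_marker
```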
- Registration system 316 may generate, based on the third points and a virtual model of the humerus having points defined in a virtual coordinate system, registration data 310 defining a relationship between the physical coordinate system and the virtual coordinate system (1410). Registration system 316 may use the process described above with reference to FIG. 2 to generate registration data 310.
- registration system 316 may generate registration data 310 based on additional information. For instance, registration system 316 may receive third signals from sensors 118 of tracking system 116 and determine, based on the third signals, positions of fourth points in the physical coordinate system corresponding to tracking marker 804 while the tip of digitizer 800 palpates a humeral head of humerus 112. Additionally, registration system 316 may receive fourth signals from sensors 118 of tracking system 116 and determine, based on the fourth signals, positions of fifth points in the physical coordinate system corresponding to tracking marker 804 while the tip of digitizer 800 palpates a metaphysis of humerus 112. Registration system 316 may then generate registration data 310 based on the third points, the fourth points, the fifth points, and the virtual model of the humerus, e.g., using an ICP algorithm.
- registration system 316 may perform a process (e.g., as shown in the examples of FIG. 11 to FIG. 13) to validate the registration.
- registration system 316 may receive third signals from sensors 118 of tracking system 116 after generating registration data 310.
- Registration system 316 may determine, based on registration data 310 and the third signals, a distance between a location on a surface of humerus 112 and the tip of digitizer 800.
- Registration system 316 may cause MR device 104 to display a first virtual element (e.g., virtual element 1100, 1200, 1302) at the location on the surface of humerus 112 and to display a second virtual element (e.g., virtual element 1102, 1202, 1304) indicating the determined distance of a tip of digitizer 800 to the location on the surface of humerus 112.
- Registration system 316 may refine registration data 310 based on the determined distance being greater than a threshold while the tip of the digitizer is positioned at the location on the surface of the humerus.
- processing system 102 may determine, based on registration data 310, points in the physical coordinate system corresponding to points of a virtual element defined in the virtual coordinate system.
- Virtual guidance system 320 may cause MR device 104 to display a virtual element (e.g., virtual elements 1100, 1200, 1302, virtual elements for resecting a portion of the humerus, etc.) such that the virtual element appears to a user to be located at the determined points in the physical coordinate system.
- processors 302 of computing system 300 may be configured to control communication interface 306 to output instructions to MR device 104 to display, via user interface 422, a virtual bone model (not depicted) and a virtual planned cutting plane (not depicted) adjacent to a bone, e.g., humerus 112, based on registration data 310.
- the virtual bone model and the virtual planned cutting plane may be rotated to correspond to the position of humerus 112.
- the user may determine whether the registration process was successful based on the display of the virtual bone model and virtual planned cutting plane adjacent to humerus 112.
- the user may proceed with a next step of the surgery, e.g., the arthroplasty.
- the registration process may be unsuccessful due to an incorrect placement of a tracking marker, e.g., tracking marker 110.
- the user may determine to restart the registration process or otherwise update registration data 310.
- FIG. 15 is a conceptual diagram illustrating an example of identifying a position of sawblade 122 according to techniques of this disclosure.
- oscillating saw 120 includes a sawblade 122.
- a tracking marker 124 is attached to a body of saw 120.
- a tracking structure 1500 includes a support body 1502 and a tracking marker 1504. Tracking marker 1504 is connected to support body 1502.
- Support body 1502 defines a recess to accommodate sawblade 122.
- the recess is a rectangular opening that passes through support body 1502.
- the recess is an indented region on a lower surface of support body 1502.
- the lower surface of support body 1502 may be a surface of support body 1502 opposite the surface to which tracking marker 1504 is attached.
- the recess is an indented region on an upper surface of support body 1502.
- the upper surface of support body 1502 may be the surface of support body 1502 to which tracking marker 1504 is attached.
- support body 1502 defines multiple recesses, such as multiple slots. Each of the slots may correspond to a different sawblade thickness.
- registration system 316 determines a spatial relationship between tracking marker 124 and the bottom of sawblade 122. To do so, a user inserts sawblade 122 into the recess defined in support body 1502 of tracking structure 1500.
- registration system 316 may determine the spatial relationship between tracking marker 124 and the bottom of sawblade 122. For instance, in the example of FIG. 15, if sawblade 122 is thicker, tracking marker 1504 and tracking marker 124 are farther from one another vertically. If sawblade 122 is thinner, tracking marker 1504 and tracking marker 124 are closer to each other vertically.
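- The geometry of this step might be sketched as follows; the recess-floor offset is a hypothetical constant representing the known geometry of support body 1502 relative to tracking marker 1504:

```python
import numpy as np

# Hypothetical offset (mm) from tracking marker 1504 to the floor of the
# recess in support body 1502, known from the structure's geometry.
RECESS_FLOOR_OFFSET = np.array([0.0, -18.0, 0.0])

def blade_bottom_in_saw_marker_frame(R_saw, t_saw, R_struct, t_struct):
    """With sawblade 122 seated in the recess, the recess floor coincides
    with the blade's lower surface; express that point in the frame of
    tracking marker 124 so it remains usable after the structure is removed."""
    bottom_world = R_struct @ RECESS_FLOOR_OFFSET + t_struct
    return R_saw.T @ (bottom_world - t_saw)  # inverse of the saw-marker pose
```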
- FIG. 16 is a conceptual diagram illustrating an example of identifying a position of sawblade 122 according to techniques of this disclosure.
- MR device 104 displays a virtual element 1600 that guides the user where to place support body 1502 during a process of identifying a position of sawblade 122.
- registration system 316 may cause MR device 104 to display instructions to position sawblade 122 in a recess of support body 1502, e.g., by displaying a virtual representation (e.g., virtual element 1600) of one or more surfaces of support body 1502 at a position along a lateral (non-tip) side of sawblade 122.
- FIG. 17A is a conceptual diagram illustrating a profile view of an example tracking structure 1500 according to techniques of this disclosure.
- support body 1502 defines a recess 1700 into which sawblade 122 may be positioned during the process of identifying the position of sawblade 122.
- FIG. 17B is a conceptual diagram illustrating a profile view of an example tracking structure 1500 according to techniques of this disclosure.
- support body 1502 is shaped to define a first recess 1750 into which sawblade 122 may be positioned during the process of identifying the position of sawblade 122. Additionally, support body 1502 is shaped to define a second recess 1752 into which sawblade 122 may be positioned during a step of confirming the position of sawblade 122.
- FIG. 18 is a conceptual diagram illustrating an example of identifying a spatial orientation of sawblade 122 according to techniques of this disclosure.
- the process of FIG. 18 for identifying a position of sawblade 122 may be used with respect to a surgery to resect a humeral head of humerus 112 or with respect to any other bone or organ. Ensuring that sawblade 122 is correctly angled with respect to saw 120 may be another important factor when providing virtual guidance on how to make a bone cut. For example, if sawblade 122 is angled upward or downward relative to saw 120, and this is not accounted for, it may be possible for MR device 104 to indicate cut guidance at the wrong position. Additionally, sawblade 122 (or tracking marker 124) may be rotated relative to the body of saw 120. If this rotation is not properly accounted for, it may be possible for MR device 104 to indicate cut guidance at the wrong angle.
- MR device 104 may display a set of expected plane elements 1800A, 1800B, 1800C, and 1800D (collectively, “expected plane elements 1800”).
- Registration system 316 may determine the positions of expected plane elements 1800 based on the position of tracking marker 124.
- Expected plane elements 1800 are termed “expected” because they indicate a plane in which sawblade 122 would be expected to operate (i.e., an expected operating plane of sawblade 122), given the position of tracking marker 124.
- MR device 104 may display a set of current plane elements 1802A, 1802B, 1802C, and 1802D (collectively, “current plane elements 1802”).
- Registration system 316 may determine the positions of current plane elements 1802 based on the position of tracking marker 1504 of tracking structure 1500.
- MR device 104 may change the color (or some other attribute such as texture) of the current plane element depending on whether the current plane element is above or below its corresponding expected plane element.
- at least one of the current or expected plane element of the pair has a visual property (e.g., color, texture, etc.) based on the position of the current plane element of the pair relative to the expected operating plane of sawblade 122.
- current plane element 1802A is below its corresponding expected plane element 1800A, so current plane element 1802A is a first color.
- Current plane element 1802B is above its corresponding expected plane element 1800B, so current plane element 1802B is a second different color.
- Current plane elements 1802 may have a third color (e.g., the same color as expected plane elements 1800 or another color) if current plane elements 1802 are at the same position as their corresponding expected plane elements 1800.
- a user may be able to determine, based on the colors of current plane elements 1802, whether tracking structure 1500 is aligned with the plane in which sawblade 122 would be expected to operate.
- Tracking structure 1500 might not be aligned with the expected operating plane of sawblade 122 because sawblade 122 is not correctly positioned in the recess of tracking structure 1500, because sawblade 122 itself is not in the expected operating plane of sawblade 122, or both. In any of these scenarios, the user may be able to use expected plane elements 1800 and current plane elements 1802 to make adjustments.
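- The above/below color logic might reduce to a signed point-to-plane distance, as in the Python sketch below; the tolerance and color names are illustrative assumptions, not values from this disclosure:

```python
import numpy as np

def plane_element_color(element_pos, plane_point, plane_normal, tol_mm=0.5):
    """Color a current plane element by its signed distance to the expected
    operating plane (plane_normal is assumed to be unit length)."""
    signed = float(np.dot(element_pos - plane_point, plane_normal))
    if abs(signed) <= tol_mm:
        return "white"                 # treated as lying on the plane
    return "orange" if signed < 0 else "purple"
```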
- FIG. 19A is a conceptual diagram illustrating an example of identifying a position of sawblade 122 according to techniques of this disclosure.
- support body 1502 of tracking structure 1500 is positioned at a tip of sawblade 122 instead of alongside sawblade 122.
- the design of support body 1502 is an alternative to the design shown in FIGS. 15-17.
- registration system 316 may cause MR device 104 to display instructions to position sawblade 122 in a recess of support body 1502.
- registration system 316 may cause MR device 104 to instruct the user to position a tip of sawblade 122 in a slot-shaped recess 1902.
- recess 1902 of support body 1502 is closed-ended such that sawblade 122 cannot be inserted completely through support body 1502. In other examples, recess 1902 of support body 1502 is open-ended such that sawblade 122 can be inserted completely through support body 1502.
- support body 1502 of tracking structure 1500 defines a second slot-shaped recess.
- the user may position the tip of sawblade 122 in the second recess.
- the second recess has a predefined depth and a predefined distance from marker 1504.
- registration system 316 may use the predefined depth and the predefined distance, along with the position of marker 1504, to confirm the position of the tip of sawblade 122.
- the second recess may be on the same side or a different side of support body 1502 as recess 1902.
- FIG. 19B is a conceptual diagram illustrating an example tracking structure 1920 for identifying the position of a sawblade according to techniques of this disclosure.
- tracking structure 1920 includes a support body 1922 and a tracking marker 1924.
- Support body 1922 defines a slot 1926 into which a tip of sawblade 122 may be inserted during the process of identifying a position of sawblade 122.
- FIG. 19C is a conceptual diagram illustrating a bottom view of tracking structure 1920. Slot 1926 may be open-ended or closed-ended.
- FIG. 19D is a conceptual diagram illustrating an alternative tracking structure 1940 for identifying the position of a sawblade according to techniques of the disclosure.
- tracking structure 1940 includes a support body 1942 and a tracking marker 1944.
- Support body 1942 defines a slot 1946 into which a side of sawblade 122 may be inserted during the process of identifying a position of sawblade 122.
- Support body 1942 defines a verification slot 1948 into which the side of sawblade 122 may be inserted to confirm the position of sawblade 122.
- Slot 1946 and verification slot 1948 may be defined on the same side of support body 1942.
- slot 1946 and slot 1948 are formed parallel with one another and generally parallel with the major outer surfaces of support body 1942.
- FIG. 19E is a conceptual diagram illustrating a second alternative tracking structure 1960 for identifying the position of a sawblade according to techniques of the disclosure.
- tracking structure 1960 includes a support body 1962 and a tracking marker 1964.
- Support body 1962 defines a slot 1966 into which a side of sawblade 122 may be inserted during the process of identifying a position of sawblade 122.
- Support body 1962 defines a verification slot 1968 into which the side of sawblade 122 may be inserted to confirm the position of sawblade 122.
- Slot 1966 and verification slot 1968 may be defined diagonally relative to a top and bottom surface of support body 1962.
- FIG. 20 is a flowchart illustrating an example operation for identifying a position of sawblade 122 for MR-based guidance according to techniques of this disclosure.
- registration system 316 may receive first signals from one or more sensors 118 of tracking system 116 (2000).
- the first signals may include RGB images, depth images, etc.
- registration system 316 may cause MR device 104 to display instructions to position sawblade 122 in a recess of support body 1502 of tracking structure 1500.
- Registration system 316 may determine, based on the first signals, first points corresponding to a first tracking marker 124 attached to a body of a saw 120 (2002).
- the first points may have coordinates defined in a physical coordinate system and may be associated with data that identify the points.
- the first points may include points at the corners of tracking marker 124.
- Registration system 316 may generate the data that identify the corner points based on the patterns on the faces of tracking marker 124.
- registration system 316 may determine that a first corner point is the top corner between the first and second faces, a second corner point is the bottom corner between the first and second faces, and so on.
- registration system 316 may determine, based on the first signals, second points corresponding to a second tracking marker 1504 of tracking structure 1500 while sawblade 122 of saw 120 is positioned in recess 1700 defined by support body 1502 of tracking structure 1500 (2004).
- the second points are also defined in the physical coordinate system.
- registration system 316 may determine, based on the second points, a planar orientation of the tracking structure and may cause MR device 104 to display virtual elements (e.g., current plane elements 1802 and expected plane elements 1800 of FIG. 18) that indicate whether the planar orientation of the tracking structure is aligned with a cutting plane of sawblade 122.
- Registration system 316 may generate, based on the first points and the second points, position identification data that specify a position of a lower edge of sawblade 122 relative to first tracking marker 124 (2006).
- the position identification data generated by registration system 316 may indicate that the lower edge of sawblade 122 is at a specific position in space relative to one or more points on tracking marker 124.
- the position identification data may be defined in terms of one or more vectors in the physical coordinate system.
- virtual guidance system 320 may obtain second signals from sensors 118 of tracking system 116. Virtual guidance system 320 may determine, based on the second signals, third points corresponding to tracking marker 124. Virtual guidance system 320 may determine, based on the third points and the position identification data, guidance data that guide the user to position the lower edge of the sawblade at a location on a bone of a patient while cutting the bone. Virtual guidance system 320 may cause MR device 104 to display the virtual guidance. An example of displaying guidance data is provided below.
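- Re-deriving the lower edge from a later tracking frame might be as simple as the following sketch, where edge_pts_marker_frame is the stored position identification data expressed relative to tracking marker 124 (an assumption about the storage format for illustration):

```python
import numpy as np

def lower_edge_now(R_saw_now, t_saw_now, edge_pts_marker_frame):
    """Re-project stored lower-edge points of sawblade 122 into the physical
    coordinate system from the current pose of tracking marker 124; the
    result can drive placement of the current plane elements."""
    return np.asarray(edge_pts_marker_frame) @ R_saw_now.T + t_saw_now
```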
- FIG. 21 is a conceptual diagram illustrating example virtual elements for guiding resection of a humeral head 802 according to techniques of this disclosure.
- MR device 104 displays planned plane elements 2100A, 2100B, 2100C (collectively, “planned plane elements 2100”).
- in other examples, there may be more or fewer planned plane elements 2100.
- one or more of planned plane elements 2100 may be hidden from the perspective shown in FIG. 21.
- Each of planned plane elements 2100 may indicate a location on a planned cutting plane through a bone (e.g., humerus 112) of a patient.
- the planned cutting plane may be a 2-dimensional plane along which sawblade 122 is planned to move while cutting the bone according to a surgical plan.
- MR device 104 displays current plane elements 2102A, 2102B, 2102C (collectively, “current plane elements 2102”). In other examples, there may be more or fewer current plane elements 2102. For instance, one or more of current plane elements 2102 may be hidden from the perspective shown in FIG. 21. Each of current plane elements 2102 may indicate a location on a current cutting plane of sawblade 122 (not shown in FIG. 21). MR device 104 may determine the positions of planned plane elements 2100 based on the position of tracking marker 110. MR device 104 may determine the positions of current plane elements 2102 based on the position of tracking marker 110 and tracking marker 124 attached to the body of saw 120.
- planned plane element 2100C and current plane element 2102C are at the same position, which may make planned plane element 2100C indistinguishable from current plane element 2102C.
- a user may be able to determine, based on the colors of current plane elements 2102, whether a bottom of sawblade 122 is aligned with a planned cutting plane through the bone (e.g., a planned cutting plane to resect humeral head 802 from humerus 112).
- planned plane elements 2100 and current plane elements 2102 have crescent shapes.
- planned plane elements 2100 and current plane elements 2102 have other shapes.
- planned plane elements 2100 and current plane elements 2102 may have circular shapes, linear shapes, angled shapes, square shapes, and so on.
- planned plane elements 2100 and current plane elements 2102 may have 2-dimensional or 3-dimensional shapes.
- Planned plane elements 2100 are not contiguous with each other.
- current plane elements 2102 are not contiguous with each other.
- each of planned plane elements 2100 and current plane elements 2102 is discrete.
- Using discrete plane elements to indicate where to make a bone cut may be advantageous over displaying a full plane of the bone cut. For instance, displaying a rectangular plane element oriented in 3 dimensions along the planned cut plane may obscure the user’s vision of the bone and may not be as helpful in indicating how to adjust the position of sawblade 122.
- Displaying two full planes may further hinder the user’s vision of the surgical site and it may be more difficult for the user to understand how to reorient the sawblade to align the sawblade with the planned cut plane.
- MR device 104 may also display an entry line element 2104.
- Entry line element 2104 is a virtual element that indicates a line in the bone at which the user is to insert sawblade 122 to make the planned cut on the bone.
- virtual guidance system 320 may cause MR device 104 to display a virtual line element (e.g., entry line element 2104) such that the virtual line element appears to the user to be located on the bone at locations where the planned cutting plane intersects the bone.
- MR device 104 may display a virtual humeral model that visually overlaps (e.g., is superimposed on) humeral head 802.
- the virtual humeral model is not shown in the figures because it would overlap with humerus 112 itself.
- the virtual humeral model may be a virtual model of a portion of humerus 112 that includes humeral head 802.
- the virtual humeral model may be semitransparent, which may have the effect of darkening humeral head 802, which may make it easier for the user to see entry line element 2104 and/or other virtual elements. Additionally, the user may be able to determine, based on alignment of the virtual humeral model with humerus 112, that registration between humerus 112 and the virtual elements remains valid.
- FIG. 22 is a flowchart illustrating an example operation for providing cut guidance according to techniques of this disclosure.
- virtual guidance system 320 may receive first signals from one or more sensors of a tracking system (2200).
- Virtual guidance system 320 may determine, based on the first signals, positions for a plurality of planned plane elements 2100 (2202). Additionally, virtual guidance system 320 may determine, based on the first signals, positions for a plurality of current plane elements 2102 (2204).
- Each of planned plane elements 2100 indicates a location on a planned cutting plane through a bone (e.g., humerus 112) of a patient. Planned plane elements 2100 are not contiguous with each other. In other words, planned plane elements 2100 are visually separate virtual elements and not part of the same visible virtual element, such as a single virtual element representing the planned cutting plane.
- each of current plane elements 2102 corresponds to one of planned plane elements 2100. For instance, in the example of FIG. 21, current plane element 2102A corresponds to planned plane element 2100A, and current plane element 2102B corresponds to planned plane element 2100B.
- Each of current plane elements 2102 indicates a location on a current cutting plane of the sawblade.
- Current plane elements 2102 are not contiguous with each other.
- Virtual guidance system 320 may cause MR device 104 to concurrently display the planned plane elements at the determined positions for the planned plane elements and the current plane elements at the determined positions for the current plane elements (2206). For one or more pairs of corresponding current plane elements and planned plane elements, at least one of the current plane element of the pair or the planned plane element of the pair has a visual property (e.g., color, texture, etc.) that is based on the position for the current plane element of the pair relative to the planned cutting plane. For instance, virtual guidance system 320 may determine the visual property based on whether the location indicated by the current plane element of the pair is above or below the planned cutting plane. In the example of FIG. 21, current plane element 2102A has a first color/texture because current plane element 2102A is below the planned cutting plane (as indicated by planned plane element 2100A) and current plane element 2102B has a second, different color/texture because current plane element 2102B is above the planned cutting plane (as indicated by planned plane element 2100B).
- Virtual guidance system 320 may cause MR device 104 to update positions of the current plane elements based on changes to the current cutting plane of the sawblade.
- Registration system 316 may determine, based on the third signals, second points corresponding to tracking marker 804 of digitizer 800 while a tip of digitizer 800 palpates the bicipital groove via a slot defined in attachment body 108 of humeral tracking structure 106. Registration system 316 may determine, based on the second points and a predetermined spatial relationship between tracking marker 804 and the tip of digitizer 800, third points corresponding to the bicipital groove. Positions of the first, second, and third points may be defined in a physical coordinate system. Registration system 316 may generate, based on the third points and a virtual model of humerus 112 having points defined in the virtual coordinate system, registration data 310 defining a relationship between the physical coordinate system and the virtual coordinate system. Virtual guidance system 320 may determine the positions for planned plane elements 2100 in the physical coordinate system based on registration data 310 and a position of tracking marker 110 in the physical coordinate system.
- FIG. 23 is a conceptual diagram illustrating an example ring-shaped virtual element 2300 for guiding resection of a humeral head according to techniques of this disclosure.
- Virtual guidance system 320 may cause MR device 104 to display ring-shaped virtual element 2300 such that ring-shaped virtual element 2300 is centered on an axis passing through a center of curvature of humeral head 802 and normal to an anatomical neck of humerus 112.
- ring-shaped virtual element 2300 may be centered on a different axis.
- ring-shaped virtual element 2300 may be centered on an axis passing through the center of curvature of humeral head 802 and normal to a planned cutting plane of humerus 112.
- virtual guidance system 320 may cause MR device 104 to display ring-shaped virtual element 2300 at different positions centered along the axis.
- Ring-shaped virtual element 2300 may be partially transparent.
- Ring-shaped virtual element 2300 is spatially aligned with a current cutting plane of sawblade 122.
- virtual guidance system 320 may cause MR device 104 to tilt ring-shaped virtual element 2300 in 3D space to match the current cutting plane of sawblade 122 while keeping a center of ring-shaped virtual element 2300 centered on the axis.
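- The tilt might be computed as the rotation carrying the ring’s rest normal onto the current cutting-plane normal, as in this generic sketch (the Rodrigues form below assumes the two unit vectors are not exactly opposite):

```python
import numpy as np

def rotation_between(a, b):
    """Rotation matrix taking unit vector a onto unit vector b; usable to tilt
    ring-shaped virtual element 2300 while its center stays on the axis."""
    v = np.cross(a, b)
    c = float(np.dot(a, b))            # undefined when a == -b (c == -1)
    K = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])
    return np.eye(3) + K + (K @ K) / (1.0 + c)
```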
- Virtual guidance system 320 may also cause MR device 104 to display an entry line element 2304.
- Entry line element 2304 is a virtual element that indicates a line in the bone at which the user is to insert sawblade 122 to make the planned cut on the bone.
- virtual guidance system 320 may cause MR device 104 to display a virtual element 2306 indicating a difference in height (ΔHeight), a difference in frontal angle (ΔFrontal), and a difference in sagittal angle (ΔSagittal) of the current cutting plane of sawblade 122 relative to the planned cutting plane.
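- One way to compute the three readouts is sketched below; the frontal and sagittal axis conventions are assumptions (the disclosure does not fix them here), and all normals and axes are assumed to be unit vectors:

```python
import numpy as np

def plane_deltas(cur_point, cur_normal, plan_point, plan_normal,
                 frontal_axis, sagittal_axis):
    """Height offset along the planned plane normal, plus signed tilt angles
    (degrees) of the current cutting plane about the frontal and sagittal axes."""
    d_height = float(np.dot(cur_point - plan_point, plan_normal))

    def tilt_about(axis):
        # Project both normals onto the plane orthogonal to the axis and
        # measure the signed angle between the projections.
        def proj(n):
            p = n - np.dot(n, axis) * axis
            return p / np.linalg.norm(p)
        a, b = proj(cur_normal), proj(plan_normal)
        return float(np.degrees(np.arctan2(np.dot(np.cross(b, a), axis),
                                           np.dot(a, b))))

    return d_height, tilt_about(frontal_axis), tilt_about(sagittal_axis)
```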
- MR device 104 may display a virtual humeral model that visually overlaps (e.g., is superimposed on) humeral head 802.
- the virtual humeral model may be a virtual model of a portion of humerus 112 that includes humeral head 802.
- the virtual humeral model may be semitransparent, which may have the effect of darkening humeral head 802, which may make it easier for the user to see entry line element 2304 and/or other virtual elements.
- the user may be able to determine, based on alignment of the virtual humeral model with humerus 112, that registration between humerus 112 and the virtual elements remains valid.
- a line corresponding to the current cutting plane may be shown on the virtual humeral model.
- Each of supplemental elements 2402 (FIG. 24) is a version of ring-shaped virtual element 2400, rotated orthogonal to a plane of ring-shaped virtual element 2400.
- Supplemental elements 2402 may help the user determine the angle and position of sawblade 122 even when the user cannot see all of ring-shaped virtual element 2400.
- MR device 104 may display a virtual humeral model that visually overlaps (e.g., is superimposed on) humeral head 802.
- the virtual humeral model may be a virtual model of a portion of humerus 112 that includes humeral head 802.
- the virtual humeral model may be semitransparent, which may have the effect of darkening humeral head 802, which may make it easier for the user to see entry line element 2404 and/or other virtual elements.
- the user may be able to determine, based on alignment of the virtual humeral model with humerus 112, that registration between humerus 112 and the virtual elements remains valid.
- a line corresponding to the current cutting plane may be shown on the virtual humeral model.
- In the example of FIG. 25, planned plane virtual elements 2500 may be separated from each other on the planned cutting plane by a 90° angle, or another angle.
- Current plane virtual elements 2502 may be separated from each other on the current cutting plane by a 90° angle, or another angle.
- the line elements of current plane virtual elements 2502 correspond to a current cutting plane.
- the user may be able to determine that sawblade 122 is aligned with the planned cutting plane when the line elements of current plane virtual elements 2502 match the line elements of planned plane virtual elements 2500.
- MR device 104 may position planned plane virtual elements 2500 and current plane virtual elements 2502 so that the planned plane virtual elements 2500 and current plane virtual elements 2502 appear to the user as if there were invisible mirrors behind and next to the humeral head, which reflect images of the planned and current cutting planes.
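- The “invisible mirror” placement amounts to reflecting sample points of each cutting plane across a virtual mirror plane, e.g., as in this sketch (the mirror’s position and normal are assumptions chosen by the visualization, not values from this disclosure):

```python
import numpy as np

def mirror_about_plane(points, mirror_point, mirror_normal):
    """Reflect cutting-plane sample points across a virtual mirror plane
    (mirror_normal is assumed to be unit length)."""
    pts = np.asarray(points, dtype=float)
    d = (pts - mirror_point) @ mirror_normal       # signed distances to the mirror
    return pts - 2.0 * d[:, None] * mirror_normal  # reflected points
```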
- Virtual guidance system 320 may also cause MR device 104 to display an entry line element 2504.
- Entry line element 2504 is a virtual element that indicates a line in the bone at which the user is to insert sawblade 122 to make the planned cut on the bone.
- virtual guidance system 320 may cause MR device 104 to display a virtual element 2506 indicating a difference in height (ΔHeight), a difference in frontal angle (ΔFrontal), and a difference in sagittal angle (ΔSagittal) of the current cutting plane of sawblade 122 relative to the planned cutting plane.
- MR device 104 may display a virtual humeral model that visually overlaps (e.g., is superimposed on) the humeral head.
- the virtual humeral model may be a virtual model of a portion of the humerus that includes the humeral head.
- the virtual humeral model may be semitransparent, which may have the effect of darkening the humeral head, which may make it easier for the user to see entry line element 2504 and/or other virtual elements.
- the user may be able to determine, based on alignment of the virtual humeral model with the humerus, that registration between the humerus and the virtual elements remains valid.
- a line 2508 corresponding to the current cutting plane may be shown on the virtual humeral model.
- FIG. 26 is a conceptual diagram illustrating example virtual elements for guiding resection of a humeral head according to techniques of this disclosure.
- virtual guidance system 320 may cause MR device 104 to display a planned virtual element 2600 and a current virtual element 2602.
- Planned virtual element 2600 includes a circle 2604 corresponding to the planned cutting plane through the humerus.
- Planned virtual element 2600 also includes a circle 2606 in a plane orthogonal to the planned cutting plane and orthogonal to a planned insertion axis of sawblade 122.
- current virtual element 2602 includes a circle 2608 corresponding to a current cutting plane.
- Current virtual element 2602 also includes a circle 2610 that is orthogonal to the current cutting plane and orthogonal to a current axis of sawblade 122.
- Virtual guidance system 320 may cause MR device 104 to update one or more visual properties of planned virtual element 2600 and current virtual element 2602 based on the alignment of the planned cutting plane and the current cutting plane. For instance, MR device 104 may update the visual properties of circles 2604, 2606, 2608, 2610 based on the alignment of the planned cutting plane and the current cutting plane such that the visual properties of circles 2604, 2606, 2608, 2610 match when the planned cutting plane and the current cutting plane are aligned. The visual properties of circles 2604, 2606, 2608, and 2610 may change to indicate a direction or angle by which the planned cutting plane and the current cutting plane are not aligned.
- For instance, circles 2604, 2606, 2608, and 2610 may be displayed in a first color (e.g., red) when the current cutting plane is not aligned with the planned cutting plane and in a second color (e.g., green) when the current cutting plane is aligned with the planned cutting plane.
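- As a hedged illustration only, the color behavior described above might be implemented as a simple blend from the first color toward the second color as the errors shrink; the tolerance values and the linear blend below are assumptions, not part of this disclosure.

```python
def alignment_color(angle_error_deg, height_error_mm,
                    angle_tol_deg=1.0, height_tol_mm=1.0):
    """Blend from red (misaligned) toward green (aligned within tolerance)."""
    a = min(abs(angle_error_deg) / angle_tol_deg, 1.0)
    h = min(abs(height_error_mm) / height_tol_mm, 1.0)
    t = max(a, h)                     # 0.0 = aligned, 1.0 = fully misaligned
    red, green = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)
    return tuple((1.0 - t) * g + t * r for r, g in zip(red, green))

print(alignment_color(0.0, 0.0))    # (0.0, 1.0, 0.0): green when aligned
print(alignment_color(15.0, 3.0))   # (1.0, 0.0, 0.0): red when misaligned
```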
- virtual guidance system 320 may cause MR device 104 to display a transverse virtual element 2612.
- Transverse virtual element 2612 may be orthogonal to the current cutting plane and have a diameter aligned with a lengthwise axis of sawblade 122. Transverse virtual element 2612 may help the user visualize the angle of sawblade 122.
- MR device 104 may display a virtual humeral model that visually overlaps (e.g., is superimposed on) the humeral head.
- the virtual humeral model may be a virtual model of a portion of the humerus that includes the humeral head.
- the virtual humeral model may be semitransparent, which may have the effect of darkening the humeral head, which may make it easier for the user to see entry line element 2616 and/or other virtual elements.
- the user may be able to determine, based on alignment of the virtual humeral model with the humerus, that registration between the humerus and the virtual elements remains valid.
- a line corresponding to the current cutting plane may be shown on the virtual humeral model.
- FIG. 27 is a conceptual diagram illustrating first example elevation and angular virtual elements for guiding resection of a humeral head according to techniques of this disclosure.
- virtual guidance system 320 causes MR device 104 to display a frontal angle element 2700, a sagittal angle element 2702, and an elevation element 2704.
- Frontal angle element 2700 includes a marker 2706 that indicates a frontal angle of a planned cutting plane and a marker 2708 that indicates a frontal angle of a current cutting plane of sawblade 122.
- sagittal angle element 2702 includes a marker 2710 that indicates a sagittal angle of the planned cutting plane and a marker 2712 that indicates a sagittal angle of the current cutting plane of sawblade 122.
- Elevation element 2704 includes a marker 2714 and a marker 2716 that indicate the relative elevation of the current cutting plane relative to the elevation of the planned cutting plane.
- Virtual guidance system 320 may also cause MR device 104 to display an entry line element 2718.
- Entry line element 2718 is a virtual element that indicates a line in the bone at which the user is to insert sawblade 122 to make the planned cut on the bone.
- MR device 104 may display a virtual humeral model that visually overlaps (e.g., is superimposed on) the humeral head.
- the virtual humeral model may be a virtual model of a portion of the humerus that includes the humeral head.
- the virtual humeral model may be semitransparent, which may have the effect of darkening the humeral head, which may make it easier for the user to see entry line element 2718 and/or other virtual elements.
- the user may be able to determine, based on alignment of the virtual humeral model with the humerus, that registration between the humerus and the virtual elements remains valid.
- the virtual humeral model may be divided into two parts, so that a line 2720 is defined along the current cutting plane.
- FIG. 28 is a conceptual diagram illustrating second example elevation and angular virtual elements for guiding resection of a humeral head according to techniques of this disclosure.
- frontal angle element 2800, sagittal angle element 2802, and elevation element 2804 have the same functionality as frontal angle element 2700, sagittal angle element 2702, and elevation element 2704 of FIG. 27.
- Virtual guidance system 320 may also cause MR device 104 to display an entry line element 2806.
- Entry line element 2806 is a virtual element that indicates a line in the bone at which the user is to insert sawblade 122 to make the planned cut on the bone.
- MR device 104 may display a virtual humeral model that visually overlaps (e.g., is superimposed on) the humeral head.
- the virtual humeral model may be a virtual model of a portion of the humerus that includes the humeral head.
- the virtual humeral model may be semitransparent, which may have the effect of darkening the humeral head, which may make it easier for the user to see entry line element 2806 and/or other virtual elements.
- the user may be able to determine, based on alignment of the virtual humeral model with the humerus, that registration between the humerus and the virtual elements remains valid. Furthermore, in some examples, a line corresponding to the current cutting plane may be shown on the virtual humeral model.
- FIG. 29 is a conceptual diagram illustrating example elevation and angle virtual elements for guiding resection of a humeral head according to techniques of this disclosure.
- virtual guidance system 320 causes MR device 104 to display a frontal angle element 2900, a sagittal angle element 2902, and an elevation element 2904.
- Each of frontal angle element 2900, sagittal angle element 2902, and elevation element 2904 is divided into two halves.
- both halves may have the same visual property (e.g., same color or texture) if the current cutting plane is aligned with the planned cutting plane.
- the visual properties of the different halves of frontal angle element 2900, sagittal angle element 2902, and elevation element 2904 are different in a way that indicates how to adjust the current cutting plane. For instance, an upper half of elevation element 2904 may be highlighted to indicate that the user should increase the elevation of the current cutting plane. In another example, a front or back half of sagittal angle element 2902 may be highlighted to indicate that the user should tilt sawblade 122 more upward or downward in the sagittal plane. In another example, a left or right half of frontal angle element 2900 may be highlighted to indicate that the user should tilt sawblade 122 more leftward or rightward in the frontal plane.
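- A hedged sketch of the half-highlighting logic described above follows; the function name, tolerance, and sign conventions are illustrative assumptions rather than the disclosure's own definitions.

```python
def halves_to_highlight(d_height_mm, d_frontal_deg, d_sagittal_deg, tol=0.5):
    """Map signed alignment errors to the element halves that cue a correction."""
    cues = {}
    if abs(d_height_mm) > tol:
        # Current plane too low -> highlight the upper half (raise the blade).
        cues["elevation"] = "upper" if d_height_mm < 0 else "lower"
    if abs(d_frontal_deg) > tol:
        cues["frontal"] = "left" if d_frontal_deg > 0 else "right"
    if abs(d_sagittal_deg) > tol:
        cues["sagittal"] = "front" if d_sagittal_deg > 0 else "back"
    return cues   # empty dict: both halves keep the aligned visual property

print(halves_to_highlight(-2.0, 0.0, 1.5))
# -> {'elevation': 'upper', 'sagittal': 'front'}
```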
- Virtual guidance system 320 may also cause MR device 104 to display an entry line element 2906.
- Entry line element 2906 is a virtual element that indicates a line in the bone at which the user is to insert sawblade 122 to make the planned cut on the bone.
- MR device 104 may display a virtual humeral model that visually overlaps (e.g., is superimposed on) the humeral head.
- the virtual humeral model may be a virtual model of a portion of the humerus that includes the humeral head.
- the virtual humeral model may be semitransparent, which may have the effect of darkening the humeral head, which may make it easier for the user to see entry line element 2906 and/or other virtual elements.
- FIG. 30 is a conceptual diagram illustrating an example elevation element 3000 and bubble-level element 3002 for guiding resection of a humeral head according to techniques of this disclosure.
- Elevation element 3000 includes a marker 3004 and marker 3006.
- Virtual guidance system 320 may cause MR device 104 to update the position of marker 3006 relative to marker 3004 based on a difference in elevation between the planned cutting plane and the current cutting plane. Thus, when marker 3004 and marker 3006 overlap, the planned cutting plane and the current cutting plane have the same elevation.
- bubble-level element 3002 includes a crosshair pattern and a marker element 3008.
- MR device 104 may update the position of marker element 3008 based on angles (e.g., frontal and sagittal angles) of the planned cutting plane relative to the current cutting plane.
- when the current cutting plane is aligned with the planned cutting plane, marker element 3008 is centered on bubble-level element 3002.
- marker element 3008 may be centered on the center of the crosshair pattern of bubble-level element 3002.
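- One possible implementation of the bubble-level behavior is sketched below, assuming the offset of marker element 3008 is simply a scaled 2-D plot of the frontal and sagittal angle errors; the pixels-per-degree scale and the clamping radius are assumptions.

```python
def bubble_marker_offset(d_frontal_deg, d_sagittal_deg,
                         px_per_deg=8.0, radius_px=60.0):
    """Return the (x, y) offset of marker element 3008 from the crosshair center."""
    x = d_frontal_deg * px_per_deg
    y = d_sagittal_deg * px_per_deg
    r = (x * x + y * y) ** 0.5
    if r > radius_px:                    # clamp so the marker stays inside
        x, y = x * radius_px / r, y * radius_px / r
    return x, y

print(bubble_marker_offset(0.0, 0.0))    # (0.0, 0.0): centered when aligned
print(bubble_marker_offset(10.0, 5.0))   # clamped toward the ring's edge
```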
- MR device 104 may display a virtual humeral model that visually overlaps (e.g., is superimposed on) the humeral head.
- the virtual humeral model may be a virtual model of a portion of the humerus that includes the humeral head.
- the virtual humeral model may be semitransparent, which may have the effect of darkening the humeral head, which may make it easier for the user to see the entry line element and/or other virtual elements.
- the user may be able to determine, based on alignment of the virtual humeral model with the humerus, that registration between the humerus and the virtual elements remains valid.
- a line 3012 corresponding to the current cutting plane may be shown on the virtual humeral model.
- MR device 104 may display ring-shaped virtual element 2300 of FIG. 23 along with planned plane virtual elements 2500 and current plane virtual elements 2502.
- MR device 104 may display frontal angle element 2700, sagittal angle element 2702, and/or elevation element 2704 of FIG. 27 along with bubble-level element 3002 of FIG. 30.
- virtual guidance system 320 may receive data indicating user preferences on which virtual elements or combinations of virtual elements to display for guiding humeral head resection.
- the operation of FIG. 22 may apply, with appropriate modifications, with respect to FIGS. 23-30.
- The virtual elements of FIG. 21 and FIGS. 23-30 may be designed to increase the user’s ability to clearly see both the humerus and the sawblade, as well as to intuitively understand how the current cutting plane relates to the planned cutting plane.
- FIGS. 23-30 show a tracking marker connected to a clamping structure.
- tracking markers may be connected to a humeral tracking structure positioned at a bicipital groove of the humerus as described elsewhere in this disclosure.
- FIG. 31 is a conceptual diagram illustrating an example of confirming accuracy of a humeral resection according to techniques of this disclosure.
- a resected surface 3100 of humerus 112 results from resecting a humeral head 3102 of humerus 112.
- a tracking structure 3106 includes a tracking marker 3108 and a support body 3110.
- Support body 3110 has a lower surface 3112 opposite an upper surface 3114 that is connected to tracking marker 3108.
- Lower surface 3112 is flat.
- a user (e.g., a surgeon) may place lower surface 3112 of support body 3110 flat against resected surface 3100 of humerus 112.
- tracking structure 3106 is the same instrument as tracking structure 1500 of FIGS. 15, 16, 17A, 17B, 18, and 19A. In examples in which tracking structure 3106 is the same as tracking structure 1500, tracking structure 3106 may be configured to identify a position and/or thickness of sawblade 122 according to the techniques described with respect to FIG. 15. In other examples, tracking structure 3106 is a separate instrument from tracking structure 1500. In examples in which tracking structure 3106 is a separate instrument from tracking structure 1500, tracking structure 3106 may be similar to tracking structure 1500 in shape and size but may not include recesses, such as slots (e.g., slots 1946 and 1948).
- Processing system 102 may determine a position of tracking marker 3108 within a physical coordinate system. Humerus 112 is registered to the physical coordinate system based on tracking marker 110. Processing system 102 may therefore be able to determine the location in the physical coordinate system of lower surface 3112 of support body 3110, and hence the location of the resected surface 3100 of humerus 112. Furthermore, because the physical coordinate system is registered with a virtual coordinate system in which a planned cutting plane is defined, processing system 102 may determine whether the resected surface 3100 of humerus 112 is positioned and oriented along the planned cutting plane.
- In the example of FIG. 31, MR device 104 may output a virtual element 3116 that indicates a resection level, version angle, and inclination angle of resected surface 3100 determined based on tracking structure 3106.
- the resection level indicates a height of the cut.
- MR device 104 may output a virtual element 3116 indicating a perimeter of a planned cutting plane.
- MR device 104 may additionally output a virtual overlay element (not depicted) on resected surface 3100 to indicate which areas of resected surface 3100 deviate (e.g., deviate above or deviate below) from the planned cutting plane.
- Different visual properties (e.g., colors or textures) of the virtual overlay element may correspond to different deviations from the planned cutting plane. For instance, one color (e.g., green) of the virtual overlay element may correspond to deviations below the planned cutting plane, and another color (e.g., blue) may correspond to deviations above the planned cutting plane.
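- The overlay classification described above could be computed as in the following sketch, which assumes the resected surface is sampled as 3-D points and reuses the example colors given above; the tolerance value is an assumption.

```python
import numpy as np

def overlay_labels(surface_points, plane_point, plane_normal, tol_mm=0.5):
    """Classify each resected-surface point against the planned cutting plane."""
    n = plane_normal / np.linalg.norm(plane_normal)
    d = (surface_points - plane_point) @ n        # signed distances in mm
    return np.where(d > tol_mm, "blue",           # above the planned plane
           np.where(d < -tol_mm, "green",         # below the planned plane
                    "none"))                      # within tolerance

pts = np.array([[0.0, 0.0, 0.8], [0.0, 1.0, -1.2], [1.0, 0.0, 0.1]])
print(overlay_labels(pts, np.zeros(3), np.array([0.0, 0.0, 1.0])))
# -> ['blue' 'green' 'none']
```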
- Tracking structure 3106 may not be specific to any patient but may be generic for all patients, or tracking structure 3106 may have a limited range of two or more sizes, such as a child size, a small adult size, and a large adult size.
- FIGS. 32A-32E are conceptual diagrams illustrating example guidance displayed by a virtual guidance element 3200 for positioning sawblade 122 at a correct position and orientation, in accordance with techniques of this disclosure.
- MR device 104 may display virtual guidance element 3200 in the vicinity of the surgical site, such that the user may easily see both virtual guidance element 3200 and the surgical site.
- virtual guidance system 320 may determine where to position virtual guidance element 3200 relative to humerus 112 prior to displaying virtual guidance element 3200. In some examples, virtual guidance system 320 determines the position of MR device 104 relative to humerus 112 during a collar-fixing procedure of the registration process. During the collar-fixing procedure of the registration process, an optical marker, e.g., tracking marker 124, may be affixed to saw 120, and the user may position sawblade 122 next to humerus 112 such that sensors, e.g., optical sensors 430, of MR device 104 detect tracking marker 124 and tracking marker 110.
- virtual guidance system 320 may determine where to position virtual guidance element 3200 such that virtual guidance element 3200 is visible but disposed away from saw 120, humerus 112, and any tracking markers, e.g., tracking marker 110, tracking marker 124, and tracking marker 3518.
- virtual guidance element 3200 is not directly superimposed on the surgical site but may appear to be at a location adjacent to the surgical site.
- virtual guidance element 3200 is locked to a position relative to the surgical site. Virtual guidance element 3200 may provide an easy-to-understand way for the user to align the current cutting plane with the planned cutting plane.
- Virtual guidance element 3200 includes a divided ring element 3202.
- Divided ring element 3202 includes an enclosed area (e.g., circle, rectangle, ovoid, etc.) bisected by a line.
- the line corresponds to a planned cutting plane.
- MR device 104 may display virtual guidance element 3200 such that the line bisecting divided ring element 3202 appears to a user to be aligned with a planned cutting plane.
- the line bisecting divided ring element 3202 may extend beyond divided ring element 3202.
- Virtual guidance element 3200 may also include an active element 3204 that provides information about the position and orientation of the current cutting plane of sawblade 122.
- An inner edge of active element 3204 (e.g., the flat side of the semicircular active element 3204 shown in FIGS. 32A-32D) corresponds to a superior/inferior angle of the current cutting plane of sawblade 122 relative to the planned cutting plane.
- the inner edge of active element 3204 is an edge of active element 3204 closer to the line through divided ring element 3202, and an outer edge of active element 3204 is an edge of active element 3204 farther from the line through divided ring element 3202.
- If the inner edge of active element 3204 is angled relative to the line of divided ring element 3202, the current cutting plane of sawblade 122 is angled superiorly or inferiorly relative to the planned cutting plane. If the inner edge of active element 3204 is above or below the line of divided ring element 3202, the resection level of the current cutting plane into the bone is inferior or superior to the resection level of the planned cutting plane into the bone.
- the outer edge of active element 3204 corresponds to an anterior/posterior angle of the current cutting plane of sawblade 122 relative to the planned cutting plane. Greater distances between the inner edge of active element 3204 and a center point of the outer edge of active element 3204 correspond to greater anterior/posterior angles of the current cutting plane of sawblade 122 relative to the planned cutting plane.
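- As an illustrative sketch only (the disclosure does not specify the rendering math), the geometry of active element 3204 could be driven by three scalars derived from the alignment errors; the pixel scale factors and dictionary keys below are assumptions.

```python
def active_element_params(d_level_mm, d_sup_inf_deg, d_ant_post_deg,
                          px_per_mm=5.0, px_per_deg=3.0):
    """Scalars that could drive the drawing of active element 3204."""
    return {
        # Offset of the inner edge from the dividing line (resection level).
        "inner_edge_offset_px": d_level_mm * px_per_mm,
        # Tilt of the inner edge relative to the line (superior/inferior).
        "inner_edge_tilt_deg": d_sup_inf_deg,
        # Inner-edge-center to outer-edge distance (anterior/posterior).
        "outer_edge_distance_px": abs(d_ant_post_deg) * px_per_deg,
        # Side of the inner edge on which the outer edge is drawn.
        "outer_edge_side": "below" if d_ant_post_deg > 0 else "above",
    }

# A FIG. 32A-like state: 3 mm low, 0 degrees sup/inf, 15 degrees anterior.
print(active_element_params(3.0, 0.0, 15.0))
```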
- divided ring element 3202 and/or active element 3204 may have shapes other than circles and semicircles, such as full or partial ovals, ellipsoids, squares, rectangles, rhombuses, and so on.
- Virtual guidance element 3200 may be surrounded by a semi-transparent colored, e.g., black, field.
- the colored field may enhance visibility of virtual guidance element 3200.
- In FIG. 32A, the resection level of the current cutting plane is inferior to the resection level of the planned cutting plane (e.g., by 3 mm), the superior/inferior angle of the current cutting plane relative to the planned cutting plane is correct (e.g., 0°), and the current cutting plane is angled anteriorly relative to the planned cutting plane (e.g., by 15°). If the current cutting plane were angled posteriorly relative to the planned cutting plane, the outer edge of active element 3204 would appear above the inner edge of active element 3204 instead of below the inner edge of active element 3204 as shown in FIGS. 32A-32D.
- In FIG. 32B, the resection level of the current cutting plane is inferior to the resection level of the planned cutting plane (e.g., by 3 mm), the current cutting plane is angled superiorly relative to the planned cutting plane (e.g., by 10°), and the current cutting plane is angled anteriorly relative to the planned cutting plane (e.g., by 15°).
- In FIG. 32C, the resection level of the current cutting plane is inferior to the resection level of the planned cutting plane (e.g., by 3 mm), the current cutting plane is angled superiorly relative to the planned cutting plane (e.g., by 10°), and the current cutting plane is angled anteriorly relative to the planned cutting plane by a smaller amount than in FIG. 32B (e.g., by 10°).
- In FIG. 32D, the resection level of the current cutting plane is aligned with the resection level of the planned cutting plane (e.g., 0 mm difference), the current cutting plane is aligned superiorly/inferiorly with the planned cutting plane (e.g., 0°), and the current cutting plane is angled anteriorly relative to the planned cutting plane (e.g., by 4°).
- In FIG. 32E, active element 3204 is not visible because the current cutting plane is correctly aligned with the planned cutting plane.
- processing system 102 may determine a current cutting plane of sawblade 122 of saw 120.
- Processing system 102 may output, for display by MR device 104, a virtual guidance element 3200 that includes a divided ring element 3202 that includes an enclosed area bisected by a line.
- Virtual guidance element 3200 also includes an active element 3204 having an inner edge and an outer edge. A distance between a center of the line and a center of the inner edge of active element 3204 is indicative of a distance between a resection level of a current cutting plane of sawblade 122 of saw 120 into a bone and a resection level of a planned cutting plane through bone 112.
- An angle of the inner edge of active element 3204 relative to the line is indicative of a superior/inferior angle between the current cutting plane of sawblade 122 and the planned cutting plane.
- a length of a line 3206 perpendicular to the inner edge of active element 3204 from the center of the inner edge of active element 3204 to the outer edge of active element 3204 is indicative of an anterior/posterior angle between the current cutting plane of the sawblade and the planned cutting plane.
- Line 3206 may be conceptual and may or may not be visible.
- processing system 102 causes MR device 104 to update active element 3204 based on a change to the current cutting plane of sawblade 122.
- Processing system 102 may output, for display by MR device 104, virtual guidance element 3200 so that the line of divided ring element 3202 is aligned with the planned cutting plane. Processing system 102 may output, for display by MR device 104, virtual guidance element 3200 so that virtual guidance element 3200 and bone 112 are simultaneously visible to a user of MR device 104.
- bone 112 is a humerus
- the planned cutting plane is defined in a virtual coordinate system.
- Processing system 102 may receive first signals from the one or more sensors 118 of tracking system 116.
- Processing system 102 may determine, based on the first signals, first points corresponding to a first tracking marker 110 of a tracking structure 106 that comprises first tracking marker 110 and an attachment body 108 positioned at a bicipital groove of the humerus. Processing system 102 may receive second signals from sensors 118 of tracking system 116. Processing system 102 may determine, based on the second signals, second points corresponding to a second tracking marker of a digitizer 800 while a tip of digitizer 800 palpates the bicipital groove via a slot defined in attachment body 108 of tracking structure 106. Processing system 102 may determine, based on the second points and a predetermined spatial relationship between the second tracking marker and the tip of digitizer 800, third points corresponding to the bicipital groove.
- Positions of the first, second and third points are defined in a physical coordinate system.
- Processing system 102 may generate, based on the third points and a virtual model of the humerus having points defined in the virtual coordinate system, registration data defining a relationship between the physical coordinate system and the virtual coordinate system. Additionally, processing system 102 may receive third signals from one or more sensors 118 of tracking system 116. Processing system 102 may determine, based on the third signals, fourth points corresponding to a third tracking marker 124 attached to a body of saw 120. Processing system 102 may determine the current cutting plane based on the fourth points. For instance, processing system 102 may use a determined spatial relationship between the fourth points and the bottom of sawblade 122 to determine the current cutting plane in the physical coordinate system.
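- A minimal sketch of the plane transformation described above, assuming the saw marker's tracked pose is available as a rotation matrix R and translation t in the physical frame, and that calibration stored the blade's lower-edge plane as a point and normal in the marker's own frame; these representations are illustrative assumptions.

```python
import numpy as np

def current_cutting_plane(R, t, plane_point_marker, plane_normal_marker):
    """Transform the calibrated blade plane from marker frame to physical frame."""
    p_phys = R @ plane_point_marker + t     # points rotate and translate
    n_phys = R @ plane_normal_marker        # normals only rotate
    return p_phys, n_phys / np.linalg.norm(n_phys)

# Marker rotated 90 degrees about z and translated 100 mm along x.
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
p, n = current_cutting_plane(Rz, np.array([100.0, 0.0, 0.0]),
                             np.array([0.0, 0.0, -20.0]),  # blade 20 mm below marker
                             np.array([0.0, 0.0, 1.0]))
print(p, n)   # -> [100.   0. -20.] [0. 0. 1.]
```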
- FIG. 33 is a conceptual diagram illustrating an example MR visualization 3300 that includes a virtual guidance element 3302, in accordance with techniques of this disclosure.
- MR device 104 displays virtual guidance element 3302 so that a user can concurrently see both the virtual guidance element 3302 and a corresponding bone 3304 (or bone model).
- Virtual guidance element 3302 is not superimposed on bone 3304.
- Virtual guidance element 3302 includes a bone model and a planned cut plane.
- Virtual guidance element 3302 may or may not be registered with bone 3304. In examples where virtual guidance element 3302 is registered with bone 3304, virtual guidance element 3302 may be rotated to correspond with the position of bone 3304. Virtual guidance element 3302 may be adjacent to bone 3304.
- the user may confirm the registration process was successful based on MR visualization 3300. In some examples, the user confirms the registration process was successful based on determining that the registration process resulted in reasonable results. By providing MR visualization 3300 to allow the user to confirm the registration process was successful, the techniques of this disclosure may prevent errors in the surgery, e.g., the arthroplasty, which may improve patient outcomes.
- the user may provide user input indicating that the registration process was unsuccessful. Based on the user input, processing system 102 may determine to restart the registration process or otherwise modify the registration process. In some examples, the registration process may be unsuccessful due to an incorrect placement of a tracking marker, such as tracking marker 110. Processing system 102 may control user interface 422 to generate a display including instructions to check the placement of tracking marker 110 responsive to receiving user input indicative of the registration process being unsuccessful.
- FIG. 34 is a conceptual diagram illustrating an example MR visualization 3400 that includes an outline of a bone according to techniques of this disclosure.
- MR device 104 displays a virtual outline element 3402.
- Virtual outline element 3402 provides an outline around a visible portion of a bone.
- MR device 104 may display virtual outline element 3402 during the registration process.
- As more points are taken on the physical bone, processing system 102 refines the relationship between the virtual model of the bone and the physical bone, and the position of virtual outline element 3402 may be refined to better match the physical bone.
- Virtual outline element 3402 may help the user confirm that the registration process has accurately determined the position of the bone.
- FIG. 35A and FIG. 35B are schematic representations of a cutting guide 3502, according to the techniques of this disclosure.
- cutting guide 3502 is configured to facilitate humeral resection along, for example, an entry line element, e.g., entry line element 2104 of FIG. 21.
- FIG. 35A is a side view of cutting guide 3502, and
- FIG. 35B is a top view of cutting guide 3502.
- Cutting guide 3502 includes a handle 3504. Handle 3504 may allow the user or another person to maintain a placement of cutting guide 3502 relative to a bone, e.g., humerus 112, of a patient.
- Cutting guide 3502 includes a body 3516 defining a slot 3506.
- Slot 3506 may be configured to receive a sawblade, e.g., sawblade 122 of FIG. 1, to be inserted to resect humerus 112.
- the user may position a tip of sawblade 122 within slot 3506.
- the user may align sawblade 122 with entry line element 2104.
- body 3516 includes a bone contacting surface 3508.
- Bone contacting surface 3508 may be curved to correspond to a shape of a bone, e.g., humerus 112.
- cutting guide 3502 includes a plurality of bump-like and/or tooth-like projections 3510 on bone contacting surface 3508 of body 3516. Projections 3510 may help prevent slipping or other movement of cutting guide 3502 during humeral resection.
- Cutting guide 3502 can increase stability of sawblade 122 and prevent unwanted movement of sawblade 122.
- handle 3504 and body 3516 define an angle 3512.
- Angle 3512 may be approximately 90 degrees, e.g., ±10 degrees, as depicted herein. However, different angles are also possible, as described with respect to FIG. 35D.
- FIG. 35C and FIG. 35D are schematic representations of a first alternative version of cutting guide 3502, according to techniques of this disclosure.
- FIG. 35C may be substantially similar to FIG. 35A
- FIG. 35D may be substantially similar to FIG. 35B.
- body 3516 and handle 3504 of cutting guide 3502 define an angle 3514 different from angle 3512.
- angle 3514 is approximately 180 degrees.
- the position of handle 3504 relative to body 3516 can allow the user to perform the resection without cutting guide 3502 obstructing the user’s view or otherwise impeding the resection.
- angle 3514 may be between approximately 90 degrees and 180 degrees to prevent obstructing the user’s view or otherwise impeding the resection. In some examples, when angle 3514 is between approximately 90 degrees and 180 degrees, the user may be able to hold cutting guide 3502 and saw 120 at the same time relatively easily. However, in some examples, angle 3514 may be greater than approximately 180 degrees or an angle less than approximately 90 degrees.
- FIG. 35E and FIG. 35F are schematic representations of a second alternative version of a cutting guide, according to techniques of this disclosure.
- the cutting guide of FIG. 35E and FIG. 35F may be similar in shape and function to cutting guide 3502 as described elsewhere in this disclosure.
- a slot 3526 defined in a body 3522 of the cutting guide is open-ended at an end of body 3522 opposite an end of body 3522 to which a handle is attached. Because slot 3526 is open-ended, a user may be able to insert sawblade 122 through the open-ended side of slot 3526 or may insert sawblade 122 along a path directly into slot 3526.
- body 3522 defines a set of pin-guiding holes 3528.
- pins may be inserted through pin-guiding holes 3528 to secure the cutting guide to humerus 112.
- axes 3530 through pin-guiding holes 3528 for different pins may converge to help ensure that the cutting guide is securely attached to humerus 112.
- the cutting guide may have the open-ended slot and not pin-guiding holes 3528 or may have pin-guiding holes 3528 and a closed-ended slot.
- FIG. 35G is a schematic representation of a third alternative version of cutting guide 3502, according to the techniques of this disclosure.
- a tracking marker 3518 is attached to a surface of body 3516.
- body 3516 and tracking marker 3518 are one physical unit, or body 3516 and tracking marker 3518 are separate units assembled together.
- tracking marker 3518 is an optical marker having predefined optical patterns on different faces of a cube.
- tracking marker 3518 may be a cube having different predefined optical patterns on each face.
- tracking marker 3518 has 2-dimensional optical barcodes on different faces.
- tracking marker 3518 has a different polyhedral shape than a cube, such as a dodecahedron, a pyramid, or another polyhedral shape.
- tracking marker 3518 may be an ultrasonic emitter, an electromagnetic marker, a passive optical marker that reflects light, an active optical marker that emits light, and so on.
- tracking marker 3518 comprises a set of objects (e.g., balls, cubes, etc.) having predefined sizes and arranged in a predefined spatial configuration.
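- As a hedged example of how the pose of an optical-pattern marker such as tracking marker 3518 might be recovered (this disclosure does not specify the method), the following sketch applies OpenCV's solvePnP to the four detected corners of one known face; the camera intrinsics, face size, and corner layout are illustrative assumptions.

```python
import numpy as np
import cv2

def face_pose(corner_px, face_size_mm, camera_matrix, dist_coeffs):
    """Estimate marker pose from the four detected 2-D corners of one face."""
    s = face_size_mm / 2.0
    # 3-D corners of the face in the marker's own frame (z = 0 plane).
    obj = np.array([[-s, -s, 0], [s, -s, 0], [s, s, 0], [-s, s, 0]], np.float64)
    ok, rvec, tvec = cv2.solvePnP(obj, corner_px, camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)          # rotation vector -> 3x3 matrix
    return R, tvec.ravel()

# Synthetic check: project a 20 mm face placed 300 mm ahead, then recover it.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
dist = np.zeros(5)
obj = np.array([[-10, -10, 0], [10, -10, 0], [10, 10, 0], [-10, 10, 0]], np.float64)
img, _ = cv2.projectPoints(obj, np.zeros(3), np.array([0.0, 0.0, 300.0]), K, dist)
R, t = face_pose(img.reshape(4, 2), 20.0, K, dist)
print(np.round(t))   # -> approximately [  0.   0. 300.]
```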
- virtual guidance system 320 may use tracking marker 3518 in addition to or instead of tracking marker 124 to identify a position of sawblade 122.
- Tracking marker 3518 is a fixed predefined distance and direction from slot 3506. Hence, by tracking the position of tracking marker 3518 while sawblade 122 is inserted into slot 3506 and by tracking the position of tracking marker 110 attached to humerus 112, virtual guidance system 320 may determine the position of sawblade 122 relative to humerus 112.
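- The relative-pose computation described above could look like the following sketch, assuming poses are represented as 4x4 homogeneous transforms and that the fixed marker-to-slot offset is known from the guide's design; all names and values here are illustrative.

```python
import numpy as np

def pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

def blade_in_humerus_frame(T_cam_bone, T_cam_guide, T_guide_slot):
    """Slot (and hence blade) pose expressed in the bone-marker frame."""
    return np.linalg.inv(T_cam_bone) @ T_cam_guide @ T_guide_slot

I3 = np.eye(3)
T_cam_bone = pose(I3, np.array([0.0, 0.0, 500.0]))      # humerus marker
T_cam_guide = pose(I3, np.array([30.0, 0.0, 500.0]))    # guide marker
T_guide_slot = pose(I3, np.array([0.0, 0.0, -15.0]))    # slot 15 mm from marker
print(blade_in_humerus_frame(T_cam_bone, T_cam_guide, T_guide_slot)[:3, 3])
# -> [ 30.   0. -15.]: slot position relative to the humerus marker
```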
- Virtual guidance system 320 may generate virtual guidance elements (e.g., virtual guidance elements 3200) by tracking the position of tracking marker 3518.
- the use of tracking marker 3518 obviates the need for tracking structure 1500 and calibrating the sawblade relative to tracking marker 124 of saw 120.
- FIG. 36 is a flowchart illustrating an example operation for providing cut guidance, according to techniques of this disclosure.
- the operation of FIG. 36 is consistent with the virtual guidance element shown in FIGS. 32A-32E.
- virtual guidance system 320 determines a current cutting plane of sawblade 122 of saw 120 (3600).
- Virtual guidance system 320 may determine the current cutting plane of sawblade 122 based on signals from tracking system 116 and the position identification data generated as described above with respect to FIG. 20.
- registration system 316 may receive first signals from one or more sensors of tracking system 116.
- Registration system 316 may determine, based on the first signals, first points corresponding to a first tracking marker 110 of a tracking structure 106 that comprises the first tracking marker 110 and an attachment body 108 positioned at a bicipital groove of the humerus. Additionally, registration system 316 may receive second signals from the sensors of tracking system 116. Registration system 316 may determine, based on the second signals, second points corresponding to a second tracking marker 804 of a digitizer 800 while a tip of digitizer 800 palpates the bicipital groove via a slot defined in the attachment body 108 of tracking structure 106. Registration system 316 may determine, based on the second points and a predetermined spatial relationship between the second tracking marker and the tip of the digitizer, third points corresponding to the bicipital groove.
- Positions of the first, second and third points are defined in a physical coordinate system.
- Registration system 316 may generate, based on the third points and a virtual model of the humerus having points defined in the virtual coordinate system, registration data 310 defining a relationship between the physical coordinate system and the virtual coordinate system.
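- The disclosure does not name a registration algorithm; the following sketch shows one standard option (a Kabsch-style least-squares rigid fit) for deriving registration data 310 from palpated physical points and matching virtual-model points, with point correspondences assumed known for brevity.

```python
import numpy as np

def rigid_register(physical_pts, virtual_pts):
    """Return R, t mapping virtual-model coordinates to physical coordinates."""
    cp, cv = physical_pts.mean(axis=0), virtual_pts.mean(axis=0)
    H = (virtual_pts - cv).T @ (physical_pts - cp)     # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))             # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cp - R @ cv

# Toy check: the "palpated" points are the model points shifted by (5, 0, 0).
v = np.array([[0.0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]])
R, t = rigid_register(v + np.array([5.0, 0, 0]), v)
print(np.round(R, 3))   # identity rotation
print(np.round(t, 3))   # [5. 0. 0.]
```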
- Virtual guidance system 320 may receive third signals from the one or more sensors of tracking system 116.
- Virtual guidance system 320 may determine, based on the third signals, fourth points corresponding to a third tracking marker 124 attached to a body of saw 120.
- Virtual guidance system 320 may determine the current cutting plane based on the fourth points. For instance, virtual guidance system 320 may use the position identification data and the fourth points to determine fifth points that specify the current cutting plane in the physical coordinate system.
- virtual guidance system 320 determines the current cutting plane while sawblade 122 of saw 120 is positioned in a slot 3506 defined by a guide device 3502.
- virtual guidance system 320 receives fourth signals from the one or more sensors 118 of tracking system 116.
- Virtual guidance system 320 may determine, based on the fourth signals, fifth points corresponding to a fourth tracking marker 3518 attached to guide device 3502.
- Virtual guidance system 320 may determine the current cutting plane based on the fifth points. For instance, virtual guidance system 320 may determine, based on the fourth signals, a position of tracking marker 3518 relative to tracking marker 110. Because slot 3506 is a predetermined distance from tracking marker 3518 and because sawblade 122 is inserted into slot 3506, virtual guidance system 320 may determine the current cutting plane based on the fifth points.
- Virtual guidance system 320 may output, for display by MR device 104, a virtual guidance element 3200 (3602).
- Virtual guidance element 3200 includes a divided ring element 3202 that includes an enclosed area (e.g., a circle) bisected by a line.
- Virtual guidance element 3200 further includes an active element 3204 having an inner edge and an outer edge.
- a distance between a center of the line and a center of the inner edge of active element 3204 is indicative of a distance between a resection level of the current cutting plane of sawblade 122 into a bone and a resection level of a planned cutting plane through the bone.
- An angle of the inner edge of active element 3204 relative to the line is indicative of a superior/inferior angle between the current cutting plane of sawblade 122 and the planned cutting plane.
- a length of a line 3206 perpendicular to the inner edge of active element 3204 from the center of the inner edge of active element 3204 to the outer edge of active element 3204 is indicative of an anterior/posterior angle between the current cutting plane of sawblade 122 and the planned cutting plane.
- Virtual guidance system 320 may determine, based on registration data 310, the distance between the resection level of the current cutting plane of sawblade 122 into the humerus and the resection level of the planned cutting plane through the bone (e.g., humerus). For example, virtual guidance system 320 may use registration data 310 to convert coordinates of the planned cutting plane which are defined in the virtual coordinate system into the physical coordinate system. In this example, virtual guidance system 320 may compare the converted coordinates to coordinates of the current cutting plane to determine the distance. In another example, virtual guidance system 320 may use registration data 310 to convert coordinates of the current cutting plane, which are defined in the physical coordinate system, into the virtual coordinate system.
- virtual guidance system 320 may compare the converted coordinates to coordinates of the planned cutting plane to determine the distance. In a similar way, virtual guidance system 320 may determine, based on registration data 310, the superior/inferior angle of the current cutting plane of the sawblade and the planned cutting plane. Likewise, virtual guidance system 320 may determine, based on the registration data, the anterior/posterior angle of the current cutting plane of the sawblade and the planned cutting plane.
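- A hedged sketch of the coordinate conversions described above, assuming registration data 310 is stored as a rotation R and translation t that map virtual coordinates to physical coordinates; the storage format is an assumption.

```python
import numpy as np

def virtual_to_physical(R, t, pts):
    """Apply the registration transform to row-vector points."""
    return pts @ R.T + t

def physical_to_virtual(R, t, pts):
    """Invert the rigid registration transform."""
    return (pts - t) @ R

R, t = np.eye(3), np.array([5.0, 0.0, 0.0])
planned_virtual = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0]])
planned_physical = virtual_to_physical(R, t, planned_virtual)
print(planned_physical)                             # [[ 5. 0. 0.] [15. 0. 0.]]
print(physical_to_virtual(R, t, planned_physical))  # round-trips to the input
```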
- FIG. 37 is a flowchart illustrating an example operation for providing cut guidance with a guide instrument, according to techniques of this disclosure.
- registration system 316 performs a registration process (3700).
- registration system 316 may perform the registration process of FIG. 14 to generate registration data 310 that map positions in a virtual coordinate system to positions in a physical coordinate system.
- registration system 316 may generate position identification data that indicate that the lower edge of sawblade 122 is at a specific position in space relative to one or more points on tracking marker 124 attached to saw 120 (3702).
- registration system 316 may perform the process of FIG. 20 to generate position identification data that indicate that the lower edge of sawblade 122 is at a specific position in space relative to one or more points on tracking marker 124 attached to saw 120.
- Virtual guidance system 320 may determine, based on signals from tracking system 116 and the position identification data, a current cut plane of sawblade 122 (3704). For example, virtual guidance system 320 may use the signals from tracking system 116 to determine the current position of tracking marker 124 relative to tracking marker 110 and then use the current position of tracking marker 124 and the position identification data to determine the current cut plane of sawblade 122.
- virtual guidance system 320 may generate virtual elements based on the current cut plane, a planned cut plane, and the registration data (3706). In some examples, virtual guidance system 320 generates the virtual elements based on the current cut plane, a planned cut plane, and the registration data as described above with respect to FIG. 36. The virtual elements guide the user to resect the humeral head of humerus 112. Example virtual elements include those shown in the examples of FIGS. 23-32. Virtual guidance system 320 may cause MR device 104 to display the virtual elements while the tip of sawblade 122 is inserted into a slot in a handheld guide device, such as cutting guide 3502 (3708).
- FIG. 38 is a flowchart illustrating an example operation for providing cut guidance in conjunction with a guide instrument including an optical marker, according to the techniques of this disclosure.
- registration system 316 may receive a first plurality of signals (e.g., video signals) from one or more sensors 118 of tracking system 116 (3802).
- the first plurality of signals may include first signals, and processing system 102 may determine, based on the first signals, first points corresponding to a tracking marker 110 of humeral tracking structure 106.
- Humeral tracking structure 106 includes tracking marker 110 and attachment body 108, which is positioned at a bicipital groove of humerus 112 of a patient.
- the first plurality of signals may additionally include second signals (e.g., later video signals) from sensors 118 of tracking system 116.
- Registration system 316 may determine, based on the second signals, second points corresponding to tracking marker 804 of digitizer 800 while a tip of digitizer 800 palpates the bicipital groove via a slot defined in attachment body 108 of humeral tracking structure 106.
- Registration system 316 may determine, based on the second points and a predetermined spatial relationship between the second tracking marker and the tip of digitizer 800, third points corresponding to the bicipital groove. Positions of the first, second and third points may be defined in a physical coordinate system.
- Processing system 102 may then generate, based on the third points and a virtual model of the humerus having points defined in a virtual coordinate system, registration data defining a relationship between the physical coordinate system and the virtual coordinate system (3804).
- Registration system 316 may receive a second plurality of signals from one or more sensors 118 of tracking system 116 (3806). Processing system 102 may determine, based on the second plurality of signals, points corresponding to an optical marker (e.g., tracking marker 3518) of cutting guide 3502. Based on the points corresponding to tracking marker 3518, processing system 102 can determine position data that specifies a position of the optical marker of cutting guide 3502 in the physical coordinate system (3808). Processing system 102 may then generate, based on the position data and the registration data, virtual guidance for positioning sawblade 122 while the tip of sawblade 122 is inserted through a slot (e.g., slot 3506) of cutting guide 3502 with respect to humerus 112 (3810).
- Processing system 102 may cause MR device 104 to display the virtual guidance (e.g., entry line element 2104) such that the virtual guidance appears to a user to be located on the humerus at locations where a planned cutting plane intersects humerus 112 (3812).
- Clause 1A A tracking structure comprising: an attachment body shaped for attachment at a bicipital groove of a humerus of a patient, the attachment body defining a slot having dimensions sufficient for palpation of the bicipital groove using a handheld digitizer; and a tracking marker connected to the attachment body, the tracking marker having a polyhedral shape that includes a plurality of faces that have different optical patterns.
- Clause 2A The tracking structure of clause 1A, wherein a distal end of the slot is open-ended.
- Clause 3A The tracking structure of any of clauses 1A-2A, wherein the attachment body comprises a first prong and a second prong.
- Clause 4A The tracking structure of clause 3A, wherein: the first prong and the second prong are angled relative to each other; or the first prong and the second prong are parallel and meet at a curved surface.
- Clause 5A The tracking structure of any of clauses 3A-4A, wherein at least one of the first prong or the second prong includes a textured area.
- Clause 6A The tracking structure of any of clauses 1A-5A, wherein the attachment body defines two or more apertures sized to accommodate fixation members for attachment of the attachment body to the humerus.
- Clause 7A The tracking structure of any of clauses 1A-6A, wherein the attachment body and the tracking marker are physically one unit.
- Clause 8A A computer-implemented method comprising: receiving, by a processing system, first signals from one or more sensors of a tracking system; determining, by the processing system, based on the first signals, first points corresponding to a first tracking marker of a tracking structure, the tracking structure comprising the first tracking marker and an attachment body positioned at a bicipital groove of a humerus of a patient; receiving, by the processing system, second signals from the one or more sensors of the tracking system; determining, by the processing system, based on the second signals, second points corresponding to a second tracking marker of a digitizer while a tip of the digitizer palpates the bicipital groove via a slot defined in the attachment body of the tracking structure; determining, by the processing system, based on the second points and a predetermined spatial relationship between the second tracking marker and the tip of the digitizer, third points corresponding to the bicipital groove, wherein positions of the first, second and third points are defined in a physical coordinate system; and generating, by the processing system, based on the third points and a virtual model of the humerus having points defined in a virtual coordinate system, registration data defining a relationship between the physical coordinate system and the virtual coordinate system.
- Clause 9A The computer-implemented method of clause 8A, further comprising causing, by the processing system, a mixed reality (MR) device to display a virtual element based on the registration data.
- Clause 10A The computer-implemented method of any of clauses 8A-9A, wherein the virtual element provides guidance for resecting a portion of the humerus.
- Clause 11A The computer-implemented method of any of clauses 8A-10A, further comprising causing, by the processing system, an MR device to display instructions to palpate the bicipital groove with the digitizer.
- Clause 12A The computer-implemented method of any of clauses 8A-11A, further comprising causing, by the processing system, an MR device to display instructions to position the tracking structure at the bicipital groove of the humerus and to insert fixation elements through apertures of the attachment body into the humerus to attach the tracking structure to the humerus.
- Clause 13A The computer-implemented method of any of clauses 8A-12A, wherein: receiving the first signals comprises receiving, by the processing system, first video signals comprising images of optical patterns on two or more faces of the first tracking marker, wherein each of the faces has a different optical pattern, and determining the positions of the first points comprises determining, by the processing system, based on the images of the optical patterns, positions of vertices of the first tracking marker.
- Clause 14A The computer-implemented method of any of clauses 8A- 13A, wherein: the method further comprises: receiving, by the processing system, third signals from the one or more sensors of the tracking system; determining, based on the third signals, positions of fourth points in the physical coordinate system corresponding to the second tracking marker while the tip of the digitizer palpates a humeral head of the humerus; receiving, by the processing system, fourth signals from the one or more sensors of the tracking system; and determining, based on the fourth signals, positions of fifth points in the physical coordinate system corresponding to the second tracking marker while the tip of the digitizer palpates a metaphysis of the humerus, and generating the registration data comprises generating the registration data based on the third points, the fourth points, the fifth points, and the virtual model of the humerus.
- Clause 15A The computer-implemented method of any of clauses 8A-14A, wherein the tracking system is included in an MR device.
- Clause 16A The computer-implemented method of any of clauses 8A-15A, further comprising: receiving, by the processing system, third signals from the sensors of the tracking system after generating the registration data; determining, by the processing system, based on the registration data and the third signals, a distance between a location on a surface of the humerus and the tip of the digitizer; causing, by the processing system, a mixed reality (MR) device to display a first virtual element at the location on the surface of the humerus and to display a second virtual element indicating the determined distance of a tip of the digitizer to the location on the surface of the humerus; and refining, by the processing system, the registration data based on the determined distance being greater than a threshold while the tip of the digitizer is positioned at the location on the surface of the humerus.
- Clause 17A A non-transitory computer-readable storage medium having instructions stored thereon that, when executed, cause a processing system to perform the methods of any of clauses 8A-16A.
- Clause 18A A system comprising means for performing the methods of any of clauses 8A-16A.
- Clause 1B A computer-implemented method comprising: receiving, by a processing system, first signals from one or more sensors of a tracking system; determining, by the processing system, based on the first signals, first points corresponding to a first tracking marker attached to a body of a saw; determining, by the processing system, based on the first signals, second points corresponding to a second tracking marker of a tracking structure while a sawblade of the saw is positioned in a recess defined by a support body of the tracking structure; and generating, by the processing system, based on the first points and the second points, position identification data that specify a position of a lower edge of the sawblade relative to the first tracking marker.
- Clause 2B The computer-implemented method of clause 1B, further comprising: obtaining, by the processing system, second signals from the one or more sensors of the tracking system; determining, by the processing system, based on the second signals, third points corresponding to the first tracking marker; determining, by the processing system, based on the third points and the position identification data, guidance data that guide the user to position the lower edge of the sawblade at a location on a bone of a patient while cutting the bone; and causing, by the processing system, a mixed reality (MR) device to display the virtual guidance.
- Clause 3B The computer-implemented method of any of clauses 1B-2B, further comprising causing, by the processing system, a mixed reality (MR) device to display instructions to position the sawblade in the recess.
- Clause 4B The computer-implemented method of clause 3B, wherein the instructions instruct a user to position a tip of the sawblade in the recess.
- Clause 5B The computer-implemented method of any of clauses 3B-4B, wherein causing the MR device to display instructions to position the sawblade in the recess comprises displaying the instructions as a virtual representation of one or more surfaces of the support body at a position along a lateral side of the sawblade.
- Clause 6B The computer-implemented method of any of clauses 1B-5B, further comprising: determining, by the processing system, based on the second points, a planar orientation of the tracking structure; and causing, by the processing system, the MR device to display virtual elements that indicate whether the planar orientation of the tracking structure is aligned with a cutting plane of the sawblade.
- Clause 7B A system comprising: a saw comprising a sawblade; a first tracking marker attached to the saw; and a tracking structure that comprises a support body and a second tracking marker, wherein the support body defines a recess to accommodate the sawblade; a processing system comprising one or more processors that are implemented in circuitry and configured to: receive first signals from one or more sensors of a tracking system; determine, based on the first signals, first points corresponding to the first tracking marker attached to a body of the saw; and determine, based on the first signals, second points corresponding to the second tracking marker while the sawblade of the saw is positioned in the recess defined by the support body of the tracking structure; and generate, based on the first points and the second points, position identification data that specify a position of a lower edge of the sawblade relative to the first tracking marker.
- Clause 8B The system of clause 7B, wherein the recess is a slot having a width corresponding to a width of the sawblade.
- Clause 9B The system of any of clauses 7B-8B, wherein the processing system is further configured to: obtain second signals from the one or more sensors of the tracking system; determine, based on the second signals, third points corresponding to the first tracking marker; determine, based on the third points and the position identification data, guidance data that guide the user to position the lower edge of the sawblade at a location on a bone of a patient while cutting the bone; and cause a mixed reality (MR) device to display the virtual guidance.
- Clause 10B The system of any of clauses 7B-9B, wherein the processing system is further configured to cause a mixed reality (MR) device to display instructions to position the sawblade in the recess.
- Clause 11B The system of clause 10B, wherein the instructions instruct a user to position a tip of the sawblade in the recess.
- Clause 12B The system of any of clauses 10B-11B, wherein the processing system is configured to cause the MR device to display the instructions as a virtual representation of one or more surfaces of the support body at a position along a lateral side of the sawblade.
- Clause 13B The system of any of clauses 7B-12B, wherein the processing system is further configured to: determine, based on the second points, a planar orientation of the tracking structure; and cause the MR device to display virtual elements that indicate whether the planar orientation of the tracking structure is aligned with a cutting plane of the sawblade.
- Clause 14B A non-transitory computer-readable storage medium having instructions stored thereon that, when executed, cause a processing system to perform the methods of any of clauses 1B-6B.
- Clause 15B A system comprising means for performing the methods of any of clauses 1B-6B.
- Clause 1C A computer-implemented method comprising: receiving, by a processing system, first signals from one or more sensors of a tracking system; determining, by the processing system, based on the first signals, positions for a plurality of planned plane elements; determining, by the processing system, based on the first signals, positions for a plurality of current plane elements, wherein: each of the planned plane elements indicates a location on a planned cutting plane through a bone of a patient, the planned plane elements are not contiguous with each other; each of the current plane elements corresponds to one of the planned plane elements and indicates a location on a current cutting plane of a sawblade of a saw, and the current plane elements are not contiguous with each other; causing, by the processing system, a mixed reality (MR) device to concurrently display the planned plane elements at the determined positions for the planned plane elements and the current plane elements at the determined positions for the current plane elements, wherein, for each pair of corresponding current plane elements and planned plane elements, at least one of the current plane element or the planned plane element of the pair is displayed with a visual property determined based on a spatial relationship between the current plane element and the planned plane element of the pair.
- Clause 2C The computer-implemented method of clause 1C, further comprising, for each pair of corresponding current plane elements and planned plane elements, determining the visual property based on whether the location indicated by the current plane element of the pair is above or below the planned cutting plane.
- Clause 3C The computer-implemented method of any of clauses 1C-2C, further comprising: causing, by the processing system, the MR device to display a virtual line element such that the virtual line element appears to the user to be located on the bone at locations where the planned cutting plane intersects the bone.
- Clause 4C The computer-implemented method of any of clauses 1C-3C, wherein each of the planned plane elements and/or each of the current plane elements is crescent-shaped.
- Clause 5C The computer-implemented method of any of clauses 1C-4C, further comprising causing, by the processing system, the MR device to update the positions for the current plane elements based on changes to the current cutting plane of the sawblade.
- Clause 6C The computer-implemented method of any of clauses 1C-5C, wherein: the planned cutting plane is defined in a virtual coordinate system, the method further comprises, prior to receiving the first signals: receiving, by the processing system, second signals from the one or more sensors of the tracking system; determining, by the processing system, based on the second signals, first points corresponding to a first tracking marker of a tracking structure that comprises the first tracking marker and an attachment body positioned at a bicipital groove of a humerus of a patient; receiving, by the processing system, third signals from the sensors of the tracking system; determining, by the processing system, based on the third signals, second points corresponding to a second tracking marker of a digitizer while a tip of the digitizer palpates the bicipital groove via a slot defined in the attachment body of the tracking structure; determining, by the processing system, based on the second points and a predetermined spatial relationship between the second tracking marker and the tip of the digitizer, third points corresponding to the bicipital groove, wherein positions of the first, second and third points are defined in a physical coordinate system; and generating, by the processing system, based on the third points and a virtual model of the humerus having points defined in the virtual coordinate system, registration data defining a relationship between the physical coordinate system and the virtual coordinate system.
- Clause 7C The computer-implemented method of any of clauses 1C-6C, wherein: the method further comprises, prior to receiving the first signals: receiving, by the processing system, second signals from the one or more sensors of the tracking system; determining, by the processing system, based on the second signals, first points corresponding to a first tracking marker attached to a body of the saw; and determining, by the processing system, based on the second signals, second points corresponding to a second tracking marker of a tracking structure while the sawblade of the saw is positioned in a recess defined by a support body of the tracking structure, wherein the first points and the second points are defined in a physical coordinate system; and generating, by the processing system, based on the first points and the second points, position identification data that specify a position of a lower edge of the sawblade relative to the first tracking marker, and determining the positions for the current plane elements comprises determining the positions for the current plane elements based on a position of the first tracking marker in the second signals and based on the position identification data.
- Clause 8C A system comprising: a mixed reality (MR) device; and a processing system that includes one or more processors implemented in circuitry, the processing system configured to: receive first signals from one or more sensors of a tracking system; determine, based on the first signals, positions for a plurality of planned plane elements; determine, based on the first signals, positions for a plurality of current plane elements, wherein: each of the planned plane elements indicates a location on a planned cutting plane through a bone of a patient, the planned plane elements are not contiguous with each other; each of the current plane elements corresponds to one of the planned plane elements and indicates a location on a current cutting plane of a sawblade of a saw, and the current plane elements are not contiguous with each other; and cause the mixed reality (MR) device to concurrently display the planned plane elements at the determined positions for the planned plane elements and the current plane elements at the determined positions for the current plane elements, wherein, for each pair of corresponding current plane elements and planned plane elements, at least one of the current plane element of the pair or the planned plane element of the pair has a visual property that is based on the position for the current plane element of the pair relative to the planned cutting plane.
- Clause 9C The system of clause 8C, wherein the processing system is further configured to, for each pair of corresponding current plane elements and planned plane elements, determine the visual property based on whether the location indicated by the current plane element of the pair is above or below the planned cutting plane.
- Clause 10C The system of any of clauses 8C-9C, wherein the processing system is further configured to cause the MR device to display a virtual line element such that the virtual line element appears to the user to be located on the bone at locations where the planned cutting plane intersects the bone.
- Clause 11C The system of any of clauses 8C-10C, wherein each of the planned plane elements and/or each of the current plane elements is crescent-shaped.
- Clause 12C The system of any of clauses 8C-11C, wherein the processing system is further configured to cause the MR device to update the positions for the current plane elements based on changes to the current cutting plane of the sawblade.
- Clause 14C The system of any of clauses 8C-13C, wherein the processing system is further configured to, prior to receiving the first signals: receive second signals from the one or more sensors of the tracking system; determine, based on the second signals, first points corresponding to a first tracking marker attached to a body of the saw; and determine, based on the second signals, second points corresponding to a second tracking marker of a tracking structure while the sawblade of the saw is positioned in a recess defined by a support body of the tracking structure, wherein the first points and the second points are defined in a physical coordinate system; generate, based on the first points and the second points, position identification data that specify a position of a lower edge of the sawblade relative to the first tracking marker; and determine the positions for the current plane elements based on a position of the first tracking marker in the second signals and based on the position identification data.
- Clause 15C The system of any of clauses 8C-14C, wherein one or more of the processors of the processing
- Clause 16C A non-transitory computer-readable storage medium having instructions stored thereon that, when executed, cause a processing system to perform the methods of any of clauses 1C-7C.
- Clause 17C A system comprising means for performing the methods of any of clauses 1C-7C.
- Clause 1D A computer-implemented method comprising: determining, by a processing system that includes one or more processors implemented in circuitry, a current cutting plane of a sawblade of a saw; and outputting, by the processing system, for display by a mixed reality (MR) device, a virtual guidance element that includes: a divided ring element that includes an enclosed area circle bisected by a line; and an active element having an inner edge and an outer edge, wherein: a distance between a center of the line and a center of the inner edge of the active element is indicative of a distance between a resection level of the current cutting plane of the sawblade into a bone and a resection level of a planned cutting plane through the bone, an angle of the inner edge of the active element relative to the line is indicative of a superior/inferior angle of the current cutting plane of the sawblade and the planned cutting plane, and a length of a line perpendicular to the inner edge of the active element from the center of the inner edge of the active element to the outer edge of the active element is indicative of an anterior/posterior angle of the current cutting plane of the sawblade and the planned cutting plane.
- Clause 2D The computer-implemented method of clause 1D, further comprising causing, by the processing system, the MR device to update the active element based on a change to the current cutting plane of the sawblade.
- Clause 3D The computer-implemented method of any of clauses 1D-2D, wherein outputting the virtual guidance element comprises outputting, by the processing system, for display by the MR device, the virtual guidance element so that the line of the divided ring element is aligned with the planned cutting plane.
- Clause 4D The computer-implemented method of any of clauses 1D-3D, wherein outputting the virtual guidance element comprises outputting, by the processing system, for display by the MR device, the virtual guidance element so that the virtual guidance element and the bone are simultaneously visible to a user of the MR device.
- Clause 6D The computer-implemented method of any of clauses 1D-5D, wherein determining the current cutting plane comprises determining the current cutting plane while the sawblade of the saw is positioned in a slot defined by a guide device.
- Clause 7D The computer-implemented method of clause 6D, wherein the guide device comprises a fourth tracking marker, the method further comprising: receiving, by the processing system, fourth signals from the one or more sensors of the tracking system; determining, by the processing system, based on the fourth signals, fifth points corresponding to the fourth tracking marker attached to the guide device; and determining, by the processing system, the current cutting plane based on the fifth points.
- Clause 8D A system comprising one or more processors implemented in circuitry, wherein the one or more processors are configured to: determine a current cutting plane of a sawblade of a saw; and output, for display by a mixed reality (MR) device, a virtual guidance element that includes: a divided ring element that includes an enclosed area circle bisected by a line; and an active element having an inner edge and an outer edge, wherein: a distance between a center of the line and a center of the inner edge of the active element is indicative of a distance between a resection level of the current cutting plane of the sawblade into a bone and a resection level of a planned cutting plane through the bone, an angle of the inner edge of the active element relative to the line is indicative of a superior/inferior angle of the current cutting plane of the sawblade and the planned cutting plane, and a length of a line perpendicular to the inner edge of the active element from the center of the inner edge of the active element to the outer edge of the active element is indicative of an anterior/posterior angle of the current cutting plane of the sawblade and the planned cutting plane.
- Clause 9D The system of clause 8D, wherein the one or more processors are further configured to cause the MR device to update the active element based on a change to the current cutting plane of the sawblade.
- Clause 10D The system of any of clauses 8D-9D, wherein to output the virtual guidance element, the one or more processors are configured to output, for display by the MR device, the virtual guidance element so that the line of the divided ring element is aligned with the planned cutting plane.
- Clause 11D The system of any of clauses 8D-10D, wherein to output the virtual guidance element, the one or more processors are configured to output, for display by the MR device, the virtual guidance element so that the virtual guidance element and the bone are simultaneously visible to a user of the MR device.
- Clause 13D The system of any of clauses 8D-12D, wherein the processing system is configured to determine the current cutting plane while the sawblade of the saw is positioned in a slot defined by a guide device.
- Clause 14D The system of clause 13D, wherein the processing circuitry is further configured to: receive fourth signals from the one or more sensors of the tracking system; determine, based on the fourth signals, fifth points corresponding to a fourth tracking marker attached to the guide device; and determine the current cutting plane based on the fifth points.
- Clause 15D A non-transitory computer-readable storage medium comprising instructions stored thereon that, when executed by one or more processors of a processing system, cause the processing system to perform the methods of any of clauses 1D-7D.
- Clause 1E A computer-implemented method comprising: receiving, by a processing system, a first plurality of signals from one or more sensors of a tracking system; determining, by the processing system and based on a virtual model of a humerus of a patient and points determined based on the first plurality of signals, registration data defining a relationship between a physical coordinate system and a virtual coordinate system; receiving, by the processing system, a second plurality of signals from the one or more sensors of the tracking system; determining, by the processing system and based on points determined based on the second plurality of signals, position identification data that specify a position of a lower edge of a sawblade of a saw relative to a tracking marker attached to the saw; determining, by the processing system and based on the position identification data, a position of the sawblade in the physical coordinate system; generating, by the processing system and while a tip of the sawblade is inserted into a slot of a guide device configured to maintain a position of the sawblade relative to the bone
- Clause 2E The computer-implemented method of clause 1E, the method further comprising: causing, by the processing system, the MR device to display instructions to insert the tip of the sawblade into the slot of the guide device.
- Clause 7E The computer-implemented method of any of clauses 1E-6E, wherein the guide device comprises a handle and a body, the body defining: a slot extending through the body and configured to accommodate the tip of the sawblade; and a bone contacting surface, wherein the bone contacting surface comprises a plurality of bump-like projections extending away from the bone contacting surface.
- Clause 8E The computer-implemented method of any of clauses 1E-7E, wherein the handle and the body of the guide device form an angle between about 90 degrees and 180 degrees.
- Clause 9E The computer-implemented method of any of clauses 1E-8E, wherein the virtual guidance comprises a virtual line element, and wherein the virtual line element appears to a user to be located on the humerus at locations where a planned cutting plane intersects the humerus of the patient.
- Clause 10E A system comprising: a mixed reality (MR) device; and a processing system that includes one or more processors implemented in circuitry, the processing system configured to: receive a first plurality of signals from one or more sensors of a tracking system; determine, based on a virtual model of a humerus of a patient and points determined based on the first plurality of signals, registration data defining a relationship between a physical coordinate system and a virtual coordinate system; receive a second plurality of signals from the one or more sensors of the tracking system; determine, based on points determined based on the second plurality of signals, position identification data that specify a position of a lower edge of a sawblade of a saw; determine, based on the registration data and the position identification data, a position of the sawblade of the saw relative to the humerus of the patient; and generate, while a tip of the sawblade is inserted into a slot of a guide device configured to maintain a position of the sawblade relative to the bone during
- Clause 15E The system of any of clauses 10E-12E, wherein the points determined based on the second plurality of signals comprise points corresponding to a tracking marker attached to a body of the guide device.
- Clause 16E The system of any of clauses 10E-15E, wherein the guide device comprises a handle and a body, the body defining: a slot extending through the body and configured to accept the tip of the sawblade; and a bone contacting surface, wherein the bone contacting surface comprises a plurality of bump-like projections extending away from the bone contacting surface.
- Clause 17E The system of any of clauses 10E-16E, wherein the handle and the body of the guide device form an angle between 90 degrees and 180 degrees.
- Clause 18E The system of any of clauses 10E-17E, wherein the virtual guidance comprises a virtual line element, and wherein the virtual line element appears to a user to be located on the humerus at locations where a planned cutting plane intersects the humerus of the patient.
- Clause 19E A non-transitory computer-readable storage medium comprising instructions stored thereon that, when executed by one or more processors of a processing system, cause the processing system to: receive a first plurality of signals from one or more sensors of a tracking system; determine, based on a virtual model of a humerus of a patient and points determined based on the first plurality of signals, registration data defining a relationship between a physical coordinate system and a virtual coordinate system; receive a second plurality of signals from the one or more sensors of the tracking system; determine, based on points determined based on the second plurality of signals, position identification data that specify a position of a lower edge of a sawblade of a saw; determine, based on the registration data and the position identification data, a position of the sawblade of the saw relative to the humerus of the patient; and generate, while a tip of the sawblade is inserted into a slot of a guide device configured to maintain a position of the sawblade relative to the bone
- Clause 20E A surgical cut guide comprising: a handle; and a guide portion comprising: a bone interface portion configured to contact a bone of a patient, wherein the bone interface portion comprises a plurality of bump-like projections; and a slot configured to accommodate a sawblade, wherein the slot is sized so as to stabilize the sawblade during a cutting procedure.
- Clause 21E The surgical cut guide of clause 20E, wherein the surgical cut guide further comprises a tracking marker, the tracking marker having a polyhedral shape that includes a plurality of faces that have different optical patterns.
- Computer-readable media may include computer-readable storage media, which corresponds to a tangible medium such as data storage media, or communication media including any medium that facilitates transfer of a computer program from one place to another, e.g., according to a communication protocol.
- Computer-readable media generally may correspond to (1) tangible computer-readable storage media which is non-transitory or (2) a communication medium such as a signal or carrier wave.
- Data storage media may be any available media that can be accessed by one or more computers or one or more processors to retrieve instructions, code and/or data structures for implementation of the techniques described in this disclosure.
- A computer program product may include a computer-readable medium.
- Such computer-readable storage media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage, or other magnetic storage devices, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium.
- Coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave are included in the definition of medium.
- Computer-readable storage media and data storage media do not include connections, carrier waves, signals, or other transitory media, but are instead directed to non-transitory, tangible storage media.
- Disk and disc include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
- Processors may be implemented as fixed-function processing circuits, programmable circuits, or combinations thereof, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry.
- Fixed-function circuits refer to circuits that provide particular functionality and are preset on the operations that can be performed.
- Programmable circuits refer to circuits that can be programmed to perform various tasks and provide flexible functionality in the operations that can be performed. For instance, programmable circuits may execute instructions specified by software or firmware that cause the programmable circuits to operate in the manner defined by instructions of the software or firmware.
- Fixed-function circuits may execute software instructions (e.g., to receive parameters or output parameters), but the types of operations that the fixed-function circuits perform are generally immutable. Accordingly, the terms “processor” and “processing circuitry,” as used herein, may refer to any of the foregoing structures or any other structure suitable for implementation of the techniques described herein.
Landscapes
- Health & Medical Sciences (AREA)
- Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Animal Behavior & Ethology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Veterinary Medicine (AREA)
- Public Health (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Pathology (AREA)
- Robotics (AREA)
- Human Computer Interaction (AREA)
- Dentistry (AREA)
- Surgical Instruments (AREA)
Abstract
A tracking structure comprises an attachment body shaped for attachment at a bicipital groove of a humerus of a patient, the attachment body defining a slot having dimensions sufficient for palpation of the bicipital groove using a handheld digitizer. The tracking structure further includes a tracking marker connected to the attachment body, the tracking marker having a polyhedral shape that includes a plurality of faces that have different optical patterns.
Description
HUMERAL MARKER FOR MIXED REALITY SURGICAL NAVIGATION
[0001] This application claims priority to U.S. provisional patent application 63/510,299, filed June 26, 2023, U.S. provisional patent application 63/510,316, filed June 26, 2023, U.S. provisional patent application 63/510,325, filed June 26, 2023, and U.S. provisional patent application 63/563,002, filed March 8, 2024, the entire content of each of which is incorporated herein by reference.
BACKGROUND
[0002] Shoulder arthroplasty is a form of orthopedic surgery in which one or more prostheses are implanted on a patient’s scapula and humerus. In an anatomic shoulder arthroplasty, the humerus prosthesis has a ball-shaped surface that mates with a socket-shaped surface of the scapular implant. In a reverse shoulder arthroplasty, the scapular prosthesis has a ball-shaped surface that mates with a socket-shaped surface of a humeral implant. In preparation for implanting the humeral prosthesis, a surgeon may resect the patient’s humeral head. Resecting the humeral head along an appropriate plane may be a significant factor in the success of the surgery.
SUMMARY
[0003] This disclosure describes techniques for providing mixed reality (MR) surgical navigation for shoulder arthroplasty. As described in this disclosure, a tracking marker is affixed at a bicipital groove of a patient’s humerus. The tracking marker includes elements that enable a tracking system to determine a position and orientation of the tracking marker. Because the tracking marker is affixed to the patient’s humerus, by determining the position and orientation of the tracking marker, a processing system may determine the position and orientation of the patient’s humerus in a physical coordinate system. The processing system may perform a humeral registration process to generate registration data that defines a relationship between a virtual coordinate system and the physical coordinate system. Aspects of a surgical plan, such as a cut plane, may be defined in the virtual coordinate system.
[0004] The processing system may cause an MR device to use the registration data to present virtual guidance. For example, the processing system may cause the MR device to display planned plane elements and current plane elements. The planned plane elements indicate locations on a planned cutting plane through a bone. The planned plane elements are not contiguous with each other. The current plane elements indicate locations on a current cutting
plane of a sawblade of a saw. The current plane elements are not contiguous with each other. The processing system may cause the MR device to concurrently display the planned plane elements at their determined positions and the current plane elements at their determined positions. For each pair of corresponding current plane elements and planned plane elements, at least one of the current or planned plane element of the pair has a visual property (e.g., color, texture, etc.) based on the position for the current plane element of the pair relative to the planned cutting plane. Displaying the current plane elements and the planned plane elements in this way may guide a user (e.g., a clinician) to cut the bone at an appropriate position and angle, without unduly obscuring the user’s vision of the bone.
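As an illustration (not part of the disclosed subject matter), the following Python sketch computes the signed distance from a current plane element to the planned cutting plane and derives a display color from it. The 1 mm tolerance and the color scheme are assumptions introduced for the example.

```python
import numpy as np

def signed_distance(point, plane_point, plane_normal):
    """Signed distance from a point to the planned cutting plane.

    Positive values lie on the side the unit normal points toward
    (e.g., "above" the plane); negative values lie on the other side.
    """
    n = plane_normal / np.linalg.norm(plane_normal)
    return float(np.dot(point - plane_point, n))

def element_color(current_point, plane_point, plane_normal, tol_mm=1.0):
    """Choose a display color for one current plane element.

    The tolerance and green/yellow/red scheme are illustrative
    assumptions, not values taken from the disclosure.
    """
    d = signed_distance(current_point, plane_point, plane_normal)
    if abs(d) <= tol_mm:
        return "green"  # on the planned plane, within tolerance
    return "yellow" if d > 0 else "red"  # above vs. below the plane
```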
[0005] The virtual guidance can also include a virtual guidance element. For example, the processing system can cause the MR device to display a virtual guidance element. The virtual guidance element can include a divided ring element including an enclosed area circle bisected by a line. The virtual guidance element can also include an active element with an inner edge and an outer edge. The distance between a center of the line of the divided ring element and a center of the inner edge of the active element is indicative of a distance between a resection level of a current cutting plane of the sawblade into the bone and the resection level of the planned cutting plane through the bone. The angle of the inner edge of the active element relative to the line is indicative of a superior/inferior angle of the current cutting plane of the sawblade and the planned cutting plane. A length of a line perpendicular to the inner edge of the active element from the center of the inner edge of the active element to the outer edge of the active element is indicative of an anterior/posterior angle of the current cutting plane of the sawblade and the planned cutting plane. Displaying the virtual guidance element may guide the clinician to cut the bone at an appropriate position and angle.
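The three quantities the virtual guidance element encodes (resection-level offset, superior/inferior angle, anterior/posterior angle) can be derived from the current and planned plane geometry. The sketch below shows one plausible derivation, assuming unit anatomical axis vectors (si_axis, ap_axis) taken from the registered virtual model; the disclosure does not prescribe this particular decomposition, and the projection step assumes neither plane normal is nearly parallel to the axis being used.

```python
import numpy as np

def _unit(v):
    return v / np.linalg.norm(v)

def guidance_metrics(current_normal, current_point,
                     planned_normal, planned_point,
                     si_axis, ap_axis):
    """Quantities encoded by the virtual guidance element.

    Returns (level_offset, si_angle_deg, ap_angle_deg). The
    superior/inferior tilt is treated here as rotation about the
    anterior/posterior axis and vice versa; this convention is an
    assumption for the sketch.
    """
    n_cur, n_plan = _unit(current_normal), _unit(planned_normal)

    # Resection-level offset: displacement of the current plane from
    # the planned plane, measured along the planned plane's normal.
    level_offset = float(np.dot(current_point - planned_point, n_plan))

    def tilt_about(axis):
        # Project both normals into the plane orthogonal to `axis`
        # and measure the signed angle between the projections.
        p_cur = _unit(n_cur - np.dot(n_cur, axis) * axis)
        p_plan = _unit(n_plan - np.dot(n_plan, axis) * axis)
        s = float(np.dot(np.cross(p_plan, p_cur), axis))
        c = float(np.dot(p_plan, p_cur))
        return float(np.degrees(np.arctan2(s, c)))

    return level_offset, tilt_about(ap_axis), tilt_about(si_axis)
```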
[0006] This disclosure also describes techniques for identifying a position of a sawblade for use in providing MR guidance. For example, the processing system may determine, based on signals from sensors of a tracking system, first points corresponding to a first tracking marker attached to a body of a saw. The processing system may determine, based on the signals, second points corresponding to a second tracking marker of a tracking structure while a sawblade of the saw is positioned in a recess defined by a support body of the tracking structure. The processing system may generate, based on the first points and the second points, position identification data that specify a position of a lower edge of the sawblade relative to the first tracking marker. Identifying the position of the sawblade in this way may help to ensure that virtual MR guidance for using the sawblade is displayed at an appropriate location.
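Generating the position identification data amounts to expressing known points of the tracking structure in the saw marker's frame while the blade rests in the recess. The sketch below illustrates the relative-pose arithmetic, under the assumption that the lower-edge points are known in the tracking structure's frame from its manufactured geometry; function and variable names are hypothetical.

```python
import numpy as np

def edge_in_saw_marker_frame(R_saw, t_saw, R_struct, t_struct,
                             edge_pts_struct):
    """Express sawblade lower-edge points in the saw marker's frame.

    R_saw/t_saw and R_struct/t_struct are tracked marker poses
    (rotation matrix, translation vector) in the physical coordinate
    system, captured while the blade rests in the structure's recess.
    edge_pts_struct is an (N, 3) array of lower-edge points in the
    structure's own frame (assumed known from its geometry).
    """
    # Structure frame -> physical frame (row-vector convention).
    pts_world = edge_pts_struct @ R_struct.T + t_struct
    # Physical frame -> saw-marker frame. This fixed result is the
    # "position identification data": it stays valid as the saw moves.
    return (pts_world - t_saw) @ R_saw
```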
[0007] In one example, this disclosure describes a tracking structure comprising: an attachment body shaped for attachment at a bicipital groove of a humerus of a patient, the attachment body defining a slot having dimensions sufficient for palpation of the bicipital groove using a handheld digitizer; and a tracking marker connected to the attachment body, the tracking marker having a polyhedral shape that includes a plurality of faces that have different optical patterns.
[0008] In another example, this disclosure describes a method comprising: receiving, by a processing system, first signals from one or more sensors of a tracking system; determining, by the processing system, based on the first signals, first points corresponding to a first tracking marker of a tracking structure, the tracking structure comprising the first tracking marker and an attachment body positioned at a bicipital groove of a humerus of a patient; receiving, by the processing system, second signals from the one or more sensors of the tracking system; determining, by the processing system, based on the second signals, second points corresponding to a second tracking marker of a digitizer while a tip of the digitizer palpates the bicipital groove via a slot defined in the attachment body of the tracking structure; determining, by the processing system, based on the second points and a predetermined spatial relationship between the second tracking marker and the tip of the digitizer, third points corresponding to the bicipital groove, wherein positions of the first, second and third points are defined in a physical coordinate system; and generating, by the processing system, based on the third points and a virtual model of the humerus having points defined in a virtual coordinate system, registration data defining a relationship between the physical coordinate system and the virtual coordinate system.
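The "predetermined spatial relationship" step reduces to applying a fixed tip offset to the digitizer marker's tracked pose. A minimal sketch, assuming the offset is a calibrated constant expressed in the marker's own frame (the 0.10 m value is only a placeholder):

```python
import numpy as np

# Tip offset in the digitizer marker's own frame. The value and axis
# are placeholders; a real system uses a per-instrument calibration.
TIP_OFFSET = np.array([0.0, 0.0, -0.10])  # meters

def tip_position(R_marker, t_marker):
    """Digitizer tip location in the physical coordinate system,
    given the tracked pose (rotation matrix, translation vector)
    of the digitizer's marker."""
    return R_marker @ TIP_OFFSET + t_marker
```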
[0009] In another example, this disclosure describes a method comprising: receiving, by a processing system, first signals from one or more sensors of a tracking system; determining, by the processing system, based on the first signals, first points corresponding to a first tracking marker attached to a body of a saw; and determining, by the processing system, based on the first signals, second points corresponding to a second tracking marker of a tracking structure while a sawblade of the saw is positioned in a recess defined by a support body of the tracking structure; and generating, by the processing system, based on the first points and the second points, position identification data that specify a position of a lower edge of the sawblade relative to the first tracking marker.
[0010] In another example, this disclosure describes a system comprising: a saw comprising a sawblade; a first tracking marker attached to the saw; and a tracking structure that comprises a support body and a second tracking marker, wherein the support body defines a recess to
accommodate the sawblade; a processing system comprising one or more processors that are implemented in circuitry and configured to: receive first signals from one or more sensors of a tracking system; determine, based on the first signals, first points corresponding to the first tracking marker attached to a body of the saw; and determine, based on the first signals, second points corresponding to the second tracking marker while the sawblade of the saw is positioned in the recess defined by the support body of the tracking structure; and generate, based on the first points and the second points, position identification data that specify a position of a lower edge of the sawblade relative to the first tracking marker.
[0011] In another example, this disclosure describes a method comprising: receiving, by a processing system, first signals from one or more sensors of a tracking system; determining, by the processing system, based on the first signals, positions for a plurality of planned plane elements; determining, by the processing system, based on the first signals, positions for a plurality of current plane elements, wherein: each of the planned plane elements indicates a location on a planned cutting plane through a bone of a patient, the planned plane elements are not contiguous with each other; each of the current plane elements corresponds to one of the planned plane elements and indicates a location on a current cutting plane of a sawblade of a saw, and the current plane elements are not contiguous with each other; causing, by the processing system, a mixed reality (MR) device to concurrently display the planned plane elements at the determined positions for the planned plane elements and the current plane elements at the determined positions for the current plane elements, wherein, for each pair of corresponding current plane elements and planned plane elements, at least one of the current plane element of the pair or the planned plane element of the pair has a visual property that is based on the position for the current plane element of the pair relative to the planned cutting plane.
[0012] In another example, this disclosure describes a system comprising: a mixed reality (MR) device; and a processing system that includes one or more processors implemented in circuitry, the processing system configured to: receive first signals from one or more sensors of a tracking system; determine, based on the first signals, positions for a plurality of planned plane elements; determine, based on the first signals, positions for a plurality of current plane elements, wherein: each of the planned plane elements indicates a location on a planned cutting plane through a bone of a patient, the planned plane elements are not contiguous with each other; each of the current plane elements corresponds to one of the planned plane elements and indicates a location on a current cutting plane of a sawblade of a saw, and the current plane elements are not contiguous with each other; and cause the mixed reality (MR) device to
concurrently display the planned plane elements at the determined positions for the planned plane elements and the current plane elements at the determined positions for the current plane elements, wherein, for each pair of corresponding current plane elements and planned plane elements, at least one of the current plane element of the pair or the planned plane element of the pair has a visual property that is based on the position for the current plane element of the pair relative to the planned cutting plane.
[0013] In another example, this disclosure describes a computer-implemented method comprising: determining, by a processing system that includes one or more processors implemented in circuitry, a current cutting plane of a sawblade of a saw; and outputting, by the processing system, for display by a mixed reality (MR) device, a virtual guidance element that includes: a divided ring element that includes an enclosed area circle bisected by a line; and an active element having an inner edge and an outer edge, wherein: a distance between a center of the line and a center of the inner edge of the active element is indicative of a distance between a resection level of the current cutting plane of the sawblade into a bone and a resection level of a planned cutting plane through the bone, an angle of the inner edge of the active element relative to the line is indicative of a superior/inferior angle of the current cutting plane of the sawblade and the planned cutting plane, and a length of a line perpendicular to the inner edge of the active element from the center of the inner edge of the active element to the outer edge of the active element is indicative of an anterior/posterior angle of the current cutting plane of the sawblade and the planned cutting plane.
[0014] In another example, this disclosure describes a system comprising one or more processors implemented in circuitry, wherein one or more processors are configured to: determine a current cutting plane of a sawblade of a saw; and output for display by a mixed reality (MR) device, a virtual guidance element that includes: a divided ring element that includes an enclosed area circle bisected by a line; and an active element having an inner edge and an outer edge, wherein: a distance between a center of the line and a center of the inner edge of the active element is indicative of a distance between a resection level of the current cutting plane of the sawblade into a bone and a resection level of a planned cutting plane through the bone, an angle of the inner edge of the active element relative to the line is indicative of a superior/inferior angle of the current cutting plane of the sawblade and the planned cutting plane, and a length of a line perpendicular to the inner edge of the active element from the center of the inner edge of the active element to the outer edge of the active element is indicative of an anterior/posterior angle of the current cutting plane of the sawblade and the planned cutting plane.
[0015] The details of various examples of the disclosure are set forth in the accompanying drawings and the description below. Various features, objects, and advantages will be apparent from the description, drawings, and claims.
BRIEF DESCRIPTION OF DRAWINGS
[0016] FIG. 1 is a conceptual diagram illustrating an example system in which one or more techniques of this disclosure may be performed.
[0017] FIG. 2 is a flowchart illustrating an example process for mixed reality (MR)-based navigation for surgical tasks associated with a humerus of a patient according to techniques of this disclosure.
[0018] FIG. 3 is a block diagram illustrating an example computing system in accordance with one or more techniques of this disclosure.
[0019] FIG. 4 is a schematic representation of a mixed reality (MR) device in accordance with one or more techniques of this disclosure.
[0020] FIG. 5A and FIG. 5B are schematic representations of an example attachment body of a humeral tracking structure according to techniques of this disclosure.
[0021] FIG. 6A and FIG. 6B are schematic representations of an example attachment body of a humeral tracking structure according to techniques of this disclosure.
[0022] FIG. 7A and FIG. 7B are schematic representations of an example attachment body of a humeral tracking structure according to techniques of this disclosure.
[0023] FIG. 7C and FIG. 7D are schematic representations of a first alternative version of the attachment body, according to techniques of this disclosure.
[0024] FIG. 7E and FIG. 7F are schematic representations of a second alternative version of the attachment body, according to techniques of this disclosure.
[0025] FIG. 7G and FIG. 7H are schematic representations of a third alternative version of the attachment body, according to techniques of this disclosure.
[0026] FIG. 7I and FIG. 7J are schematic representations of a fourth alternative version of the attachment body, according to techniques of this disclosure.
[0027] FIG. 7K, FIG. 7L, and FIG. 7M are schematic representations of a fifth alternative version of the attachment body, according to techniques of this disclosure.
[0028] FIG. 8 is a conceptual diagram illustrating an example view of registering a humeral head according to techniques of this disclosure.
[0029] FIG. 9 is a conceptual diagram illustrating an example view of registering a humeral metaphysis according to techniques of this disclosure.
[0030] FIG. 10 is a conceptual diagram illustrating an example view of registering a bicipital groove according to techniques of this disclosure.
[0031] FIG. 11 is a conceptual diagram illustrating validation of registration of a humeral head according to techniques of this disclosure.
[0032] FIG. 12 is a conceptual diagram illustrating validation of registration of a humeral head according to techniques of this disclosure.
[0033] FIG. 13 is a conceptual diagram illustrating validation of registration of a humeral metaphysis according to techniques of this disclosure.
[0034] FIG. 14 is a flowchart illustrating an example registration operation according to techniques of this disclosure.
[0035] FIG. 15 is a conceptual diagram illustrating an example of identifying a position of a saw blade according to techniques of this disclosure.
[0036] FIG. 16 is a conceptual diagram illustrating an example of identifying a position of a saw blade according to techniques of this disclosure.
[0037] FIG. 17A is a conceptual diagram illustrating a profile view of an example tracking structure according to techniques of this disclosure.
[0038] FIG. 17B is a conceptual diagram illustrating a profile view of an example tracking structure according to techniques of this disclosure.
[0039] FIG. 18 is a conceptual diagram illustrating an example of confirming an orientation of a sawblade according to techniques of this disclosure.
[0040] FIG. 19A is a conceptual diagram illustrating an example of identifying a position of a sawblade according to techniques of this disclosure.
[0041] FIG. 19B is a conceptual diagram illustrating an example tracking structure for identifying the position of a sawblade according to techniques of this disclosure.
[0042] FIG. 19C is a conceptual diagram illustrating a bottom view of the tracking structure of FIG. 19B.
[0043] FIG. 19D is a conceptual diagram illustrating an alternative tracking structure for identifying the position of a sawblade according to techniques of the disclosure.
[0044] FIG. 19E is a conceptual diagram illustrating a second alternative tracking structure for identifying the position of a sawblade according to techniques of the disclosure.
[0045] FIG. 20 is a flowchart illustrating an example operation for identifying a position of a sawblade for MR-based guidance according to techniques of this disclosure.
[0046] FIG. 21 is a conceptual diagram illustrating example virtual elements for guiding resection of a humeral head according to techniques of this disclosure.
[0047] FIG. 22 is a flowchart illustrating an example operation for providing cut guidance according to techniques of this disclosure.
[0048] FIG. 23 is a conceptual diagram illustrating an example ring-shaped virtual element for guiding resection of a humeral head according to techniques of this disclosure.
[0049] FIG. 24 is a conceptual diagram illustrating example ring-shaped virtual elements for guiding resection of a humeral head according to techniques of this disclosure.
[0050] FIG. 25 is a conceptual diagram illustrating example planned plane virtual elements and current plane virtual elements for guiding resection of a humeral head according to techniques of this disclosure.
[0051] FIG. 26 is a conceptual diagram illustrating example virtual elements for guiding resection of a humeral head according to techniques of this disclosure.
[0052] FIG. 27 is a conceptual diagram illustrating first example elevation and angular virtual elements for guiding resection of a humeral head according to techniques of this disclosure.
[0053] FIG. 28 is a conceptual diagram illustrating second example elevation and angular virtual elements for guiding resection of a humeral head according to techniques of this disclosure.
[0054] FIG. 29 is a conceptual diagram illustrating example elevation and angle virtual elements for guiding resection of a humeral head according to techniques of this disclosure.
[0055] FIG. 30 is a conceptual diagram illustrating an example elevation element and bubble-level virtual element for guiding resection of a humeral head according to techniques of this disclosure.
[0056] FIG. 31 is a conceptual diagram illustrating an example of confirming accuracy of a humeral resection according to techniques of this disclosure.
[0057] FIGS. 32A-32E are conceptual diagrams illustrating example guidance displayed by a virtual guidance element for positioning a sawblade at a correct position and orientation, in accordance with techniques of this disclosure.
[0058] FIG. 33 is a conceptual diagram illustrating an example MR visualization that includes a virtual guidance element in accordance with techniques of this disclosure.
[0059] FIG. 34 is a conceptual diagram illustrating an example MR visualization that includes an outline of a bone according to techniques of this disclosure.
[0060] FIG. 35A and FIG. 35B are schematic representations of an example cutting guide, according to the techniques of this disclosure.
[0061] FIG. 35C and FIG. 35D are schematic representations of a first alternative version of the cutting guide, according to techniques of this disclosure.
[0062] FIG. 35E and FIG. 35F are schematic representations of a second alternative version of the cutting guide, according to techniques of this disclosure.
[0063] FIG. 35G is a schematic representation of a third alternative version of the cutting guide, according to the techniques of this disclosure.
[0064] FIG. 36 is a flowchart illustrating an example operation for providing cut guidance according to techniques of this disclosure.
[0065] FIG. 37 is a flowchart illustrating an example operation for providing cut guidance in conjunction with a guide instrument, according to techniques of this disclosure.
[0066] FIG. 38 is a flowchart illustrating an example operation for providing cut guidance in conjunction with a guide instrument including an optical marker, according to the techniques of this disclosure.
DETAILED DESCRIPTION
[0067] During a shoulder arthroplasty, a surgeon may use an oscillating saw to resect the head of a patient’s humerus. Resecting the head of the patient’s humerus allows the surgeon to insert a stem of a humeral implant into the intramedullary canal of the humerus. It is important for the surgeon to resect the humeral head at the correct position and angle. Resecting the humeral head at the wrong position or the wrong angle may lead to restricted patient range of motion, bone fractures, failures of the humeral implant, poor seating of the humeral implant, or other complications.
[0068] To help ensure that the humeral head is resected at the correct position and angle, it is common for surgeons to use physical humeral cut guides. A physical humeral cut guide is a physical object that fits over the humeral head. The physical humeral cut guide defines a slot through which the surgeon can insert the sawblade of an oscillating saw. When the physical humeral cut guide is properly positioned over the humeral head, the slot defined by the physical humeral cut guide is aligned with a planned position and angle of entry for resecting the humeral head. There are several drawbacks to the use of physical humeral cut guides. For example, physical humeral cut guides may need to be designed and manufactured specifically for each patient. This may add delay and expense. In another example, positioning a humeral cut guide during surgery may be relatively time consuming.
[0069] This disclosure describes devices and techniques associated with the use of mixed reality (MR) to aid in humeral head resection. In this disclosure, the term MR may be taken to encompass augmented reality (AR). For example, this disclosure describes a tracking structure that helps a tracking system determine a position of the humerus. This disclosure also describes a technique for identifying a position of a sawblade of an oscillating saw for purposes of providing guidance to a user (e.g., surgeon) while the user is using the oscillating saw to resect the humeral head. Additionally, this disclosure describes techniques for providing virtual guidance to the user to guide resection of the humeral head. The techniques of this disclosure may eliminate the drawbacks associated with physical humeral cut guides. The techniques of this disclosure may improve accuracy and ease of use of virtual guidance.
[0070] FIG. 1 is a conceptual diagram illustrating an example system 100 in which one or more techniques of this disclosure may be performed. In the example of FIG. 1, system 100 includes a processing system 102, a MR device 104, a humeral tracking structure 106, and a tracking system 116.
[0071] Humeral tracking structure 106 includes an attachment body 108 and a tracking marker 110. Humeral tracking structure 106 is attached to a humerus 112 using fixation members 114 that pass through apertures defined in attachment body 108. Fixation members 114 may include pins, screws, wires, or other items to attach attachment body 108 to humerus 112.
[0072] In the example of FIG. 1, tracking marker 110 is an optical marker having predefined optical patterns on different faces of a cube. For instance, tracking marker 110 may be a cube having different predefined optical patterns on each face other than a face connected to attachment body 108. In the example of FIG. 1, tracking marker 110 has 2-dimensional optical barcodes on different faces. Furthermore, in other examples, the faces of humeral tracking structure 106 have different predefined optical patterns, such as numbers. In other examples, tracking marker 110 has different polyhedral shapes. For instance, in other examples, tracking marker 110 may be a dodecahedron, pyramid, or another shape. In some examples, tracking marker 110 may be an ultrasonic emitter, an electromagnetic marker, a passive optical marker that reflects light, an active optical marker that emits light, and so on. In some examples, tracking marker 110 comprises a set of objects (e.g., balls, cubes, etc.) having predefined sizes and arranged in a predefined spatial configuration.
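For an optical marker of this kind, each face pattern can be detected in a camera image and its pose recovered by solving a perspective-n-point problem against the face's known geometry. The sketch below is one way to do this with OpenCV's ArUco module; it assumes OpenCV 4.7+ Python bindings, a 4x4 marker dictionary, calibrated camera intrinsics, and a 30 mm face size, none of which are specified by the disclosure.

```python
import cv2
import numpy as np

# Hypothetical camera intrinsics; in practice obtained by calibration.
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
DIST = np.zeros(5)

FACE_SIZE = 0.03  # assumed edge length of one face pattern, meters
half = FACE_SIZE / 2.0
# 3D corners of a face pattern in that face's own frame (z = 0 plane),
# ordered to match the detector's corner ordering.
OBJ_PTS = np.array([[-half,  half, 0.0],
                    [ half,  half, 0.0],
                    [ half, -half, 0.0],
                    [-half, -half, 0.0]])

detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50),
    cv2.aruco.DetectorParameters())

def face_poses(image):
    """Estimate the pose of each detected face in the camera frame."""
    corners, ids, _rejected = detector.detectMarkers(image)
    poses = {}
    if ids is None:
        return poses
    for face_corners, face_id in zip(corners, ids.flatten()):
        img_pts = face_corners.reshape(-1, 2).astype(np.float64)
        ok, rvec, tvec = cv2.solvePnP(OBJ_PTS, img_pts, K, DIST)
        if ok:
            poses[int(face_id)] = (rvec, tvec)  # face pose vs. camera
    return poses
```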
[0073] MR device 104 may use various visualization techniques to display MR visualizations to the user. A MR visualization may comprise one or more virtual objects that are viewable by a user at the same time as real-world objects. Thus, what the user sees may be a mixture of real and virtual objects. In such examples, MR device 104 may include a see-through display that
allows the user to directly see real objects. In other examples, MR device 104 may display an MR visualization comprising images that combine virtual objects and video of the real-world environment. Thus, in such examples, the user of MR device 104 does not directly see the real-world environment.
[0074] MR device 104 may comprise various types of devices for presenting MR visualizations. For instance, in some examples, MR device 104 may be a Microsoft HOLOLENS™ headset, such as the HOLOLENS 2 headset, available from Microsoft Corporation, of Redmond, Washington, USA, or a similar device, such as, for example, a MR visualization device that includes waveguides. The HOLOLENS™ device can be used to present 3D virtual objects via holographic lenses, or waveguides, while permitting a user to view actual objects in a real-world scene, i.e., in a real-world environment, through the holographic lenses. In some examples, MR device 104 may be a holographic projector, a head-mounted smartphone, a special-purpose MR visualization device, or another type of device for presenting MR visualizations. In some examples, MR device 104 includes a head-mounted unit that communicates with a separate device (e.g., a smartphone, personal computer, tablet computer, etc.) that performs at least some of the processing functionality of MR device 104. In other examples, all functionality of MR device 104 is performed by hardware residing in a head-mounted unit. Discussion in this disclosure of actions performed by processing system 102 may be performed by processors in MR device 104, one or more computing devices separate from MR device 104, or a combination of the one or more computing devices and MR device 104.
[0075] Tracking system 116 may include sensors 118 for tracking the positions of real-world physical objects. For example, tracking system 116 may include depth sensors, Red-Green-Blue (RGB) cameras, infrared sensors, and/or other types of sensors. In some examples, tracking system 116 is incorporated into MR device 104. In other examples, tracking system 116 is separate from MR device 104.
[0076] Processing system 102 may comprise one or more processing units located in one or more computing devices. The computing devices may include MR device 104, server computers, personal computers, smartphones, tablet computers, laptop computers, and other types of computing devices. In examples where processing system 102 includes one or more computing devices in addition to MR device 104, the computing devices may communicate with MR device 104 via one or more wired or wireless communication links.
[0077] A user, such as a surgeon, may wear MR device 104 during a surgery, such as a shoulder arthroplasty or other type of surgery. During the surgery, MR device 104 may display guidance
to the surgeon to help the user determine how to resect the humeral head of humerus 112. Resecting the humeral head may be part of a process to prepare humerus 112 for implantation of a humeral prosthesis. In different examples, the guidance may take different forms. For instance, in some examples, the guidance may include a virtual cut plane that shows the user how to position the blade of an oscillating saw during resection of the humeral head. In another example, the guidance may include a plurality of planned plane elements and a plurality of current plane elements. Each of the planned plane elements indicates a location on a planned cutting plane through a bone of a patient. Each of the current plane elements corresponds to one of the planned plane elements and indicates a location on a current cutting plane of the sawblade. In this example, the planned plane elements are not contiguous with each other, and the current plane elements are not contiguous with each other. This example is described in greater detail with respect to FIG. 21.
[0078] Processing system 102 performs a registration process to ensure that MR device 104 is able to display the guidance at the correct location relative to humerus 112. Virtual elements, such as virtual guidance for resection of the humeral head, may be defined and generated using a coordinate system, such as an (x, y, z) coordinate system. For ease of explanation, the coordinate system in which virtual elements are defined is referred to as a virtual coordinate system. For example, a virtual model of humerus 112 may be generated based on preoperative images of the patient. Points on the virtual model may be defined in the virtual coordinate system. During a preoperative planning process, a cut plane for resection of the humeral head may be defined in the virtual coordinate system so that the user can see the virtual cut plane relative to the virtual model of humerus 112. However, processing system 102 determines the locations of physical objects, including MR device 104 and humerus 112, in terms of a different coordinate system, which is referred to as a physical coordinate system. The physical coordinate system may be an (x, y, z) coordinate system, or another type of coordinate system. In some examples, a point on or within tracking marker 110 may be the origin of the physical coordinate system. For instance, a corner or a centroid of tracking marker 110 may be the origin of the physical coordinate system. The axes of the physical coordinate system may be aligned with edges of tracking marker 110. The registration process generates registration data that define a relationship between the physical coordinate system and the virtual coordinate system. [0079] Processing system 102 may use the registration data to determine how to position and orient virtual elements within the physical coordinate system so that MR device 104 is able to display the virtual elements, and so that the virtual elements appear to the user to be at the correct locations. For example, if a point on humerus 112 is defined as (xpi, ypi, zpi) in the
physical coordinate system and a corresponding point on a virtual model of humerus 112 is defined as (xvi, yvi, zvi), processing system 102 may use the registration data to determine that the virtual coordinate (xvi, yvi, zvi) corresponds to (xpi, ypi, zpi).
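In the common case where the registration data take the form of a rigid transform (a rotation R and a translation t), placing a virtual point into the physical scene is a single matrix-vector operation. A minimal sketch, with placeholder values:

```python
import numpy as np

def virtual_to_physical(R, t, p_virtual):
    """Map a point from the virtual coordinate system into the
    physical coordinate system using registration data (R, t):
    a 3x3 rotation matrix and a 3-vector translation."""
    return R @ p_virtual + t

# Placeholder example: place a preoperatively planned point into the
# physical scene for display. Values are illustrative only.
R = np.eye(3)
t = np.array([0.02, -0.01, 0.05])
planned_point_virtual = np.array([0.10, 0.25, 0.07])
planned_point_physical = virtual_to_physical(R, t, planned_point_virtual)
```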
[0080] To display virtual guidance to the user at the correct location relative to humerus 112, processing system 102 may need to accurately determine the current location of humerus 112. Humerus 112 may move during surgery. Thus, it cannot be assumed that any point on humerus 112 will retain the same coordinates in the physical coordinate system.
[0081] Tracking marker 110 helps processing system 102 determine the current location of humerus 112. Humeral tracking structure 106 is fixed to humerus 112 such that tracking marker 110 of humeral tracking structure 106 cannot move independently of humerus 112. Sensors 118 of tracking system 116 may sense the position of tracking marker 110 in three dimensions. Processing system 102 may thus use signals from sensors 118 to determine the position of tracking marker 110 in the physical coordinate system.
[0082] However, determining the position of tracking marker 110 in the physical coordinate system is not sufficient to determine the position of humerus 112 in the physical coordinate system. This is because of variations in the shapes of humeri and slight variations in how tracking structures may be fixed to humeri. Thus, a process is performed to determine the position of the humerus relative to tracking marker 110. During this process, a user may use a digitizer to palpate various parts of humerus 112. The digitizer may be a handheld stylus-shaped object to which a tracking marker is attached. For instance, the tracking marker may be attached to an end of the digitizer opposite a tip of the digitizer.
[0083] Palpating humerus 112 is a process of touching the tip of the digitizer to humerus 112. In this disclosure, palpating may refer to the act or process of touching the tip of the digitizer to an anatomical object, such as humerus 112. There is a fixed spatial relationship between the tracking marker of the digitizer and the tip of the digitizer. For example, the distance between the tracking marker of the digitizer and the tip of the digitizer may be exactly 10 centimeters. Thus, processing system 102 is able to use the fixed spatial relationship between the tracking marker of the digitizer and the tip of the digitizer, along with the 3-dimensional position of the tracking marker of the digitizer, to determine positions on the palpated portions of humerus 112 in the physical coordinate system. Thus, at this point, processing system 102 has data describing the 3-dimensional position of tracking marker 110 and concurrent 3-dimensional positions of the palpated portions of humerus 112 in the physical coordinate system. Processing system 102 may complete the registration process by determining a transform that maps the 3-dimensional
positions of the palpated portions of humerus 112 to corresponding portions of the virtual model of humerus 112.
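As a non-limiting illustration of the use of the fixed spatial relationship described above, the following Python sketch computes a palpated point in the physical coordinate system from the tracked pose of the digitizer's tracking marker; the offset direction and 100 mm length are assumptions for illustration only:

    import numpy as np

    # Fixed tip offset expressed in the digitizer marker's local frame; the
    # direction and 100 mm length are illustrative assumptions.
    TIP_OFFSET_MM = np.array([0.0, 0.0, -100.0])

    def digitizer_tip_position(marker_rotation, marker_position):
        # The palpated point in the physical coordinate system, from the tracked
        # pose (3x3 rotation, 3-vector position) of the digitizer's marker.
        return marker_rotation @ TIP_OFFSET_MM + marker_position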
[0084] Attachment body 108 of humeral tracking structure 106 may be attached at a bicipital groove of humerus 112. The bicipital groove of humerus 112 is a groove on humerus 112 that separates the greater tubercle from the lesser tubercle. The bicipital groove of humerus 112 is exposed as part of the process of preparing humerus 112 for implantation of a humeral prosthesis. Attaching attachment body 108 at the bicipital groove of humerus 112 may be advantageous because doing so may help to ensure that tracking marker 110 is out of the way of a sawblade during resection of the humeral head. Additionally, because attachment body 108 may sit at least partially within the bicipital groove, the walls of the bicipital groove may make it harder for attachment body 108 to slip during attachment of attachment body 108 to humerus 112.
[0085] It is useful to palpate the bicipital groove of humerus 112 when performing the registration process because the bicipital groove is a highly visible and distinct landmark on humerus 112. However, attaching attachment body 108 of humeral tracking structure 106 at the bicipital groove may prevent the user from palpating the bicipital groove. Thus, in accordance with a technique of this disclosure, attachment body 108 may define a slot having dimensions sufficient for palpation of the bicipital groove using a digitizer. The slot may have dimensions sufficient for palpation of the bicipital groove if the dimensions are large enough for a tip of the digitizer to contact one or more locations on the bone tissue of the bicipital groove. In some examples, the slot may have a width ranging from 4 millimeters (mm) to 10 mm. In some examples, the slot may have a length ranging from 50 mm to 70 mm. In other examples, the slot may have other dimensions. In some examples, the user palpates other areas of humerus 112, such as the humeral head and metaphysis, during the registration process.
[0086] In such examples, processing system 102 may receive first signals (e.g., video signals) from one or more sensors 118 of tracking system 116. Processing system 102 may determine, based on the first signals, first points corresponding to tracking marker 110 of humeral tracking structure 106. Humeral tracking structure 106 includes tracking marker 110 and attachment body 108, which is positioned at a bicipital groove of humerus 112 of a patient. Processing system 102 may also receive second signals (e.g., later video signals) from sensors 118 of tracking system 116. Processing system 102 may determine, based on the second signals, second points corresponding to a second tracking marker of a digitizer while a tip of the digitizer palpates the bicipital groove via a slot defined in attachment body 108 of humeral tracking structure 106. Processing system 102 may determine, based on the second points and
a predetermined spatial relationship between the second tracking marker and the tip of the digitizer, third points corresponding to the bicipital groove. Positions of the first, second and third points may be defined in a physical coordinate system. Processing system 102 may then generate, based on the third points and a virtual model of the humerus having points defined in a virtual coordinate system, registration data defining a relationship between the physical coordinate system and the virtual coordinate system.
[0087] In some examples, upon generating the registration data, processing system 102 may output, for display via MR device 104, a virtual bone model and/or a virtual planned cutting plane adjacent to or superimposed on humerus 112. The virtual bone model and/or virtual planned cutting plane may be rotated and spatially oriented based on the registration data to be rotated and oriented in the same way as humerus 112. The user may confirm the registration was successful based on the display. That is, if registration is successful and humeral tracking structure 106 is attached to the correct position on humerus 112, the rotation and spatial orientation of the virtual bone model as shown by MR device 104 are the same as the real rotation and spatial orientation of humerus 112. However, if humeral tracking structure 106 is not correctly positioned on humerus 112, the virtual bone model as shown by MR device 104 may appear to be rotated or oriented differently from humerus 112. Similarly, if humeral tracking structure 106 is not correctly positioned on humerus 112, the virtual planned cutting plane may be at an unreasonable angle relative to humerus 112. Correct positioning of humeral tracking structure 106 on humerus 112 may be challenging in some cases, especially cases of trauma or severe bone erosion. Accordingly, displaying the virtual bone model and/or the virtual planned cutting plane in this way may help assure the user that humeral tracking structure 106 was attached to humerus 112 correctly.
[0088] The user may use an oscillating saw 120 to resect the humeral head of humerus 112. Oscillating saw 120 has a sawblade 122. A tracking marker 124 is attached to a body of oscillating saw 120. The body of saw 120 may be a main section of saw 120 or another physical component of saw 120. Processing system 102 may use signals from tracking system 116 to determine a position of tracking marker 124 in the physical coordinate system. Tracking marker 124 may be at a fixed distance from the body of oscillating saw 120. Thus, processing system 102 may be able to track the position of oscillating saw 120 while the user is using oscillating saw 120 to resect the humeral head of humerus 112. By tracking the position of oscillating saw 120 while the user is using oscillating saw 120 to resect the humeral head of humerus 112, processing system 102 may be able to cause MR device 104 to display real-time guidance
regarding the position of oscillating saw 120 while the user is using the oscillating saw to resect the humeral head.
[0089] When resecting the humeral head of humerus 112 using oscillating saw 120, the position of the bottom of sawblade 122 defines what will become the top of the remaining portion of humerus 112. Different sawblades may have different thicknesses, and sawblade thicknesses may differ on the order of one or more millimeters. Such differences, if unaccounted for, may result in the user cutting humerus 112 at the wrong position.
[0090] This disclosure describes techniques for identifying a position of sawblade 122. For instance, in accordance with a technique of this disclosure, processing system 102 may receive signals from one or more sensors 118 of tracking system 116. Processing system 102 may determine, based on the signals, first points corresponding to tracking marker 124 attached to a body of saw 120. Additionally, processing system 102 may determine, based on the signals, second points corresponding to a second tracking marker of a tracking structure (illustrated elsewhere in this disclosure) while sawblade 122 of saw 120 is positioned in a recess defined by a support body of the tracking structure. The first points and the second points are defined in a physical coordinate system. Processing system 102 may generate, based on the first points and the second points, position identification data that specify a position of a lower edge of sawblade 122 relative to tracking marker 124. Processing system 102 may use the position identification data when generating guidance for display by MR device 104.
[0091] Additionally, this disclosure describes techniques for presenting virtual cut guidance. For example, processing system 102 may determine positions for planned plane elements and current plane elements. The planned plane elements indicate locations on a planned cutting plane through a bone, such as humerus 112 or another type of bone. The planned plane elements are not contiguous with each other. The current plane elements indicate locations on a current cutting plane of a sawblade of a saw. The current plane elements are not contiguous with each other. Processing system 102 may cause MR device 104 to concurrently display the planned plane elements at their determined positions and the current plane elements at their determined positions. For each pair of corresponding current plane elements and planned plane elements, at least one of the current or planned plane element of the pair has a visual property (e.g., color, texture, outlining pattern, etc.) based on the position for the current plane element of the pair relative to the planned cutting plane.
[0092] FIG. 2 is a flowchart illustrating an example process for MR-based navigation for surgical tasks associated with a humerus of a patient according to techniques of this disclosure. In the example of FIG. 2, after a user has attached humeral tracking structure 106 at a bicipital
groove of humerus 112, processing system 102 may perform a registration process (200). The registration process may involve determining a position of tracking marker 110 of humeral tracking structure 106, determining positions on humerus 112 based on positions of a digitizer that palpates humerus 112, and generating registration data that map positions in a virtual coordinate system to positions in a physical coordinate system. Additionally, processing system 102 may perform a sawblade position identification process (202). Processing system 102 may then perform a surgical navigation process (204). During the surgical navigation process, processing system 102 may cause MR device 104 to display virtual elements that guide the user to resect the humeral head of humerus 112. After the resection of the humeral head is complete, processing system 102 may perform an accuracy check to ensure that the humeral head was resected at the correct location (206).
[0093] FIG. 3 is a block diagram illustrating an example computing system in accordance with one or more techniques of this disclosure. In the example of FIG. 3, a computing system 300 includes processors 302, memory 304, a communication interface 306, and a display 308. In other examples, computing system 102 may include more, fewer, or different components. The components of computing system 102 may be in one or more computing devices. For example, processors 302 may be in a single computing device or distributed among multiple computing devices of computing system 102, memory 304 may be in a single computing device or distributed among multiple computing devices of computing system 102, and so on.
[0094] Processors 302 may be implemented in circuitry and include one or more microprocessors, digital signal processors (DSPs), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), hardware, or any combinations thereof. In general, processors 302 may be implemented as fixed-function circuits, programmable circuits, or a combination thereof. Fixed-function circuits refer to circuits that provide particular functionality and are preset as to the operations that can be performed. Programmable circuits refer to circuits that can be programmed to perform various tasks and provide flexible functionality in the operations that can be performed. For instance, programmable circuits may execute software or firmware that causes the programmable circuits to operate in the manner defined by instructions of the software or firmware. Fixed-function circuits may execute software instructions (e.g., to receive parameters or output parameters), but the types of operations that the fixed-function circuits perform are generally immutable. In some examples, one or more of the units may be distinct circuit blocks (fixed-function or programmable), and in some examples, the one or more units may be integrated circuits.
[0095] Processors 302 may include arithmetic logic units (ALUs), elementary function units (EFUs), digital circuits, analog circuits, and/or programmable cores, formed from programmable circuits. In examples where the operations of processors 302 are performed using software executed by the programmable circuits, memory 304 may store the object code of the software that processors 302 receive and execute, or another memory within processors 302 (not shown) may store such instructions. Examples of the software include software designed for surgical planning. Processors 302 may perform the actions ascribed in this disclosure to processing system 102.
[0096] Memory 304 may store various types of data used by processors 302. Memory 304 may include any of a variety of memory devices, such as dynamic random access memory (DRAM), including synchronous DRAM (SDRAM), magnetoresistive RAM (MRAM), resistive RAM (RRAM), or other types of memory devices. Examples of display 308 include a liquid crystal display (LCD), a plasma display, an organic light emitting diode (OLED) display, or another type of display device.
[0097] Communication interface 306 allows computing system 102 to output data and instructions to, and receive data and instructions from, MR device 104, a medical imaging system, or another device via one or more communication links or networks. Communication interface 306 may be hardware circuitry that enables computing system 102 to communicate (e.g., wirelessly or using wires) with other computing systems and devices, such as MR device 104. Example networks may include various types of communication networks including one or more wide-area networks, such as the Internet, local area networks, and so on. In some examples, the network may include wired and/or wireless communication links.
[0098] In the example of FIG. 3, memory 304 stores registration data 310 and plan data 312. Additionally, in the example of FIG. 3, memory 304 stores a registration system 316, a planning system 318, and a virtual guidance system 320. In other examples, memory 304 may store more, fewer, or different types of data or units. Moreover, the data and units illustrated in the example of FIG. 3 are provided for purposes of explanation and may not represent how data is actually stored or how software is actually implemented. Registration system 316, planning system 318, and virtual guidance system 320 may comprise instructions that are executable by processors 302. For ease of explanation, this disclosure may describe registration system 316, planning system 318, and virtual guidance system 320 as performing various actions when processors 302 execute instructions of registration system 316, planning system 318, and virtual guidance system 320.
[0099] Computing system 102 may receive, from tracking system 116, tracking input (e.g., signals) of a scene that includes humerus 112. Registration system 316 may generate registration data 310 that registers humeral tracking structure 106 with a coordinate system. Registration data 310 may define transforms between a virtual coordinate system and the physical coordinate system.
[0100] As part of performing the registration process, registration system 316 may obtain a first point cloud and a second point cloud. The first point cloud includes points on one or more virtual objects, such as a virtual object representing a surface of a bone. The second point cloud may include points on real-world objects, such as tracking marker 110 and tracking marker 124. The points in the first point cloud may be expressed in terms of coordinates in a virtual coordinate system and the points in the second point cloud may be expressed in terms of coordinates in a physical coordinate system. Because virtual objects may be designed with positions that are relative to one another but not relative to any real-world objects, the virtual and physical coordinate systems may be different.
[0101] Registration system 316 may generate the second point cloud using a Simultaneous Localization and Mapping (SLAM) algorithm. By performing the SLAM algorithm, registration system 316 may generate the second point cloud based on the tracking data. Registration system 316 may perform one of various implementations of SLAM algorithms, such as a SLAM algorithm having a particle filter implementation, an extended Kalman filter implementation, a covariance intersection implementation, a GraphSLAM implementation, an ORB-SLAM implementation, or another implementation. In some examples, registration system 316 applies an outlier removal process to remove outlying points in the first and/or second point clouds. In some examples, the outlying points may be points lying beyond a certain standard deviation threshold from other points in the point clouds. Applying outlier removal may improve the accuracy of the registration process.
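A minimal sketch of such a standard-deviation-based outlier removal process is shown below in Python; the threshold of two standard deviations is an illustrative assumption:

    import numpy as np

    def remove_outliers(points, k=2.0):
        # Drop points whose distance from the cloud's centroid exceeds the mean
        # distance plus k standard deviations; points is an N x 3 array.
        centroid = points.mean(axis=0)
        dists = np.linalg.norm(points - centroid, axis=1)
        return points[dists <= dists.mean() + k * dists.std()]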
[0102] In some examples, registration system 316 may generate a preliminary spatial relationship between points in the first point cloud and points in the second point cloud. For example, registration system 316 may perform an iterative closest point (ICP) algorithm to determine the preliminary spatial relationship between the points in the first point cloud and the points in the second point cloud. For instance, the ICP algorithm may determine a preliminary spatial relationship between points on a bone in the physical coordinate system and corresponding points in a model of the bone in the virtual coordinate system. The iterative closest point algorithm finds a combination of translational and rotational parameters that minimize the sum of distances between corresponding points in the first and second point
clouds. For example, consider a basic example where landmarks corresponding to points in the first point cloud are at coordinates A, B, and C and the same landmarks correspond to points in the second point cloud at coordinates A', B', and C'. In this example, the iterative closest point algorithm determines a combination of translational and rotational parameters that minimizes ΔA + ΔB + ΔC, where ΔA is the distance between A and A', ΔB is the distance between B and B', and ΔC is the distance between C and C'. To minimize the sum of distances between corresponding landmarks in the first and second point clouds, registration system 316 may perform the following steps:
1. For each point of the first point cloud, determine a corresponding point in the second point cloud. The corresponding point may be the closest point in the second point cloud. In this example, the first point cloud includes points corresponding to landmarks on one or more virtual objects and the second point cloud may include points corresponding to landmarks on real-world objects (e.g., tracking marker 110, tracking marker 124).
2. Estimate a combination of rotation and translation parameters using a root mean square point-to-point distance metric minimization technique that best aligns each point of the first point cloud to its corresponding point in the second point cloud.
3. Transform the points of the first point cloud using the estimated combination of rotation and translation parameters.
4. Iterate steps 1-3 using the transformed points of the first point cloud.
In this example, after performing an appropriate number of iterations, registration system 316 may determine rotation and translation parameters that describe a spatial relationship between the original positions of the points in the first point cloud and the final positions of the points in the first point cloud. The determined rotation and translation parameters can therefore express a mapping between the first point cloud and the second point cloud. Registration data 310 may include the determined rotation and translation parameters. In this way, registration system 316 may generate registration data 310.
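The following Python sketch illustrates steps 1-4 above. It uses a nearest-neighbor search for step 1 and a singular value decomposition (Kabsch) solve for the root-mean-square-minimizing rotation and translation of step 2. The sketch is a simplified illustration under these assumptions, not the implementation of registration system 316:

    import numpy as np
    from scipy.spatial import cKDTree

    def best_fit_transform(src, dst):
        # Kabsch/SVD solve: the rotation R and translation t minimizing the
        # root mean square point-to-point distance between paired points.
        src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
        H = (src - src_c).T @ (dst - dst_c)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:  # guard against a reflection solution
            Vt[-1] *= -1
            R = Vt.T @ U.T
        return R, dst_c - R @ src_c

    def icp(first_cloud, second_cloud, iterations=50):
        # Iterate steps 1-3: match each (transformed) first-cloud point to its
        # closest second-cloud point, estimate R and t, transform, and repeat.
        tree = cKDTree(second_cloud)
        src = first_cloud.copy()
        R_total, t_total = np.eye(3), np.zeros(3)
        for _ in range(iterations):
            _, idx = tree.query(src)                            # step 1
            R, t = best_fit_transform(src, second_cloud[idx])   # step 2
            src = src @ R.T + t                                 # step 3
            R_total, t_total = R @ R_total, R @ t_total + t
        return R_total, t_total  # parameters mapping cloud 1 onto cloud 2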
[0103] Plan data 312 may include information describing a plan for a user to follow with respect to a patient. In some examples, plan data 312 may include surgical planning data that describe a process to prepare for and conduct a surgery on the patient. For instance, plan data 312 may include data defining a cutting plane for resecting the humeral head of humerus 112. In some examples, plan data 312 may include other information describing a process to prepare
for and conduct the surgery on the patient. For instance, plan data 312 may include information defining a planned reaming axis and position for reaming the patient’s scapula, information defining a planned axis for inserting a surgical pin in a humerus for extracting a bone fragment to use as a bone graft, and other details of the surgery. In some examples, plan data 312 also includes medical images, e.g., x-ray images, computed tomography images or models, and so on.
[0104] Planning system 318 may enable a user to view plan data 312. For instance, planning system 318 may cause a display device (e.g., MR device 104, display 308, etc.) to output one or more graphical user interfaces that enable the user to see models of anatomic structures, prostheses, bone grafts, and so on. In some examples, planning system 318 may generate some or all of plan data 312 in response to input from the user. For example, planning system 318 may generate, based on user input, data defining a cutting plane for resecting the humeral head of humerus 112.
[0105] Furthermore, virtual guidance system 320 may cause MR device 104 to output virtual objects for display. For example, in accordance with the techniques of this disclosure, virtual guidance system 320 may determine positions for planned plane elements and current plane elements. The planned plane elements indicate locations on a planned cutting plane through a bone. The planned plane elements are not contiguous with each other. The current plane elements indicate locations on a current cutting plane of a sawblade of a saw. The current plane elements are not contiguous with each other. Virtual guidance system 320 may cause MR device 104 to concurrently display the planned plane elements at their determined positions and the current plane elements at their determined positions. For each pair of corresponding current plane elements and planned plane elements, at least one of the current or planned plane element of the pair has a visual property (e.g., color, texture, etc.) based on the position for the current plane element of the pair relative to the planned cutting plane.
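As a non-limiting illustration, a visual property for a current plane element may be selected from the signed distance of the element to the planned cutting plane, represented here as a point and a unit normal. In the following Python sketch, the color names and the 0.5 mm tolerance are illustrative assumptions:

    import numpy as np

    def element_color(element_pos, plane_point, plane_normal, tol_mm=0.5):
        # Signed distance from a current plane element to the planned cutting
        # plane; plane_normal is assumed to be unit length.
        signed_dist = float(np.dot(element_pos - plane_point, plane_normal))
        if abs(signed_dist) <= tol_mm:
            return "green"   # on the planned plane, within tolerance
        return "orange" if signed_dist > 0 else "blue"  # above vs. below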
[0106] FIG. 4 is a schematic representation of a MR device in accordance with one or more techniques of this disclosure. As shown in the example of FIG. 4, MR device 104 can include a variety of electronic components found in a computing system, including one or more processors 414 (e.g., microprocessors or other types of processing units) and memory 416 that may be mounted on or within a frame 418. In some examples, processors 302 may include processors 414 and/or memory 304 may include memory 416.
[0107] Furthermore, in the example of FIG. 4, MR device 104 may include a transparent screen 420 that is positioned at eye level when MR device 104 is worn by a user. In some examples, screen 420 can include one or more liquid crystal displays (LCDs) or other types of display
screens on which images are perceptible to a user who is wearing or otherwise using MR device 104 via screen 420. Other display examples include organic light emitting diode (OLED) displays. In some examples, MR device 104 can operate to project 3D images onto the user’s retinas using techniques known in the art.
[0108] In some examples, screen 420 includes see-through holographic lenses, sometimes referred to as waveguides, that permit a user to see real-world objects through (e.g., beyond) the lenses and also see holographic imagery projected into the lenses and onto the user’s retinas by displays, such as liquid crystal on silicon (LCoS) display devices, which are sometimes referred to as light engines or projectors, operating as an example of a holographic projection system within MR device 104. In other words, MR device 104 may include one or more see-through holographic lenses to present virtual images to a user. Hence, in some examples, MR device 104 can operate to project 3D images onto the user’s retinas via screen 420, e.g., formed by holographic lenses. In this manner, MR device 104 may be configured to present a 3D virtual image to a user within a real-world view observed through screen 420, e.g., such that the virtual image appears to form part of the real-world environment. In some examples, MR device 104 may be a Microsoft HOLOLENS™ headset, available from Microsoft Corporation, of Redmond, Washington, USA, or a similar device, such as, for example, a similar MR visualization device that includes waveguides. The HOLOLENS™ device can be used to present 3D virtual objects via holographic lenses, or waveguides, while permitting a user to view actual objects in a real-world scene, i.e., in a real-world environment, through the holographic lenses.
[0109] Although the example of FIG. 4 illustrates MR device 104 as a head-wearable device, MR device 104 may have other forms and form factors. For instance, in some examples, MR device 104 may be a handheld smartphone or tablet. In other examples, MR device 104 is supported by an armature that allows the user to move MR device 104 into and out of a position for viewing a patient’s anatomy without the user wearing MR device 104.
[0110] MR device 104 can also generate a user interface (UI) 422 that is visible to the user, e.g., as holographic imagery projected into see-through holographic lenses as described above. For example, UI 422 can include a variety of selectable widgets 424 that allow the user to interact with a MR system. Imagery presented by MR device 104 may include, for example, one or more 2D or 3D virtual objects. MR device 104 also can include a speaker or other sensory devices 426 that may be positioned adjacent the user’s ears. Sensory devices 426 can convey audible information or other perceptible information (e.g., vibrations) to assist the user of MR device 104.
[0111] MR device 104 can also include a transceiver 428 to connect MR device 104 to a network or a computing cloud, such as via a wired communication protocol or a wireless protocol, e.g., Wi-Fi, Bluetooth, etc. MR device 104 also includes a variety of sensors to collect sensor data, such as one or more optical sensors 430 and one or more depth sensors 432, mounted to, on or within frame 418. In some examples, optical sensor(s) 430 are operable to scan the geometry of the physical environment in which a user of MR device 104 is located (e.g., an operating room) and collect two-dimensional (2D) optical image data (either monochrome or color). Depth sensor(s) 432 are operable to provide 3D image data, such as by employing time of flight, stereo or other known or future-developed techniques for determining depth and thereby generating image data in three dimensions. Other sensors can include motion sensors 433 (e.g., inertial measurement unit (IMU) sensors, accelerometers, etc.) to assist with tracking movement.
[0112] Processing system 102 may receive tracking data from sensors of MR device 104. The tracking data may include data from optical sensors 430, depth sensors 432, motion sensors 433, and so on. Processing system 102 may process the tracking data so that geometric, environmental, textural, or other types of landmarks (e.g., corners, edges or other lines, walls, floors, objects) in the user’s environment or “scene” can be defined and movements within the scene can be detected. As an example, the various types of tracking data can be combined or fused so that the user of MR device 104 can perceive virtual objects that can be positioned, or fixed and/or moved within the scene. When a virtual object is fixed in the scene, the user can walk around the virtual object, view the virtual object from different perspectives, and manipulate the virtual object within the scene using hand gestures, voice commands, gaze line (or direction) and/or other control inputs. In some examples, processing system 102 may process the tracking data so that the user can position a 3D virtual object on an observed physical object in the scene and/or orient the 3D virtual object with other virtual objects displayed in the scene. In some examples, processing system 102 may process the tracking data so that the user can position and fix a virtual representation of the surgical plan (or other widget, image or information) onto a surface, such as a wall of the operating room. In some examples, computing system 102 may use the tracking data to recognize surgical instruments and determine the positions of those surgical instruments.
[0113] MR device 104 may include one or more processors 414 and memory 416, e.g., within frame 418 of MR device 104. In some examples, one or more external computing resources 436 process and store information, such as sensor data, instead of or in addition to in-frame processor(s) 414 and memory 416. For example, external computing resources 436 may
include processing circuitry, memory, and/or other computing resources of computing system 102 (FIG. 1). In this way, data processing and storage may be performed by one or more processors 414 and memory 416 within MR device 104 and/or some of the processing and storage requirements may be offloaded from MR device 104. Hence, in some examples, one or more processors that control the operation of MR device 104 may be within MR device 104, e.g., as processor(s) 414. Alternatively, in some examples, at least one of the processors that controls the operation of MR device 104 may be external to MR device 104, e.g., as part of external computing resources 436. Likewise, operation of MR device 104 may, in some examples, be controlled in part by a combination of one or more processors 414 within the visualization device and one or more processors external to MR device 104.
[0114] In some examples, processing of tracking data can be performed by processor(s) 414 in conjunction with memory 416 or memory 304. In some examples, processor(s) 414 and memory 416 mounted to frame 418 may provide sufficient computing resources to process the tracking data collected by optical sensor(s) 430, depth sensor(s) 432 and motion sensors 433. In some examples, the tracking data can be processed using a Simultaneous Localization and Mapping (SLAM) algorithm, or other algorithms for processing and mapping 2D and 3D image data and tracking the position of MR device 104 in the 3D scene. In some examples, image tracking may be performed using sensor processing and tracking functionality provided by the Microsoft HOLOLENS™ system, e.g., by one or more sensors and processors 414 within a MR device 104 substantially conforming to the Microsoft HOLOLENS™ device or a similar mixed reality (MR) visualization device.
[0115] In some examples, system 100 can also include user-operated control device(s) 434 that allow the user to operate MR device 104, use MR device 104 in spectator mode (either as master or observer), interact with UI 422 and/or otherwise provide commands or requests to processor(s) 414 or other systems connected to a network. As examples, control device(s) 434 can include a microphone, a touch pad, a control panel, a motion sensor or other types of control input devices with which the user can interact.
[0116] FIG. 5A and FIG. 5B are schematic representations of an example attachment body 500 of a humeral tracking structure according to techniques of this disclosure. FIG. 5 A shows a non-contact side of attachment body 500. The non-contact side of attachment body 500 is a side of attachment body 500 that does not contact humerus 112 when attachment body 500 is attached to humerus 112. FIG. 5B shows a lateral view of attachment body 500. The lateral view shown in FIG. 5B is rotated 90° from the view shown in FIG. 5A. Attachment body 500 defines fixation apertures 502A, 502B, 502C, 502D, and 502E (collectively, “fixation apertures
502”). Fixation members (e.g., pins, wires, or screws, etc.) may be passed through fixation apertures 502 to affix attachment body 500 to humerus 112. In some examples, the fixation members could intrude into cancellous bone, e.g., cancellous bone of humerus 112. In examples in which a humeral prosthesis includes a portion, e.g., a stem or peg, configured to be placed within the cancellous bone for press-fit fixation, the fixation members could impede the implantation process if the fixation members intrude into the cancellous bone of the humerus. In some examples, the fixation members may be sized such that the fixation members do not intrude into cancellous bone of humerus 112. In some examples, the fixation members may include depth stop elements to prevent intrusion into cancellous bone of humerus 112. In examples in which the fixation elements are screws, the user may use a screwdriver to tighten the screws to affix attachment body 500 to humerus 112. Fixation aperture 502A and fixation aperture 502B lead to fixation aperture 502C. The two fixation apertures 502A, 502B leading to the single fixation aperture 502C may enable attachment body 500 to be used for either a left or a right humerus. Fixation aperture 502D leads to fixation aperture 502E. Attachment body 500 defines a slot 504 having dimensions sufficient for palpation of the bicipital groove using a handheld digitizer.
[0117] FIG. 6A and FIG. 6B are schematic representations of an example attachment body 600 of a humeral tracking structure according to techniques of this disclosure. FIG. 6A shows a non-contact side of attachment body 600. The non-contact side of attachment body 600 is a side of attachment body 600 that does not contact humerus 112 when attachment body 600 is attached to humerus 112. FIG. 6B shows a lateral view of attachment body 600. The lateral view shown in FIG. 6B is rotated 90° from the view shown in FIG. 6A. Attachment body 600 defines fixation apertures 602A, 602B, 602C, 602D, and 602E (collectively, “fixation apertures 602”). Fixation members (e.g., pins, wires, or screws, etc.) may be passed through fixation apertures 602 to affix attachment body 600 to humerus 112. In some examples, the fixation members may be sized such that the fixation members do not intrude into cancellous bone of humerus 112. In examples in which the fixation elements are screws, the user may use a screwdriver to tighten the screws to affix attachment body 600 to humerus 112. Fixation aperture 602A and fixation aperture 602B lead to a single fixation aperture (not shown) on the contact side of attachment body 600. The two fixation apertures 602A, 602B leading to the single fixation aperture may enable attachment body 600 to be used for either a left or a right humerus. Fixation aperture 602D leads to fixation aperture 602E.
[0118] Attachment body 600 defines a slot 604 having dimensions sufficient for palpation of the bicipital groove using a handheld digitizer. In the
example of FIG. 6A and FIG. 6B, a distal end of slot 604 is open-ended. That is, attachment body 600 includes a first prong 606A on a first side of slot 604 and a second prong 606B on a second side of slot 604. Each of first prong 606A and second prong 606B defines a respective aperture (fixation apertures 602C, 602D) sized to accommodate fixation members for attachment of attachment body 600 to humerus 112.
[0119] FIG. 7A and FIG. 7B are schematic representations of an example attachment body 700 of a humeral tracking structure according to techniques of this disclosure. Attachment body 700 is similar to attachment body 600, except that attachment body 700 has greater curvature.
[0120] FIG. 7C and FIG. 7D are schematic representations of a first alternative version of attachment body 700. In the example of FIG. 7C and 7D, prongs 706A and 706B are angled relative to one another. FIG. 7E and FIG. 7F are schematic representations of a second alternative version of attachment body 700. In the example of FIG. 7E and 7F, prongs 706A and 706B are parallel and meet at a curved structure. FIG. 7G and FIG. 7H are schematic representations of a third alternative version of attachment body 700. FIG. 7I and FIG. 7J are schematic representations of a fourth alternative version of attachment body 700, according to techniques of this disclosure. FIG. 7K, FIG. 7L, and FIG. 7M are schematic representations of a fifth alternative version of attachment body 700, according to techniques of this disclosure. As shown in the example of FIG. 7K, depth-stop screws 702 may be used to attach attachment body 700 to humerus 112. In FIGS. 7K, 7L, and 7M, prongs 706A and 706B have a textured area 708 that may improve a user’s ability to handle attachment body 700.
[0121] FIG. 8 is a conceptual diagram illustrating an example view of registering a humeral head 802 according to techniques of this disclosure. A user may use a digitizer 800 to palpate humeral head 802 of humerus 112. Digitizer 800 includes a tracking marker 804. MR device 104 may display virtual instructions 806 instructing the user which part of humerus 112 to palpate. Additionally, a ring-shaped virtual element 810 indicates an amount of progress toward completion of palpating humeral head 802. Palpation of humeral head 802 may be complete when registration system 316 determines a sufficient quantity of points on humeral head 802. In some examples, tubercles of the humerus are not palpated during registration.
[0122] FIG. 9 is a conceptual diagram illustrating an example view of registering a humeral metaphysis 900 according to techniques of this disclosure. A user may use digitizer 800 to palpate humeral metaphysis 900 of humerus 112. MR device 104 may display virtual instructions 906 instructing the user which part of humerus 112 to palpate. Additionally, a ring-shaped virtual element 910 indicates an amount of progress toward completion of palpating humeral metaphysis 900. Palpation of humeral metaphysis 900 may be complete when registration system 316 determines a sufficient quantity of points on humeral metaphysis 900.
[0123] FIG. 10 is a conceptual diagram illustrating an example view of registering a bicipital groove 1000 according to techniques of this disclosure. A user may use digitizer 800 to palpate bicipital groove 1000 of humerus 112 via a slot defined in attachment body 108 of humeral tracking structure 106. MR device 104 may display virtual instructions 1006 instructing the user which part of humerus 112 to palpate. Additionally, a ring-shaped virtual element 1010 indicates an amount of progress toward completion of palpating bicipital groove 1000. Palpation of bicipital groove 1000 may be complete when registration system 316 determines a sufficient quantity of points on bicipital groove 1000. In some examples, palpation of bicipital groove 1000 does not include an upper end of bicipital groove 1000.
[0124] FIG. 11 is a conceptual diagram illustrating validation of registration of humeral head 802 according to techniques of this disclosure. In the example of FIG. 11, MR device 104 displays a virtual element 1100 at a location on humeral head 802. The user is prompted to position digitizer 800 at a location on humeral head 802 indicated by virtual element 1100. Processing system 102 tracks the positions of digitizer 800 and humerus 112 based on the positions of tracking marker 804 and tracking marker 110, respectively. Based on the positions of digitizer 800 and humerus 112, processing system 102 may determine a distance from a tip of digitizer 800 to the location on humeral head 802 indicated by virtual element 1100. MR device 104 displays a virtual element 1102 that indicates the distance from the tip of digitizer 800 to the location on humeral head 802 indicated by virtual element 1100. If the distance is non-zero when the tip of digitizer 800 is in contact with the location on humeral head 802 indicated by virtual element 1100, the registration data may be inaccurate. Accordingly, if the distance is non-zero when the tip of digitizer 800 is in contact with the location on humeral head 802 indicated by virtual element 1100, processing system 102 may perform one or more actions to refine registration data 310 or generate new registration data. For example, if the distance is non-zero when the tip of digitizer 800 is in contact with the location on humeral head 802, processing system 102 may restart the registration process. If the registration process is not successful, MR device 104 does not display virtual guidance.
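A minimal sketch of such a distance check is shown below in Python. The 1 mm tolerance is an illustrative assumption; as described above, the distance is ideally zero when the tip contacts the indicated location:

    import numpy as np

    REVALIDATE_THRESHOLD_MM = 1.0  # illustrative tolerance, not from this disclosure

    def check_registration(tip_position, indicated_point):
        # Distance from the digitizer tip to the indicated surface location, and
        # whether it exceeds the tolerance so registration should be refined.
        distance = float(np.linalg.norm(tip_position - indicated_point))
        return distance, distance > REVALIDATE_THRESHOLD_MM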
[0125] FIG. 12 is a conceptual diagram illustrating validation of registration of humeral head 802 according to techniques of this disclosure. MR device 104 may display virtual elements at multiple points on humeral head 802 to validate registration. For instance, in addition to displaying virtual element 1100 at a superior point of humeral head 802, MR device 104 may
display virtual element 1200 at a most-medial point of humeral head 802. Similar to FIG. 11, MR device 104 may display a virtual element 1202 that indicates a distance of the tip of digitizer 800 to the point indicated by virtual element 1200. In some examples, MR device 104 changes the color of virtual elements 1100, 1200 based on the determined distance of the tip of digitizer 800 to the indicated point. For instance, the virtual element may be blue if the distance is non-zero and green if the distance is zero.
[0126] FIG. 13 is a conceptual diagram illustrating validation of registration of a humeral metaphysis according to techniques of this disclosure. Similar to FIG. 11 and FIG. 12, MR device 104 may display a virtual element 1302 at a location on the humeral metaphysis 1300 of humerus 112. MR device 104 may display a virtual element 1304 that indicates a distance of a tip of digitizer 800 to the location on humeral metaphysis 1300.
[0127] FIG. 14 is a flowchart illustrating an example registration operation according to techniques of this disclosure. In the example of FIG. 14, processing system 102 may receive first signals from one or more sensors 118 of tracking system 116 (1400). Sensors 118 may include video and/or depth cameras and the signals may include video signals and depth image signals. In some examples, tracking system 116 is included in MR device 104.
[0128] Registration system 316 may determine, based on the first signals, first points corresponding to tracking marker 110 of humeral tracking structure 106 (1402). Humeral tracking structure 106 includes tracking marker 110 and an attachment body 108 positioned at a bicipital groove of humerus 112 of a patient. In some examples, attachment body 108 defines two or more apertures (e.g., apertures 502, 602) sized to accommodate fixation members for attachment of attachment body 108 to humerus 112. Furthermore, in some examples, attachment body 108 is manufactured to have a shape that is specific to humerus 112 of the patient. In other examples, attachment body 108 is not specific to any patient but may be generic for all patients, or attachment body 108 may have a limited range of two or more sizes. In some examples, attachment body 108 and tracking marker 110 are one physical unit, or attachment body 108 and tracking marker 110 are separate units assembled together.
[0129] In some examples, the first signals may comprise images of optical patterns on two or more faces of tracking marker 110. Each of the faces of tracking marker 110 may have a different optical pattern. Registration system 316 may determine positions of vertices of tracking marker 110 based on the images of the optical patterns. For instance, registration system 316 may use a SLAM algorithm to determine the positions of the vertices of tracking marker 110.
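Although this disclosure describes determining marker vertices using a SLAM algorithm, the general idea of recovering a marker pose from an imaged optical pattern can also be illustrated with a perspective-n-point solve. The following Python sketch uses OpenCV's solvePnP; the marker corner geometry, detected image corners, and camera intrinsics are all assumptions, and this is an alternative illustration rather than the method of registration system 316:

    import numpy as np
    import cv2

    # Known 3D corner positions of one marker face in the marker's own frame,
    # in millimeters; the 40 mm marker size is an illustrative assumption.
    MARKER_CORNERS_3D = np.array([[-20.0, -20.0, 0.0], [20.0, -20.0, 0.0],
                                  [20.0, 20.0, 0.0], [-20.0, 20.0, 0.0]])

    def marker_pose(corners_2d, camera_matrix, dist_coeffs):
        # Estimate the marker's rotation and translation relative to the camera
        # from the four detected pattern corners in one video frame.
        ok, rvec, tvec = cv2.solvePnP(MARKER_CORNERS_3D, corners_2d,
                                      camera_matrix, dist_coeffs)
        if not ok:
            raise RuntimeError("pose estimation failed")
        R, _ = cv2.Rodrigues(rvec)  # rotation vector to 3x3 rotation matrix
        return R, tvec.reshape(3)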
[0130] In some examples, prior to determining the first points, registration system 316 may cause MR device 104 to display instructions to position humeral tracking structure 106 at the bicipital groove of humerus 112 and to insert fixation elements 114 through apertures of attachment body 108 into humerus 112 to attach humeral tracking structure 106 to humerus 112.
[0131] Additionally, processing system 102 may receive second signals from sensors 118 of tracking system 116 (1404). Sensors 118 may include video and/or depth cameras and the second signals may include video signals and depth image signals from a time later than the first signals.
[0132] Registration system 316 may determine, based on the second signals, second points corresponding to tracking marker 804 of digitizer 800 while a tip of digitizer 800 palpates the bicipital groove via a slot (e.g., slot 504, 604) defined in attachment body 108 of humeral tracking structure 106 (1406). For example, the second signals may comprise images of optical patterns on two or more faces of tracking marker 804. Each of the faces of tracking marker 804 may have a different optical pattern. Registration system 316 may determine positions of vertices of tracking marker 804 based on the images of the optical patterns. For instance, registration system 316 may use a SLAM algorithm to determine the positions of the vertices of tracking marker 804.
[0133] Registration system 316 may determine, based on the second points and a predetermined spatial relationship between the second tracking marker and the tip of the digitizer, third points corresponding to the bicipital groove (1408). Positions of the first, second and third points are defined in a physical coordinate system. In some examples, registration system 316 may cause MR device 104 to display instructions 1006 to palpate the bicipital groove with the digitizer 800, e.g., as shown in FIG. 10.
[0134] Registration system 316 may generate, based on the third points and a virtual model of the humerus having points defined in a virtual coordinate system, registration data 310 defining a relationship between the physical coordinate system and the virtual coordinate system (1410). Registration system 316 may use the process described above with reference to FIG. 2 to generate registration data 310.
[0135] In some examples, registration system 316 may generate registration data 310 based on additional information. For instance, registration system 316 may receive third signals from sensors 118 of tracking system 116 and determine, based on the third signals, positions of fourth points in the physical coordinate system corresponding to tracking marker 804 while the tip of digitizer 800 palpates a humeral head of humerus 112. Additionally, registration system
316 may receive fourth signals from sensors 118 of tracking system 116 and determine, based on the fourth signals, positions of fifth points in the physical coordinate system corresponding to tracking marker 804 while the tip of digitizer 800 palpates a metaphysis of humerus 112. Registration system 316 may then generate registration data 310 based on the third points, the fourth points, the fifth points, and the virtual model of the humerus, e.g., using an ICP algorithm.
[0136] In some examples, registration system 316 may perform a process (e.g., as shown in the examples of FIG. 11 to FIG. 13) to validate the registration. In such examples, registration system 316 may receive third signals from sensors 118 of tracking system 116 after generating registration data 310. Registration system 316 may determine, based on registration data 310 and the third signals, a distance between a location on a surface of humerus 112 and the tip of digitizer 800. Registration system 316 may cause MR device 104 to display a first virtual element (e.g., virtual element 1100, 1200, 1302) at the location on the surface of humerus 112 and to display a second virtual element (e.g., virtual element 1102, 1202, 1304) indicating the determined distance of a tip of digitizer 800 to the location on the surface of humerus 112. Registration system 316 may refine registration data 310 based on the determined distance being greater than a threshold while the tip of the digitizer is positioned at the location on the surface of the humerus.
[0137] Furthermore, in some examples, processing system 102 may determine, based on registration data 310, points in the physical coordinate system corresponding to points of a virtual element defined in the virtual coordinate system. Virtual guidance system 320 may cause MR device 104 to display a virtual element (e.g., virtual elements 1100, 1200, 1302, virtual elements for resecting a portion of the humerus, etc.) such that the virtual element appears to a user to be located at the determined points in the physical coordinate system.
[0138] In some examples, in addition to performing a process to validate the registration, processors 302 of computing system 300 may be configured to control communication interface 306 to output instructions to MR device 104 to display, via user interface 422, a virtual bone model (not depicted) and a virtual planned cutting plane (not depicted) adjacent to a bone, e.g., humerus 112, based on registration data 310. The virtual bone model and the virtual planned cutting plane may be rotated to correspond to the position of humerus 112. The user may determine whether the registration process was successful based on the display of the virtual bone model and virtual planned cutting plane adjacent to humerus 112. In examples in which the user confirms the registration process was successful, the user may proceed with a next step of the surgery, e.g., the arthroplasty. In some examples, the registration process may
be unsuccessful due to an incorrect placement of a tracking marker, e.g., tracking marker 110. In examples in which the user determines the registration process was unsuccessful, the user may determine to restart the registration process or otherwise update registration data 310.
[0139] FIG. 15 is a conceptual diagram illustrating an example of identifying a position of sawblade 122 according to techniques of this disclosure. In the example of FIG. 15, oscillating saw 120 includes a sawblade 122. A tracking marker 124 is attached to a body of saw 120.
[0140] A tracking structure 1500 includes a support body 1502 and a tracking marker 1504. Tracking marker 1504 is connected to support body 1502. Support body 1502 defines a recess to accommodate sawblade 122. In some examples, the recess is a rectangular opening that passes through support body 1502. In some examples, the recess is an indented region on a lower surface of support body 1502. The lower surface of support body 1502 may be a surface of support body 1502 opposite the surface to which tracking marker 1504 is attached. In some examples, such as the example of FIG. 15, the recess is an indented region on an upper surface of support body 1502. The upper surface of support body 1502 may be the surface of support body 1502 to which tracking marker 1504 is attached. In some examples, support body 1502 defines multiple recesses, such as multiple slots. Each of the slots may correspond to a different sawblade thickness.
[0141] As discussed above, when saw 120 is used to resect the humeral head, the bottom of sawblade 122 is in contact with the remaining portion of humerus 112, and consequently defines the surface of the remaining portion of humerus 112. Thus, in order to guide the user with respect to the placement of sawblade 122, registration system 316 determines a spatial relationship between tracking marker 124 and the bottom of sawblade 122. To do so, a user inserts sawblade 122 into the recess defined in support body 1502 of tracking structure 1500. Because the spatial relationship between tracking marker 1504 and sawblade 122 is predefined and because the spatial relationship between tracking marker 1504 and the recess is predefined, registration system 316 may determine the spatial relationship between tracking marker 124 and the bottom of sawblade 122. For instance, in the example of FIG. 15, if sawblade 122 is thicker, tracking marker 1504 and tracking marker 124 are farther from one another vertically. If sawblade 122 is thinner, tracking marker 1504 and tracking marker 124 are closer to each other vertically.
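As a non-limiting illustration, the following Python sketch derives such position identification data from the concurrently tracked poses of tracking marker 124 and tracking marker 1504, assuming a predefined offset from tracking marker 1504 to the floor of the recess on which the lower edge of sawblade 122 rests; the numeric offset is hypothetical:

    import numpy as np

    # Predefined offset from tracking marker 1504 to the floor of the recess,
    # expressed in the marker-1504 frame (hypothetical values, in millimeters).
    RECESS_FLOOR_OFFSET = np.array([0.0, 0.0, -12.0])

    def blade_lower_edge_in_saw_frame(R_124, p_124, R_1504, p_1504):
        # Express the lower edge of sawblade 122 (resting on the recess floor)
        # in the frame of tracking marker 124, so the lower edge can later be
        # recovered from marker 124 alone after tracking structure 1500 is removed.
        lower_edge_world = R_1504 @ RECESS_FLOOR_OFFSET + p_1504
        return R_124.T @ (lower_edge_world - p_124)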
[0142] FIG. 16 is a conceptual diagram illustrating an example of identifying a position of sawblade 122 according to techniques of this disclosure. In the example of FIG. 16, MR device 104 displays a virtual element 1600 that guides the user where to place support body 1502 during a process of identifying a position of sawblade 122. Thus, registration system 316 may
cause MR device 104 to display, as instructions for positioning sawblade 122 in a recess of support body 1502, a virtual representation (e.g., virtual element 1600) of one or more surfaces of support body 1502 at a position along a lateral side (non-tip side) of sawblade 122.
[0143] FIG. 17A is a conceptual diagram illustrating a profile view of an example tracking structure 1500 according to techniques of this disclosure. In the example of FIG. 17A, support body 1502 defines a recess 1700 into which sawblade 122 may be positioned during the process of identifying the position of sawblade 122.
[0144] FIG. 17B is a conceptual diagram illustrating a profile view of an example tracking structure 1500 according to techniques of this disclosure. In the example of FIG. 17B, support body 1502 is shaped to define a first recess 1750 into which sawblade 122 may be positioned during the process of identifying the position of sawblade 122. Additionally, support body 1502 is shaped to define a second recess 1752 into which sawblade 122 may be positioned during a step of confirming the position of sawblade 122.
[0145] FIG. 18 is a conceptual diagram illustrating an example of identifying a spatial orientation of sawblade 122 according to techniques of this disclosure. The process of FIG. 18 for identifying a position of sawblade 122 may be used with respect to a surgery to resect a humeral head of humerus 112 or with respect to any other bone or organ. Ensuring that sawblade 122 is correctly angled with respect to saw 120 may be another important factor when providing virtual guidance on how to make a bone cut. For example, if sawblade 122 is angled upward or downward relative to saw 120, and this is not accounted for, it may be possible for MR device 104 to indicate cut guidance at the wrong position. Additionally, sawblade 122 (or tracking marker 124) may be rotated relative to the body of saw 120. If this rotation is not properly accounted for, it may be possible for MR device 104 to indicate cut guidance at the wrong angle.
[0146] Accordingly, in the example of FIG. 18, while sawblade 122 is positioned in the recess of support body 1502, MR device 104 may display a set of expected plane elements 1800A, 1800B, 1800C, and 1800D (collectively, “expected plane elements 1800”). Registration system 316 may determine the positions of expected plane elements 1800 based on the position of tracking marker 124. Expected plane elements 1800 are termed “expected” because they indicate a plane in which sawblade 122 would be expected to operate (i.e., an expected operating plane of sawblade 122), given the position of tracking marker 124.
[0147] In addition to expected plane elements 1800, MR device 104 may display a set of current plane elements 1802A, 1802B, 1802C, and 1802D (collectively, “current plane
elements 1802”). Registration system 316 may determine the positions of current plane elements 1802 based on the position of tracking marker 1504 of tracking structure 1500.
[0148] In some examples, for each of current plane elements 1802, MR device 104 may change the color (or some other attribute such as texture) of the current plane element depending on whether the current plane element is above or below its corresponding expected plane element. In some examples, for each pair of corresponding current plane elements and planned plane elements, at least one of the current or planned plane element of the pair has a visual property (e.g., color, texture, etc.) based on the position for the current plane element of the pair relative to the expected plane of sawblade 122. In the example of FIG. 18, current plane element 1802A is below its corresponding expected plane element 1800A, so current plane element 1802A is a first color. Current plane element 1802B is above its corresponding expected plane element 1800B, so current plane element 1802B is a second different color. Current plane elements 1802 may have a third color (e.g., the same color as expected plane markers 1800 or another color) if current plane elements 1802 are at the same position as their corresponding expected plane elements 1800. Thus, a user may be able to determine, based on the colors of current plane elements 1802 whether tracking structure 1500 is aligned with the plane in which sawblade 122 would be expected to operate. Tracking structure 1500 might not be aligned with the expected operating plane of sawblade 122 because sawblade 122 is not correctly positioned in the recess of tracking structure 1500, because sawblade 122 itself is not in the expected operating plane of sawblade 122, or both. In any of these scenarios, the user may be able to use expected plane elements 1800 and current plane elements 1802 to make adjustments.
[0149] In the example of FIG. 18, expected plane elements 1800 and current plane elements 1802 have crescent shapes. In other examples, expected plane elements 1800 and current plane elements 1802 have other shapes. For example, expected plane elements 1800 and current plane elements 1802 may have circular shapes, linear shapes, angled shapes, square shapes, and so on. Furthermore, in different examples, expected plane elements 1800 and current plane elements 1802 may have 2-dimensional or 3-dimensional shapes.
[0150] FIG. 19A is a conceptual diagram illustrating an example of identifying a position of sawblade 122 according to techniques of this disclosure. In the example of FIG. 19A, support body 1502 of tracking structure 1500 is positioned at a tip of sawblade 122 instead of alongside sawblade 122. The design of support body 1502 is an alternative to the design shown in FIGS. 15-17. In some examples, registration system 316 may cause MR device 104 to display instructions to position sawblade 122 in a recess of support body 1502. For instance, registration system 316 may cause MR device 104 to instruct the user to position a tip of
sawblade 122 in a slot-shaped recess 1902. In some examples, recess 1902 of support body 1502 is closed-ended such that sawblade 122 cannot be inserted completely through support body 1502. In other examples, recess 1902 of support body 1502 is open-ended such that sawblade 122 can be inserted completely through support body 1502.
[0151] In some examples, support body 1502 of tracking structure 1500 defines a second slot-shaped recess. The user may position the tip of sawblade 122 in the second recess. The second recess has a predefined depth and a predefined distance from marker 1504. When the tip of sawblade 122 is positioned in the second recess, registration system 316 may use the predefined depth and the predefined distance, along with the position of marker 1504, to confirm the position of the tip of sawblade 122. The second recess may be on the same side or a different side of support body 1502 as recess 1902.
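As a rough illustration of how a predefined depth and distance could be combined with the marker pose to confirm a tip position, consider the sketch below. The marker pose (rotation matrix R and translation t), the recess offset, and the recess axis are all assumed inputs for illustration, not values from this disclosure.

```python
import numpy as np

def expected_tip_position(R, t, recess_offset_local, recess_axis_local, recess_depth):
    """Compute where the sawblade tip should be, in physical coordinates,
    when it is fully seated in the second recess. The recess opening and
    axis are expressed in the marker's local frame."""
    opening_world = t + R @ recess_offset_local
    axis_world = R @ (recess_axis_local / np.linalg.norm(recess_axis_local))
    return opening_world + recess_depth * axis_world

# Example with an identity marker pose (units in meters)
tip = expected_tip_position(np.eye(3), np.array([0.10, 0.05, 0.30]),
                            recess_offset_local=np.array([0.02, 0.0, -0.01]),
                            recess_axis_local=np.array([0.0, 0.0, -1.0]),
                            recess_depth=0.015)
print(tip)  # expected tip location; compare against the tracked tip to confirm
```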
[0152] FIG. 19B is a conceptual diagram illustrating an example tracking structure 1920 for identifying the position of a sawblade according to techniques of this disclosure. In the example of FIG. 19B, tracking structure 1920 includes a support body 1922 and a tracking marker 1924. Support body 1922 defines a slot 1926 into which a tip of sawblade 122 may be inserted during the process of identifying a position of sawblade 122. FIG. 19C is a conceptual diagram illustrating a bottom view of tracking structure 1920. Slot 1926 may be open-ended or closed-ended.
[0153] FIG. 19D is a conceptual diagram illustrating an alternative tracking structure 1940 for identifying the position of a sawblade according to techniques of the disclosure. In the example of FIG. 19D, tracking structure 1940 includes a support body 1942 and a tracking marker 1944. Support body 1942 defines a slot 1946 into which a side of sawblade 122 may be inserted during the process of identifying a position of sawblade 122. Support body 1942 defines a verification slot 1948 into which the side of sawblade 122 may be inserted to confirm the position of sawblade 122. Slot 1946 and verification slot 1948 may be defined on the same side of support body 1942. In the example shown in FIG. 19D, slot 1946 and slot 1948 are formed parallel with one another and generally parallel with the major outer surfaces of support body 1942.
[0154] FIG. 19E is a conceptual diagram illustrating a second alternative tracking structure 1960 for identifying the position of a sawblade according to techniques of the disclosure. In the example of FIG. 19E, tracking structure 1960 includes a support body 1962 and a tracking marker 1964. Support body 1962 defines a slot 1966 into which a side of sawblade 122 may be inserted during the process of identifying a position of sawblade 122. Support body 1962 defines a verification slot 1968 into which the side of sawblade 122 may be inserted to confirm
the position of sawblade 122. Slot 1966 and verification slot 1968 may be defined diagonally relative to a top and bottom surface of support body 1962.
[0155] FIG. 20 is a flowchart illustrating an example operation for identifying a position of sawblade 122 for MR-based guidance according to techniques of this disclosure. In the example of FIG. 20, registration system 316 may receive first signals from one or more sensors 118 of tracking system 116 (2000). The first signals may include RGB images, depth images, etc. In some examples, prior to receiving the first signals, registration system 316 may cause MR device 104 to display instructions to position sawblade 122 in a recess of support body 1502 of tracking structure 1500.
[0156] Registration system 316 may determine, based on the first signals, first points corresponding to a first tracking marker 124 attached to a body of a saw 120 (2002). The first points may have coordinates defined in a physical coordinate system and may be associated with data that identify the points. For example, the first points may include points at the corners of tracking marker 124. Registration system 316 may generate the data that identify the corner points based on the patterns on the faces of tracking marker 124. For instance, based on a pattern A being visible on a first face of tracking marker 124 and a pattern B being visible on a second face of tracking marker 124, registration system 316 may determine that a first corner point is the top corner between the first and second faces, a second corner point is the bottom corner between the first and second faces, and so on.
[0157] Additionally, registration system 316 may determine, based on the first signals, second points corresponding to a second tracking marker 1504 of tracking structure 1500 while sawblade 122 of saw 120 is positioned in recess 1700 defined by support body 1502 of tracking structure 1500 (2004). The second points are also defined in the physical coordinate system. In some examples, registration system 316 may determine, based on the second points, a planar orientation of the tracking structure and may cause MR device 104 to display virtual elements (e.g., current plane elements 1802 and expected plane elements 1800 of FIG. 18) that indicate whether the planar orientation of the tracking structure is aligned with a cutting plane of sawblade 122.
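The alignment check between the tracking structure's planar orientation and the expected cutting plane can be expressed as an angle between plane normals. Below is a minimal sketch of one such test, offered for illustration only; the tolerance and the normal inputs are assumptions.

```python
import numpy as np

def planes_aligned(normal_a, normal_b, max_angle_deg=1.0):
    """True if two planes (given by their normals) differ by at most
    max_angle_deg. abs() treats flipped normals as the same plane."""
    na = normal_a / np.linalg.norm(normal_a)
    nb = normal_b / np.linalg.norm(normal_b)
    cos_angle = abs(float(np.dot(na, nb)))
    angle_deg = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    return angle_deg <= max_angle_deg

print(planes_aligned(np.array([0, 0, 1.0]), np.array([0.01, 0, 1.0])))  # True
```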
[0158] Registration system 316 may generate, based on the first points and the second points, position identification data that specify a position of a lower edge of sawblade 122 relative to first tracking marker 124 (2006). For example, the position identification data generated by registration system 316 may indicate that the lower edge of sawblade 122 is at a specific position in space relative to one or more points on tracking marker 124. The position
identification data may be defined in terms of one or more vectors in the physical coordinate system.
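One way to think about the position identification data of step (2006) is as an offset expressed in the saw marker's local frame, so that it remains valid as the saw moves. The sketch below is illustrative only; it assumes the marker pose has already been recovered as a rotation R and translation t, and all names are hypothetical.

```python
import numpy as np

def make_position_identification_data(R_marker, t_marker, lower_edge_world):
    """Express a point on the sawblade's lower edge in the local frame of
    tracking marker 124, yielding a pose-independent offset vector."""
    return R_marker.T @ (lower_edge_world - t_marker)

# Calibration-time example: marker at the origin, lower edge 15 cm ahead
offset = make_position_identification_data(np.eye(3), np.zeros(3),
                                            np.array([0.15, -0.02, 0.0]))
print(offset)  # stored for later guidance; see the next sketch
```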
[0159] In some examples, after generating the position identification data, virtual guidance system 320 may obtain second signals from sensors 118 of tracking system 116. Virtual guidance system 320 may determine, based on the second signals, third points corresponding to tracking marker 124. Virtual guidance system 320 may determine, based on the third points and the position identification data, guidance data that guide the user to position the lower edge of the sawblade at a location on a bone of a patient while cutting the bone. Virtual guidance system 320 may cause MR device 104 to display the virtual guidance. An example of displaying guidance data is provided below.
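At guidance time, the stored offset can then be pushed back through the marker's newly observed pose to locate the lower edge in physical coordinates. This continues the previous sketch and is likewise illustrative, with assumed names and poses.

```python
import numpy as np

def lower_edge_in_world(R_marker_now, t_marker_now, offset_local):
    """Recover the lower-edge point from a newly observed marker pose."""
    return R_marker_now @ offset_local + t_marker_now

# Round trip: store the offset at calibration time, replay it later
offset = np.array([0.15, -0.02, 0.0])          # from the calibration sketch above
R1, t1 = np.eye(3), np.array([0.0, 0.1, 0.0])  # saw has moved 10 cm
print(lower_edge_in_world(R1, t1, offset))      # [0.15, 0.08, 0.0]
```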
[0160] FIG. 21 is a conceptual diagram illustrating example virtual elements for guiding resection of a humeral head 802 according to techniques of this disclosure. In the example of FIG. 21, MR device 104 displays planned plane elements 2100A, 2100B, 2100C (collectively, “planned plane elements 2100”). In other examples, there may be more or fewer planned plane elements 2100. For instance, one or more of planned plane elements 2100 may be hidden from the perspective shown in FIG. 21. Each of planned plane elements 2100 may indicate a location on a planned cutting plane through a bone (e.g., humerus 112) of a patient. The planned cutting plane may be a 2-dimensional plane along which sawblade 122 is planned to move while cutting the bone according to a surgical plan.
[0161] Additionally, MR device 104 displays current plane elements 2102A, 2102B, 2102C (collectively, “current plane elements 2102”). In other examples, there may be more or fewer current plane elements 2102. For instance, one or more of current plane elements 2102 may be hidden from the perspective shown in FIG. 21. Each of current plane elements 2102 may indicate a location on a current cutting plane of sawblade 122 (not shown in FIG. 21). MR device 104 may determine the positions of planned plane elements 2100 based on the position of tracking marker 110. MR device 104 may determine the positions of current plane elements 2102 based on the position of tracking marker 110 and tracking marker 124 attached to the body of saw 120.
[0162] For each of current plane elements 2102, MR device 104 may change the color (or some other attribute) of the current plane element depending on whether the current plane element is above or below its corresponding planned plane element. For instance, in the example of FIG. 21, current plane element 2102A is below its corresponding planned plane element 2100A, so current plane element 2102A is a first color. Current plane element 2102B is above its corresponding planned plane element 2100B, so current plane element 2102B is a second
different color. Current plane elements 2102 may have a third color (e.g., the same color as planned plane elements 2100 or another color) if current plane elements 2102 are at the same position as their corresponding planned plane elements 2100. For instance, in the example of FIG. 21, planned plane element 2100C and current plane element 2102C are at the same position, which may make planned plane element 2100C indistinguishable from current plane element 2102C. Thus, a user may be able to determine, based on the colors of current plane elements 2102, whether a bottom of sawblade 122 is aligned with a planned cutting plane through the bone (e.g., a planned cutting plane to resect humeral head 802 from humerus 112).
[0163] In the example of FIG. 21, planned plane elements 2100 and current plane elements 2102 have crescent shapes. In other examples, planned plane elements 2100 and current plane elements 2102 have other shapes. For example, planned plane elements 2100 and current plane elements 2102 may have circular shapes, linear shapes, angled shapes, square shapes, and so on. Furthermore, in different examples, planned plane elements 2100 and current plane elements 2102 may have 2-dimensional or 3-dimensional shapes.
[0164] Planned plane elements 2100 are not contiguous with each other. Similarly, current plane elements 2102 are not contiguous with each other. In other words, each of planned plane elements 2100 and current plane elements 2102 is discrete. Using discrete plane elements to indicate where to make a bone cut may be advantageous over displaying a full plane of the bone cut. For instance, displaying a rectangular plane element oriented in 3 dimensions along the planned cut plane may obscure the user’s vision of the bone and may not be as helpful in indicating how to adjust the position of sawblade 122. Displaying two full planes (one for the planned cut plane and one indicating the current operating plane of the sawblade) may further hinder the user’s vision of the surgical site and may make it more difficult for the user to understand how to reorient the sawblade to align it with the planned cut plane.
[0165] As shown in the example of FIG. 21, MR device 104 may also display an entry line element 2104. Entry line element 2104 is a virtual element that indicates a line in the bone at which the user is to insert sawblade 122 to make the planned cut on the bone. In other words, virtual guidance system 320 may cause MR device 104 to display a virtual line element (e.g., entry line element 2104) such that the virtual line element appears to the user to be located on the bone at locations where the planned cutting plane intersects the bone.
[0166] Furthermore, in some examples, MR device 104 may display a virtual humeral model that visually overlaps (e.g., is superimposed on) humeral head 802. The virtual humeral model is not shown in the figures because it would overlap with humerus 112 itself. The virtual humeral model may be a virtual model of a portion of humerus 112 that includes humeral head
802. The virtual humeral model may be semitransparent, which may have the effect of darkening humeral head 802 and may make it easier for the user to see entry line element 2104 and/or other virtual elements. Additionally, the user may be able to determine, based on alignment of the virtual humeral model with humerus 112, that registration between humerus 112 and the virtual elements remains valid.
[0167] FIG. 22 is a flowchart illustrating an example operation for providing cut guidance according to techniques of this disclosure. In the example of FIG. 22, virtual guidance system 320 may receive first signals from one or more sensors of a tracking system (2200).
[0168] Virtual guidance system 320 may determine, based on the first signals, positions for a plurality of planned plane elements 2100 (2202). Additionally, virtual guidance system 320 may determine, based on the first signals, positions for a plurality of current plane elements 2102 (2204). Each of planned plane elements 2100 indicates a location on a planned cutting plane through a bone (e.g., humerus 112) of a patient. Planned plane elements 2100 are not contiguous with each other. In other words, planned plane elements 2100 are visually separate virtual elements and not part of the same visible virtual element, such as a single virtual element representing the planned cutting plane. Furthermore, each of current plane elements 2102 corresponds to one of planned plane elements 2100. For instance, in the example of FIG. 21, current plane element 2102A corresponds to planned plane element 2100A, current plane element 2102B corresponds to planned plane element 2100B, and so on. Each of current plane elements 2102 indicates a location on a current cutting plane of the sawblade. Current plane elements 2102 are not contiguous with each other.
[0169] Virtual guidance system 320 may cause MR device 104 to concurrently display the planned plane elements at the determined positions for the planned plane elements and the current plane elements at the determined positions for the current plane elements (2206). For one or more pairs of corresponding current plane elements and planned plane elements, at least one of the current plane element of the pair or the planned plane element of the pair has a visual property (e.g., color, texture, etc.) that is based on the position for the current plane element of the pair relative to the planned cutting plane. For instance, virtual guidance system 320 may determine the visual property based on whether the location indicated by the current plane element of the pair is above or below the planned cutting plane. In the example of FIG. 21, current plane element 2102A has a first color/texture because current plane element 2102A is below the planned cutting plane (as indicated by planned plane element 2100A) and current plane element 2102B has a second, different color/texture because current plane element 2102B is above the planned cutting plane (as indicated by planned plane element 2100B). Virtual
guidance system 320 may cause MR device 104 to update positions of the current plane elements based on changes to the current cutting plane of the sawblade.
[0170] The operation of FIG. 22 may be used in conjunction with other examples of this disclosure. For instance, the planned cutting plane may be defined in a virtual coordinate system and, prior to receiving the first signals, registration system 316 may receive second signals from sensors 118 of tracking system 116. Registration system 316 may determine, based on the second signals, first points corresponding to tracking marker 110 of humeral tracking structure 106. Humeral tracking structure 106 includes tracking marker 110 and attachment body 108 that is positioned at a bicipital groove of humerus 112 of a patient. Registration system 316 may receive third signals from sensors 118 of tracking system 116. Registration system 316 may determine, based on the third signals, second points corresponding to tracking marker 804 of digitizer 800 while a tip of digitizer 800 palpates the bicipital groove via a slot defined in attachment body 108 of humeral tracking structure 106. Registration system 316 may determine, based on the second points and a predetermined spatial relationship between tracking marker 804 and the tip of digitizer 800, third points corresponding to the bicipital groove. Positions of the first, second, and third points may be defined in a physical coordinate system. Registration system 316 may generate, based on the third points and a virtual model of humerus 112 having points defined in the virtual coordinate system, registration data 310 defining a relationship between the physical coordinate system and the virtual coordinate system. Virtual guidance system 320 may determine the positions for planned plane elements 2100 in the physical coordinate system based on registration data 310 and a position of tracking marker 110 in the physical coordinate system.
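This disclosure does not prescribe a particular algorithm for generating registration data 310. One common choice for fitting corresponding point sets, shown below purely for illustration, is a Kabsch/SVD rigid fit; the correspondence between the palpated bicipital-groove points and the virtual model points, and all names, are assumptions.

```python
import numpy as np

def rigid_fit(virtual_pts, physical_pts):
    """Return (R, t) such that R @ v + t approximates p for corresponding
    rows v of virtual_pts and p of physical_pts (both N x 3 arrays)."""
    cv, cp = virtual_pts.mean(axis=0), physical_pts.mean(axis=0)
    H = (virtual_pts - cv).T @ (physical_pts - cp)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cp - R @ cv
    return R, t

# Tiny self-check: recover a known 90-degree rotation about z plus a shift
v = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
R_true = np.array([[0.0, -1, 0], [1, 0, 0], [0, 0, 1]])
p = v @ R_true.T + np.array([0.2, 0.0, 0.1])
R_est, t_est = rigid_fit(v, p)
print(np.allclose(R_est, R_true), np.allclose(t_est, [0.2, 0.0, 0.1]))  # True True
```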
[0171] In some examples, prior to receiving the first signals, registration system 316 may receive second signals from sensors 118 of tracking system 116. Registration system 316 may determine, based on the second signals, first points corresponding to tracking marker 124 attached to a body of saw 120. Registration system 316 may determine, based on the second signals, second points corresponding to tracking marker 1504 of tracking structure 1500 while sawblade 122 of saw 120 is positioned in a recess 1700 defined by support body 1502 of tracking structure 1500. The first points and the second points are defined in a physical coordinate system. Registration system 316 may generate, based on the first points and the second points, position identification data that specify a position of a lower edge of sawblade 122 relative to tracking marker 124. Virtual guidance system 320 may determine the positions for current plane elements 2102 based on a position of tracking marker 124 in the second signals and based on the position identification data.
[0172] FIG. 23 is a conceptual diagram illustrating an example ring-shaped virtual element 2300 for guiding resection of a humeral head according to techniques of this disclosure. Virtual guidance system 320 may cause MR device 104 to display ring-shaped virtual element 2300 such that ring-shaped virtual element 2300 is centered on an axis passing through a center of curvature of humeral head 802 and normal to an anatomical neck of humerus 112. In other examples, ring-shaped virtual element 2300 may be centered on a different axis. For instance, ring-shaped virtual element 2300 may be centered on an axis passing through the center of curvature of humeral head 802 and normal to a planned cutting plane of humerus 112. Thus, as the elevation of sawblade 122 changes relative to humeral head 802, virtual guidance system 320 may cause MR device 104 to display ring-shaped virtual element 2300 at different positions centered along the axis. Ring-shaped virtual element 2300 may be partially transparent.
[0173] Ring-shaped virtual element 2300 is spatially aligned with a current cutting plane of sawblade 122. Thus, as the user changes the angle of sawblade 122 relative to humeral head 802, virtual guidance system 320 may cause MR device 104 to tilt ring-shaped virtual element 2300 in 3D space to match the current cutting plane of sawblade 122 while keeping a center of ring-shaped virtual element 2300 centered on the axis.
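For illustration only, the ring placement described in paragraphs [0172] and [0173] can be summarized as: slide the ring's center along the fixed axis to the blade's elevation, and tilt the ring's plane to the current cutting plane. The sketch below assumes the head center, axis, blade point, and cutting-plane normal as inputs; none of the names are from this disclosure.

```python
import numpy as np

def ring_pose(head_center, axis_dir, blade_point, cut_plane_normal):
    """Place the ring: its center slides along the fixed axis to match the
    blade's elevation, and its plane tilts to the current cutting plane."""
    a = axis_dir / np.linalg.norm(axis_dir)
    elevation = float(np.dot(blade_point - head_center, a))
    center = head_center + elevation * a
    normal = cut_plane_normal / np.linalg.norm(cut_plane_normal)
    return center, normal

center, normal = ring_pose(head_center=np.zeros(3),
                           axis_dir=np.array([0.0, 0.0, 1.0]),
                           blade_point=np.array([0.05, 0.0, 0.02]),
                           cut_plane_normal=np.array([0.1, 0.0, 1.0]))
print(center, normal)  # ring 2 cm up the axis, tilted with the blade
```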
[0174] Virtual guidance system 320 may cause MR device 104 to display different portions of ring-shaped virtual element 2300 with different visual properties to indicate whether those portions correspond to the planned cutting plane. For instance, a first portion of ring-shaped virtual element 2300 may have a first visual property (e.g., a first color) if the first portion is above the planned cutting plane, a second portion of ring-shaped virtual element 2300 may have a second visual property (e.g., a second color) if the second portion is below the planned cutting plane, and a third portion of ring-shaped virtual element 2300 may have a third visual property (e.g., a third color) if the third portion intersects the planned cutting plane.
[0175] Virtual guidance system 320 may also cause MR device 104 to display an entry line element 2304. Entry line element 2304 is a virtual element that indicates a line in the bone at which the user is to insert sawblade 122 to make the planned cut on the bone. Additionally, virtual guidance system 320 may cause MR device 104 to display a virtual element 2306 indicating a difference in height (ΔHeight), a difference in frontal angle (ΔFrontal), and a difference in sagittal angle (ΔSagittal) of the current cutting plane of sawblade 122 relative to the planned cutting plane.
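A plausible computation of the ΔHeight, ΔFrontal, and ΔSagittal readouts is sketched below, for illustration only. It assumes an anatomical frame with x = medial-lateral, y = anterior-posterior, z = superior-inferior, and measures height along the planned-plane normal; none of these conventions or names are specified by this disclosure.

```python
import numpy as np

def plane_deltas(planned_pt, planned_n, current_pt, current_n):
    """Height (mm) plus frontal/sagittal tilt differences (degrees) between
    the current and planned cutting planes."""
    pn = planned_n / np.linalg.norm(planned_n)
    cn = current_n / np.linalg.norm(current_n)
    d_height_mm = 1000.0 * float(np.dot(current_pt - planned_pt, pn))

    def tilt_deg(n, idx):
        # tilt of the plane normal away from vertical within one anatomical plane
        return np.degrees(np.arctan2(n[idx], n[2]))

    d_frontal = tilt_deg(cn, 0) - tilt_deg(pn, 0)   # in the x-z (frontal) plane
    d_sagittal = tilt_deg(cn, 1) - tilt_deg(pn, 1)  # in the y-z (sagittal) plane
    return d_height_mm, d_frontal, d_sagittal

# Current plane 3 mm high and tilted ~2.9 degrees in the frontal plane
print(plane_deltas(np.zeros(3), np.array([0, 0, 1.0]),
                   np.array([0, 0, 0.003]), np.array([0.05, 0, 1.0])))
```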
[0176] Furthermore, MR device 104 may display a virtual humeral model that visually overlaps (e.g., is superimposed on) humeral head 802. The virtual humeral model may be a
virtual model of a portion of humerus 112 that includes humeral head 802. The virtual humeral model may be semitransparent, which may have the effect of darkening humeral head 802 and may make it easier for the user to see entry line element 2304 and/or other virtual elements. Additionally, the user may be able to determine, based on alignment of the virtual humeral model with humerus 112, that registration between humerus 112 and the virtual elements remains valid. Furthermore, in some examples, a line corresponding to the current cutting plane may be shown on the virtual humeral model.
[0177] FIG. 24 is a conceptual diagram illustrating example ring-shaped virtual elements for guiding resection of a humeral head according to techniques of this disclosure. In the example of FIG. 24, virtual guidance system 320 may cause MR device 104 to display ring-shaped virtual element 2400. Ring-shaped virtual element 2400 may operate in the same way as ring-shaped virtual element 2300 as described above with respect to FIG. 23. However, virtual guidance system 320 may also cause MR device 104 to display supplemental elements 2402. Supplemental elements 2402 may each have a circular shape, e.g., as shown in FIG. 24, or may have one or more other shapes. Each of supplemental elements 2402 is a version of ring-shaped virtual element 2400 but rotated orthogonal to a plane of ring-shaped virtual element 2400. Visual properties (e.g., colors, textures, etc.) of supplemental elements 2402 may change as the angle and position of sawblade 122 changes. Supplemental elements 2402 may help the user determine the angle and position of sawblade 122 even when the user cannot see all of ring-shaped virtual element 2400.
[0178] Furthermore, MR device 104 may display a virtual humeral model that visually overlaps (e.g., is superimposed on) humeral head 802. The virtual humeral model may be a virtual model of a portion of humerus 112 that includes humeral head 802. The virtual humeral model may be semitransparent, which may have the effect of darkening humeral head 802 and may make it easier for the user to see entry line element 2404 and/or other virtual elements. Additionally, the user may be able to determine, based on alignment of the virtual humeral model with humerus 112, that registration between humerus 112 and the virtual elements remains valid. Furthermore, in some examples, a line corresponding to the current cutting plane may be shown on the virtual humeral model.
[0179] FIG. 25 is a conceptual diagram illustrating example planned plane virtual elements 2500A, 2500B and current plane virtual elements 2502A, 2502B for guiding resection of a humeral head according to techniques of this disclosure. This disclosure may refer to planned plane virtual element 2500A and planned plane virtual element 2500B collectively as “planned plane virtual elements 2500” and may refer to current plane virtual element 2502A and current
plane virtual element 2502B collectively as “current plane virtual elements 2502.” Each of planned plane virtual elements 2500 and current plane virtual elements 2502 includes a circle element and a line element that passes through the center of the circle. The line elements of planned plane virtual elements 2500 correspond to a planned cutting plane, and the line elements of current plane virtual elements 2502 correspond to a current cutting plane. Planned plane virtual elements 2500 may be separated from each other on the planned cutting plane by a 90° angle, or another angle. Likewise, current plane virtual elements 2502 may be separated from each other on the current cutting plane by a 90° angle, or another angle. Thus, the user may be able to determine that sawblade 122 is aligned with the planned cutting plane when the line elements of current plane virtual elements 2502 match the line elements of planned plane virtual elements 2500. MR device 104 may position planned plane virtual elements 2500 and current plane virtual elements 2502 so that the planned plane virtual elements 2500 and current plane virtual elements 2502 appear to the user as if there were invisible mirrors behind and next to the humeral head, which reflect images of the planned and current cutting planes.
[0180] Virtual guidance system 320 may also cause MR device 104 to display an entry line element 2504. Entry line element 2504 is a virtual element that indicates a line in the bone at which the user is to insert sawblade 122 to make the planned cut on the bone. Additionally, virtual guidance system 320 may cause MR device 104 to display a virtual element 2506 indicating a difference in height (ΔHeight), a difference in frontal angle (ΔFrontal), and a difference in sagittal angle (ΔSagittal) of the current cutting plane of sawblade 122 relative to the planned cutting plane.
[0181] Furthermore, MR device 104 may display a virtual humeral model that visually overlaps (e.g., is superimposed on) the humeral head. The virtual humeral model may be a virtual model of a portion of the humerus that includes the humeral head. The virtual humeral model may be semitransparent, which may have the effect of darkening the humeral head and may make it easier for the user to see entry line element 2504 and/or other virtual elements. Additionally, the user may be able to determine, based on alignment of the virtual humeral model with the humerus, that registration between the humerus and the virtual elements remains valid. Furthermore, in some examples, a line 2508 corresponding to the current cutting plane may be shown on the virtual humeral model.
[0182] FIG. 26 is a conceptual diagram illustrating example virtual elements for guiding resection of a humeral head according to techniques of this disclosure. In the example of FIG. 26, virtual guidance system 320 may cause MR device 104 to display a planned virtual element 2600 and a current virtual element 2602. Planned virtual element 2600 includes a circle 2604
corresponding to the planned cutting plane through the humerus. Planned virtual element 2600 also includes a circle 2606 in a plane orthogonal to the planned cutting plane and orthogonal to a planned insertion axis of sawblade 122. Similarly, current virtual element 2602 includes a circle 2608 corresponding to a current cutting plane. Current virtual element 2602 also includes a circle 2610 that is orthogonal to the current cutting plane and orthogonal to a current axis of sawblade 122.
[0183] Virtual guidance system 320 may cause MR device 104 to update one or more visual properties of planned virtual element 2600 and current virtual element 2602 based on the alignment of the planned cutting plane and the current cutting plane. For instance, MR device 104 may update the visual properties of circles 2604, 2606, 2608, 2610 based on the alignment of the planned cutting plane and the current cutting plane such that the visual properties of circles 2604, 2606, 2608, 2610 match when the planned cutting plane and the current cutting plane are aligned. The visual properties of circles 2604, 2606, 2608, and 2610 may change to indicate a direction or angle by which the planned cutting plane and the current cutting plane are not aligned. For instance, a first color (e.g., red) in a first portion of circles 2604, 2606, 2608, and 2610 may indicate that a corresponding portion of the current cutting plane is below the planned cutting plane while a second color (e.g., green) in a second portion of circles 2604, 2606, 2608, and 2610 may indicate that a corresponding portion of the current cutting plane is above the planned cutting plane.
[0184] Additionally, virtual guidance system 320 may cause MR device 104 to display a transverse virtual element 2612. Transverse virtual element 2612 may be orthogonal to the current cutting plane and have a diameter aligned with a lengthwise axis of sawblade 122. Transverse virtual element 2612 may help the user visualize the angle of sawblade 122.
[0185] Furthermore, MR device 104 may display a virtual humeral model that visually overlaps (e.g., is superimposed on) the humeral head. The virtual humeral model may be a virtual model of a portion of the humerus that includes the humeral head. The virtual humeral model may be semitransparent, which may have the effect of darkening the humeral head and may make it easier for the user to see entry line element 2616 and/or other virtual elements. Additionally, the user may be able to determine, based on alignment of the virtual humeral model with the humerus, that registration between the humerus and the virtual elements remains valid. Furthermore, in some examples, a line corresponding to the current cutting plane may be shown on the virtual humeral model.
[0186] FIG. 27 is a conceptual diagram illustrating first example elevation and angular virtual elements for guiding resection of a humeral head according to techniques of this disclosure. In
the example of FIG. 27, virtual guidance system 320 causes MR device 104 to display a frontal angle element 2700, a sagittal angle element 2702, and an elevation element 2704. Frontal angle element 2700 includes a marker 2706 that indicates a frontal angle of a planned cutting plane and a marker 2708 that indicates a frontal angle of a current cutting plane of sawblade 122. Similarly, sagittal angle element 2702 includes a marker 2710 that indicates a sagittal angle of the planned cutting plane and a marker 2712 that indicates a sagittal angle of the current cutting plane of sawblade 122. Elevation element 2704 includes a marker 2714 and a marker 2716 that indicate the elevation of the current cutting plane relative to the elevation of the planned cutting plane. Thus, by aligning marker 2706 with marker 2708, aligning marker 2710 with marker 2712, and aligning marker 2714 with marker 2716, the user may align the current cutting plane with the planned cutting plane.
[0187] Virtual guidance system 320 may also cause MR device 104 to display an entry line element 2718. Entry line element 2718 is a virtual element that indicates a line in the bone at which the user is to insert sawblade 122 to make the planned cut on the bone. Furthermore, MR device 104 may display a virtual humeral model that visually overlaps (e.g., is superimposed on) the humeral head. The virtual humeral model may be a virtual model of a portion of the humerus that includes the humeral head. The virtual humeral model may be semitransparent, which may have the effect of darkening the humeral head and may make it easier for the user to see entry line element 2718 and/or other virtual elements. Additionally, the user may be able to determine, based on alignment of the virtual humeral model with the humerus, that registration between the humerus and the virtual elements remains valid. In the example of FIG. 27, the virtual humeral model may be divided into two parts, so that a line 2720 is defined along the current cutting plane.
[0188] FIG. 28 is a conceptual diagram illustrating second example elevation and angular virtual elements for guiding resection of a humeral head according to techniques of this disclosure. In the example of FIG. 28, frontal angle element 2800, sagittal angle element 2802, and elevation element 2804 have the same functionality as frontal angle element 2700, sagittal angle element 2702, and elevation element 2704 of FIG. 27.
[0189] Virtual guidance system 320 may also cause MR device 104 to display an entry line element 2806. Entry line element 2806 is a virtual element that indicates a line in the bone at which the user is to insert sawblade 122 to make the planned cut on the bone. Furthermore, MR device 104 may display a virtual humeral model that visually overlaps (e.g., is superimposed on) the humeral head. The virtual humeral model may be a virtual model of a portion of the humerus that includes the humeral head. The virtual humeral model may be
semitransparent, which may have the effect of darkening the humeral head and may make it easier for the user to see entry line element 2806 and/or other virtual elements. Additionally, the user may be able to determine, based on alignment of the virtual humeral model with the humerus, that registration between the humerus and the virtual elements remains valid. Furthermore, in some examples, a line corresponding to the current cutting plane may be shown on the virtual humeral model.
[0190] FIG. 29 is a conceptual diagram illustrating example elevation and angle virtual elements for guiding resection of a humeral head according to techniques of this disclosure. In the example of FIG. 29, virtual guidance system 320 causes MR device 104 to display a frontal angle element 2900, a sagittal angle element 2902, and an elevation element 2904. Each of frontal angle element 2900, sagittal angle element 2902, and elevation element 2904 is divided into two halves. For each of frontal angle element 2900, sagittal angle element 2902, and elevation element 2904, both halves may have the same visual property (e.g., same color or texture) if the current cutting plane is aligned with the planned cutting plane. Otherwise, the visual properties of the different halves of frontal angle element 2900, sagittal angle element 2902, and elevation element 2904 are different in a way that indicates how to adjust the current cutting plane. For instance, an upper half of elevation element 2904 may be highlighted to indicate that the user should increase the elevation of the current cutting plane. In another example, a front or back half of sagittal angle element 2902 may be highlighted to indicate that the user should tilt sawblade 122 more upward or downward in the sagittal plane. In another example, a left or right half of frontal angle element 2900 may be highlighted to indicate that the user should tilt sawblade 122 more leftward or rightward in the frontal plane.
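To make the half-highlighting behavior of paragraph [0190] concrete, here is one assumed mapping from signed alignment errors to the half of each element to highlight. The sign conventions, thresholds, and labels are illustrative only and are not taken from this disclosure.

```python
def halves_to_highlight(d_height_mm, d_frontal_deg, d_sagittal_deg,
                        tol_mm=0.5, tol_deg=1.0):
    """Return which half of each indicator to highlight; an empty dict means
    both halves of every element share the same visual property (aligned)."""
    hints = {}
    if abs(d_height_mm) > tol_mm:
        # e.g., highlight the upper half when the cut plane should move up
        hints["elevation"] = "upper" if d_height_mm < 0 else "lower"
    if abs(d_frontal_deg) > tol_deg:
        hints["frontal"] = "left" if d_frontal_deg > 0 else "right"
    if abs(d_sagittal_deg) > tol_deg:
        hints["sagittal"] = "front" if d_sagittal_deg > 0 else "back"
    return hints

print(halves_to_highlight(-2.0, 0.0, 3.5))  # {'elevation': 'upper', 'sagittal': 'front'}
```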
[0191] Virtual guidance system 320 may also cause MR device 104 to display an entry line element 2906. Entry line element 2906 is a virtual element that indicates a line in the bone at which the user is to insert sawblade 122 to make the planned cut on the bone. Furthermore, MR device 104 may display a virtual humeral model that visually overlaps (e.g., is superimposed on) the humeral head. The virtual humeral model may be a virtual model of a portion of the humerus that includes the humeral head. The virtual humeral model may be semitransparent, which may have the effect of darkening the humeral head and may make it easier for the user to see entry line element 2906 and/or other virtual elements. Additionally, the user may be able to determine, based on alignment of the virtual humeral model with the humerus, that registration between the humerus and the virtual elements remains valid. Furthermore, in some examples, a line 2910 corresponding to the current cutting plane may be shown on the virtual humeral model.
[0192] FIG. 30 is a conceptual diagram illustrating an example elevation element 3000 and bubble-level element 3002 for guiding resection of a humeral head according to techniques of this disclosure. Elevation element 3000 includes a marker 3004 and a marker 3006. Virtual guidance system 320 may cause MR device 104 to update the position of marker 3006 relative to marker 3004 based on a difference in elevation between the planned cutting plane and the current cutting plane. Thus, when marker 3004 and marker 3006 overlap, the planned cutting plane and the current cutting plane have the same elevation.
[0193] Furthermore, bubble-level element 3002 includes a crosshair pattern and a marker element 3008. MR device 104 may update the position of marker element 3008 based on angles (e.g., frontal and sagittal angles) of the planned cutting plane relative to the current cutting plane. Thus, when the planned cutting plane is aligned with the current cutting plane, marker element 3008 is centered on bubble-level element 3002. For instance, marker element 3008 may be centered on the center of the crosshair pattern of bubble-level element 3002.
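A bubble-level display of this kind can be driven by mapping the two angle differences onto a 2D offset of marker element 3008 from the crosshair center. The sketch below is illustrative only; the gain and clamping radius are assumptions, not values from this disclosure.

```python
import numpy as np

def bubble_offset(d_frontal_deg, d_sagittal_deg, gain_m_per_deg=0.002, radius_m=0.02):
    """Offset of marker element 3008 from the crosshair center; zero offset
    means the current and planned cutting planes are angularly aligned."""
    off = gain_m_per_deg * np.array([d_frontal_deg, d_sagittal_deg])
    norm = float(np.linalg.norm(off))
    if norm > radius_m:                      # keep the bubble on the level face
        off *= radius_m / norm
    return off

print(bubble_offset(0.0, 0.0))   # [0. 0.] -> marker centered, planes aligned
print(bubble_offset(5.0, -2.0))  # off-center toward the needed correction
```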
[0194] MR device 104 may display a virtual humeral model that visually overlaps (e.g., is superimposed on) the humeral head. The virtual humeral model may be a virtual model of a portion of the humerus that includes the humeral head. The virtual humeral model may be semitransparent, which may have the effect of darkening the humeral head and may make it easier for the user to see an entry line element and/or other virtual elements. Additionally, the user may be able to determine, based on alignment of the virtual humeral model with the humerus, that registration between the humerus and the virtual elements remains valid. Furthermore, in some examples, a line 3012 corresponding to the current cutting plane may be shown on the virtual humeral model.
[0195] In some examples, a combination of features and virtual elements from two or more of the examples of FIG. 21 and FIGS. 23-30 may be used. For instance, in one example, MR device 104 may display ring-shaped virtual element 2300 of FIG. 23 along with planned plane virtual elements 2500 and current plane virtual elements 2502. In another example, MR device 104 may display frontal angle element 2700, sagittal angle element 2702, and/or elevation element 2704 of FIG. 27 along with bubble-level element 3002 of FIG. 30. In some examples, virtual guidance system 320 may receive data indicating user preferences on which virtual elements or combinations of virtual elements to display for guiding humeral head resection. Furthermore, the operation of FIG. 22 may apply, with appropriate modifications, with respect to FIGS. 23-30. The features and virtual elements of FIG. 21 and FIGS. 23-30 may be designed to increase the user’s ability to clearly see both the humerus and the sawblade, as well as to intuitively understand how the current cutting plane relates to the planned cutting plane.
Furthermore, the examples of FIGS. 23-30 show a tracking marker connected to a clamping structure. In other versions of the examples of FIGS. 23-30, tracking markers may be connected to a humeral tracking structure positioned at a bicipital groove of the humerus as described elsewhere in this disclosure.
[0196] FIG. 31 is a conceptual diagram illustrating an example of confirming accuracy of a humeral resection according to techniques of this disclosure. In the example of FIG. 31, a resected surface 3100 of humerus 112 results from resecting a humeral head 3102 of humerus 112. A tracking structure 3106 includes a tracking marker 3108 and a support body 3110. Support body 3110 has a lower surface 3112 opposite an upper surface 3114 that is connected to tracking marker 3108. Lower surface 3112 is flat. After the humeral head has been resected, a user (e.g., a surgeon) may hold the flat lower surface 3112 of support body 3110 against resected surface 3100. In some examples, tracking structure 3106 is the same instrument as tracking structure 1500 of FIGS. 15, 16, 17A, 17B, 18, and 19A. In examples in which tracking structure 3106 is the same as tracking structure 1500, tracking structure 3106 may be configured to identify a position and/or thickness of sawblade 122 according to the techniques described with respect to FIG. 15. In other examples, tracking structure 3106 is a separate instrument from tracking structure 1500. In examples in which tracking structure 3106 is a separate instrument from tracking structure 1500, tracking structure 3106 may be similar to tracking structure 1500 in shape and size but may not include recesses, such as slots (e.g., slots 1946 and 1948).
[0197] Processing system 102 may determine a position of tracking marker 3108 within a physical coordinate system. Humerus 112 is registered to the physical coordinate system based on tracking marker 110. Processing system 102 may therefore be able to determine the location in the physical coordinate system of lower surface 3112 of support body 3110, and hence the location of the resected surface 3100 of humerus 112. Furthermore, because the physical coordinate system is registered with a virtual coordinate system in which a planned cutting plane is defined, processing system 102 may determine whether the resected surface 3100 of humerus 112 is positioned and oriented along the planned cutting plane. In the example of FIG. 31, MR device 104 may output a virtual element 3116 that indicates a resection level, version angle, and inclination angle of resected surface 3100 determined based on tracking structure 3106. The resection level indicates a height of the cut. Additionally, MR device 104 may output a virtual element indicating a perimeter of the planned cutting plane. MR device 104 may additionally output a virtual overlay element (not depicted) on resected surface 3100 to indicate which areas of resected surface 3100 deviate (e.g., deviate above or deviate below) from the
planned cutting plane. In some examples, different visual properties (e.g., colors or textures) may correspond to different deviations of resected surface 3100 from the planned cutting plane. As an example, one color (e.g., green) of the virtual overlay element may correspond to deviations below the planned cutting plane, and another color (e.g., blue) may correspond to deviations above the planned cutting plane.
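The verification step of paragraph [0197] amounts to comparing a measured plane (from the flat lower surface 3112) against the planned cutting plane. A minimal sketch follows, with simplified deviation metrics that stand in for the resection level, version, and inclination readouts; the metric definitions and names are assumptions made for illustration.

```python
import numpy as np

def resection_report(measured_pt, measured_n, planned_pt, planned_n):
    """Summarize how the resected surface deviates from the planned plane."""
    mn = measured_n / np.linalg.norm(measured_n)
    pn = planned_n / np.linalg.norm(planned_n)
    level_dev_mm = 1000.0 * float(np.dot(measured_pt - planned_pt, pn))
    cos_angle = abs(float(np.dot(mn, pn)))
    angle_dev_deg = float(np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))))
    return {"resection_level_mm": level_dev_mm, "angle_deviation_deg": angle_dev_deg}

# Surface 1 mm high and tilted ~1.1 degrees relative to the plan
print(resection_report(np.array([0, 0, 0.001]), np.array([0.02, 0, 1.0]),
                       np.zeros(3), np.array([0.0, 0, 1.0])))
```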
[0198] Tracking structure 3106 may not be specific to any patient but may be generic for all patients. Alternatively, tracking structure 3106 may be provided in a limited range of two or more sizes, such as a child size, a small adult size, and a large adult size.
[0199] FIGS. 32A-32E are conceptual diagrams illustrating example guidance displayed by a virtual guidance element 3200 for positioning sawblade 122 at a correct position and orientation, in accordance with techniques of this disclosure. MR device 104 may display virtual guidance element 3200 in the vicinity of the surgical site, such that the user may easily see both virtual guidance element 3200 and the surgical site.
[0200] In some examples, virtual guidance system 320 may determine where to position virtual guidance element 3200 relative to humerus 112 prior to displaying virtual guidance element 3200. In some examples, virtual guidance system 320 determines the position of MR device 104 relative to humerus 112 during a collar-fixing procedure of the registration process. During the collar-fixing procedure of the registration process, an optical marker, e.g., tracking marker 124, may be affixed to saw 120, and the user may position sawblade 122 next to humerus 112 such that sensors, e.g., optical sensors 430, of MR device 104 detect tracking marker 124 and tracking marker 110. During the collar-fixing procedure, the user’s body is expected to be positioned in a substantially similar way to the way the user’s body will be positioned during the cutting procedure. Thus, virtual guidance system 320 may determine where to position virtual guidance element 3200 such that virtual guidance element 3200 is visible but disposed away from saw 120, humerus 112, and any tracking markers, e.g., tracking marker 110, tracking marker 124, and tracking marker 3518. In some examples, virtual guidance element 3200 is not directly superimposed on the surgical site but may appear to be at a location adjacent to the surgical site. In some examples, virtual guidance element 3200 is locked to a position relative to the surgical site. Virtual guidance element 3200 may provide an easy-to-understand way for the user to align the current cutting plane with the planned cutting plane.
[0201] Virtual guidance element 3200 includes a divided ring element 3202. Divided ring element 3202 includes an enclosed area (e.g., circle, rectangle, ovoid, etc.) bisected by a line. The line corresponds to a planned cutting plane. MR device 104 may display virtual guidance element 3200 such that the line bisecting divided ring element 3202 appears to a user to be
aligned with a planned cutting plane. In some examples, the line bisecting divided ring element 3202 may extend beyond divided ring element 3202. In some examples, the portions of the line extending beyond divided ring element 3202 may be curved or “winged.” Virtual guidance element 3200 may also include an active element 3204 that provides information about the position and orientation of the current cutting plane of sawblade 122. An inner edge of active element 3204 (e.g., the flat side of the semicircular active element 3204 shown in FIGS. 32A-32D) corresponds to a superior/inferior angle of the current cutting plane of sawblade 122 relative to the planned cutting plane. The inner edge of active element 3204 is an edge of active element 3204 closer to the line through divided ring element 3202, and an outer edge of active element 3204 is an edge of active element 3204 further from the line through divided ring element 3202. Thus, if the inner edge of active element 3204 is angled relative to the line of divided ring element 3202, the current cutting plane of sawblade 122 is angled superiorly or inferiorly relative to the planned cutting plane. If the inner edge of active element 3204 is above or below the line of divided ring element 3202, the resection level of the current cutting plane into the bone is inferior or superior to the resection level of the planned cutting plane into the bone.
[0202] The outer edge of active element 3204 (e.g., the rounded side of semicircular active element 3204 shown in FIGS. 32A-32D) corresponds to an anterior/posterior angle of the current cutting plane of sawblade 122 relative to the planned cutting plane. Greater distances between the inner edge of active element 3204 and a center point of the outer edge of active element 3204 correspond to greater anterior/posterior angles of the current cutting plane of sawblade 122 relative to the planned cutting plane.
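Read together, paragraphs [0201] and [0202] describe a three-parameter display. The sketch below makes that mapping explicit, for illustration only: the inner edge's offset tracks the resection-level difference, its tilt tracks the superior/inferior angle, and the outer edge's bulge tracks the anterior/posterior angle. The gains, the visibility test, and all names are assumptions.

```python
def active_element_params(d_level_mm, d_si_deg, d_ap_deg,
                          offset_gain=0.002, bulge_gain=0.003, eps=1e-6):
    """Drawing parameters for active element 3204 from alignment errors."""
    aligned = (abs(d_level_mm) < eps and abs(d_si_deg) < eps and abs(d_ap_deg) < eps)
    return {
        "inner_edge_offset_m": offset_gain * d_level_mm,  # above/below the line
        "inner_edge_tilt_deg": d_si_deg,                  # angle vs. the line
        "outer_edge_bulge_m": bulge_gain * abs(d_ap_deg), # inner-to-outer distance
        "visible": not aligned,                           # hidden when fully aligned
    }

print(active_element_params(-3.0, 0.0, 15.0))  # e.g., the FIG. 32A situation
print(active_element_params(0.0, 0.0, 0.0))    # aligned: element not shown
```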
[0203] In other examples, divided ring element 3202 and/or active element 3204 may have shapes other than circles and semicircles, such as full or partial ovals, ellipsoids, squares, rectangles, rhombuses, and so on.
[0204] Virtual guidance element 3200 may be surrounded by a semi-transparent colored field, e.g., a black field. In some examples, the colored field may enhance visibility of virtual guidance element 3200.
[0205] Thus, in FIG. 32A, the resection level of the current cutting plane is inferior to the resection level of the planned cutting plane (e.g., by 3 mm), the superior/inferior angle of the current cutting plane relative to the planned cutting plane is correct (e.g., 0°), and the current cutting plane is angled anteriorly relative to the planned cutting plane (e.g., by 15°). If the current cutting plane were angled posteriorly relative to the planned cutting plane, the outer edge of active element 3204 would appear above the inner edge of active element 3204 instead of below the inner edge of active element 3204 as shown in FIGS. 32A-32D. In FIG. 32B, the
resection level of the current cutting plane is inferior to the resection level of the planned cutting plane (e.g., by 3 mm), the current cutting plane is angled superiorly relative to the planned cutting plane (e.g., by 10°), and the current cutting plane is angled anteriorly relative to the planned cutting plane (e.g., by 15°). In FIG. 32C, the resection level of the current cutting plane is inferior to the resection level of the planned cutting plane (e.g., by 3 mm), the current cutting plane is angled superiorly relative to the planned cutting plane (e.g., by 10°), and the current cutting plane is angled anteriorly relative to the planned cutting plane by a smaller amount than in FIG. 32B (e.g., by 10°). In FIG. 32D, the resection level of the current cutting plane is aligned with the resection level of the planned cutting plane (e.g., 0 mm difference), the current cutting plane is aligned superiorly/inferiorly with the planned cutting plane (e.g., 0°), and the current cutting plane is angled anteriorly relative to the planned cutting plane (e.g., by 4°). In FIG. 32E, active element 3204 is not visible because the current cutting plane is correctly aligned with the planned cutting plane.
[0206] Thus, processing system 102 may determine a current cutting plane of sawblade 122 of saw 120. Processing system 102 may output, for display by MR device 104, a virtual guidance element 3200 that includes a divided ring element 3202 that includes an enclosed area bisected by a line. Virtual guidance element 3200 also includes an active element 3204 having an inner edge and an outer edge. A distance between a center of the line and a center of the inner edge of active element 3204 is indicative of a distance between a resection level of a current cutting plane of sawblade 122 of saw 120 into a bone and a resection level of a planned cutting plane through bone 112. An angle of the inner edge of active element 3204 relative to the line is indicative of a superior/inferior angle between the current cutting plane of sawblade 122 and the planned cutting plane. A length of a line 3206 perpendicular to the inner edge of active element 3204, from the center of the inner edge of active element 3204 to the outer edge of active element 3204, is indicative of an anterior/posterior angle between the current cutting plane of the sawblade and the planned cutting plane. Line 3206 may be conceptual and may or may not be visible. In some examples, processing system 102 causes MR device 104 to update active element 3204 based on a change to the current cutting plane of sawblade 122. Processing system 102 may output, for display by MR device 104, virtual guidance element 3200 so that the line of divided ring element 3202 is aligned with the planned cutting plane. Processing system 102 may output, for display by MR device 104, virtual guidance element 3200 so that virtual guidance element 3200 and bone 112 are simultaneously visible to a user of MR device 104.
[0207] In some examples, bone 112 is a humerus, and the planned cutting plane is defined in a virtual coordinate system. Processing system 102 may receive first signals from the one or more sensors 118 of tracking system 116. Processing system 102 may determine, based on the first signals, first points corresponding to a first tracking marker 110 of a tracking structure 106 that comprises first tracking marker 110 and an attachment body 108 positioned at a bicipital groove of the humerus. Processing system 102 may receive second signals from sensors 118 of tracking system 116. Processing system 102 may determine, based on the second signals, second points corresponding to a second tracking marker of a digitizer 800 while a tip of digitizer 800 palpates the bicipital groove via a slot defined in attachment body 108 of tracking structure 106. Processing system 102 may determine, based on the second points and a predetermined spatial relationship between the second tracking marker and the tip of digitizer 800, third points corresponding to the bicipital groove. Positions of the first, second, and third points are defined in a physical coordinate system. Processing system 102 may generate, based on the third points and a virtual model of the humerus having points defined in the virtual coordinate system, registration data defining a relationship between the physical coordinate system and the virtual coordinate system. Additionally, processing system 102 may receive third signals from one or more sensors 118 of tracking system 116. Processing system 102 may determine, based on the third signals, fourth points corresponding to a third tracking marker 124 attached to a body of saw 120. Processing system 102 may determine the current cutting plane based on the fourth points. For instance, processing system 102 may use a determined spatial relationship between the fourth points and the bottom of sawblade 122 to determine the current cutting plane in the physical coordinate system.
[0208] FIG. 33 is a conceptual diagram illustrating an example MR visualization 3300 that includes a virtual guidance element 3302, in accordance with techniques of this disclosure. MR device 104 displays virtual guidance element 3302 so that a user can concurrently see both the virtual guidance element 3302 and a corresponding bone 3304 (or bone model). Virtual guidance element 3302 is not superimposed on bone 3304. Virtual guidance element 3302 includes a bone model and a planned cut plane.
[0209] Virtual guidance element 3302 may or may not be registered with bone 3304. In examples where virtual guidance element 3302 is registered with bone 3304, virtual guidance element 3302 may be rotated to correspond with the position of bone 3304. Virtual guidance element 3302 may be adjacent to bone 3304. In some examples, the user may confirm the registration process was successful based on MR visualization 3300. In some examples, the user confirms the registration process was successful based on determining that the registration
process resulted in reasonable results. By providing MR visualization 3300 to allow the user to confirm the registration process was successful, the techniques of this disclosure may prevent errors in the surgery, e.g., the arthroplasty, which may improve patient outcomes.
[0210] In some examples, if the user determines the registration process was unsuccessful, the user may provide user input indicating that the registration process was unsuccessful. Based on the user input, processing system 102 may determine to restart the registration process or otherwise modify the registration process. In some examples, the registration process may be unsuccessful due to an incorrect placement of a tracking marker, such as tracking marker 110. Processing system 102 may control user interface 422 to generate a display including instructions to check the placement of tracking marker 110 responsive to receiving user input indicative of the registration process being unsuccessful.
[0211] FIG. 34 is a conceptual diagram illustrating an example MR visualization 3400 that includes an outline of a bone according to techniques of this disclosure. In the example of FIG. 34, MR device 104 displays a virtual outline element 3402. Virtual outline element 3402 provides an outline around a visible portion of a bone. MR device 104 may display virtual outline element 3402 during the registration process. As processing system 102 refines the relationship between the virtual model of the bone and the physical bone as more points are taken on the physical bone, the position of virtual outline element 3402 may be refined to better match the physical bone. Virtual outline element 3402 may help the user confirm that the registration process has accurately determined the position of the bone.
[0212] FIG. 35A and FIG. 35B are schematic representations of a cutting guide 3502, according to the techniques of this disclosure. In some examples, cutting guide 3502 is configured to facilitate humeral resection along, for example, an entry line element, e.g., entry line element 2104 of FIG. 21. FIG. 35A is a side view of cutting guide 3502, and FIG. 35B is a top view of cutting guide 3502. Cutting guide 3502 includes a handle 3504. Handle 3504 may allow the user or another person to maintain a placement of cutting guide 3502 relative to a bone, e.g., humerus 112, of a patient. Cutting guide 3502 includes a body 3516 defining a slot 3506. Slot 3506 may be configured to receive a sawblade, e.g., sawblade 122 of FIG. 1, to be inserted to resect humerus 112.
[0213] The user may position a tip of sawblade 122 within slot 3506. The user may align sawblade 122 with entry line element 2104. In some examples, body 3516 includes a bone contacting surface 3508. Bone contacting surface 3508 may be curved to correspond to a shape of a bone, e.g., humerus 112. In some examples, cutting guide 3502 includes a plurality of bump-like and/or tooth-like projections 3510 on bone contacting surface 3508 of body 3516.
Projections 3510 may help prevent slipping or other movement of cutting guide 3502 during humeral resection. Without cutting guide 3502 and projections 3510, motion of sawblade 122 during operation may be more likely to cause sawblade 122 to inadvertently veer away from the planned cut designated by entry line element 2104. Cutting guide 3502 can increase stability of sawblade 122 and prevent unwanted movement of sawblade 122.
[0214] In some examples, handle 3504 and body 3516 define an angle 3512. Angle 3512 may be approximately 90 degrees, e.g., ±10 degrees, as depicted herein. However, different angles are also possible, as described with respect to FIG. 35D.
[0215] FIG. 35C and FIG. 35D are schematic representations of a first alternative version of cutting guide 3502, according to techniques of this disclosure. FIG. 35C may be substantially similar to FIG. 35A, and FIG. 35D may be substantially similar to FIG. 35B. However, in the example of FIG. 35D, body 3516 and handle 3504 of cutting guide 3502 define an angle 3514 different from angle 3512. In the example of FIG. 35D, angle 3514 is approximately 180 degrees. In some examples, the position of handle 3504 relative to body 3516 can allow the user to perform the resection without cutting guide 3502 obstructing the user’s view or otherwise impeding the resection. In some examples, angle 3514 may be between approximately 90 degrees and 180 degrees to prevent obstructing the user’s view or otherwise impeding the resection. In some examples, when angle 3514 is between approximately 90 degrees and 180 degrees, the user may be able to hold cutting guide 3502 and saw 120 at the same time relatively easily. However, in some examples, angle 3514 may be greater than approximately 180 degrees or less than approximately 90 degrees.
[0216] FIG. 35E and FIG. 35F are schematic representations of a second alternative version of a cutting guide, according to techniques of this disclosure. The cutting guide of FIG. 35E and FIG. 35F may be similar in shape and function to cutting guide 3502 as described elsewhere in this disclosure. However, as shown in the example of FIG. 35E and FIG. 35F, a slot 3526 defined in a body 3522 of the cutting guide is open-ended at an end of body 3522 opposite an end of body 3522 to which a handle is attached. Because slot 3526 is open-ended, a user may be able to insert sawblade 122 through the open-ended side of slot 3526 or may insert sawblade 122 along a path directly into slot 3526.
[0217] Additionally, in the example of FIG. 35E and FIG. 35F, body 3522 defines a set of pin-guiding holes 3528. After a user has positioned the cutting guide on humerus 112, pins may be inserted through pin-guiding holes 3528 to secure the cutting guide to humerus 112. As shown in FIG. 35E, axes 3530 through pin-guiding holes 3528 for different pins may converge to help ensure that the cutting guide is securely attached to humerus 112. In some examples, the cutting
guide may have the open-ended slot and not pin-guiding holes 3528 or may have pin-guiding holes 3528 and a closed-ended slot.
[0218] FIG. 35G is a schematic representation of a third alternative version of cutting guide 3502, according to the techniques of this disclosure. In the example of FIG. 35G, a tracking marker 3518 is attached to a surface of body 3516. In some examples, body 3516 and tracking marker 3518 are one physical unit, or body 3516 and tracking marker 3518 are separate units assembled together. In some examples, tracking marker 3518 is an optical marker having predefined optical patterns on different faces of a cube. For instance, tracking marker 3518 may be a cube having different predefined optical patterns on each face. In the example of FIG. 35G, tracking marker 3518 has 2-dimensional optical barcodes on different faces. In other examples, tracking marker 3518 has a different polyhedral shape than a cube, such as a dodecahedron, a pyramid, or another polyhedral shape. In some examples, tracking marker 3518 may be an ultrasonic emitter, an electromagnetic marker, a passive optical marker that reflects light, an active optical marker that emits light, and so on. In some examples, tracking marker 3518 comprises a set of objects (e.g., balls, cubes, etc.) having predefined sizes and arranged in a predefined spatial configuration.
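For optical markers of this kind, pose estimation typically proceeds by detecting the known pattern on a visible face and solving a perspective-n-point problem against that face's known geometry. Below is a minimal sketch using OpenCV's ArUco module as a stand-in for whatever patterns the faces actually carry; the dictionary, side length, and camera parameters are illustrative assumptions.

```python
# Sketch of estimating the pose of one face of a cube marker from a camera
# image. ArUco is an assumed stand-in pattern, not the disclosed barcode.
import cv2
import numpy as np

SIDE = 0.03  # marker side length in meters (assumed)
# 3D corner coordinates of one face in the marker's own frame, in ArUco's
# corner order: top-left, top-right, bottom-right, bottom-left.
OBJ_PTS = np.array([[-SIDE / 2,  SIDE / 2, 0],
                    [ SIDE / 2,  SIDE / 2, 0],
                    [ SIDE / 2, -SIDE / 2, 0],
                    [-SIDE / 2, -SIDE / 2, 0]], dtype=np.float32)


def face_pose(gray, camera_matrix, dist_coeffs):
    """Return (rvec, tvec) of the first detected face, or None."""
    detector = cv2.aruco.ArucoDetector(
        cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50))
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is None:
        return None
    ok, rvec, tvec = cv2.solvePnP(OBJ_PTS, corners[0].reshape(4, 2),
                                  camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else None
```

Because each face carries a different pattern, whichever face the detector identifies also identifies its fixed transform to the marker body, so the full marker pose can be recovered from any single visible face.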
[0219] In some examples, virtual guidance system 320 may use tracking marker 3518 in addition to or instead of tracking marker 124 to identify a position of sawblade 122. Tracking marker 3518 is a fixed predefined distance and direction from slot 3506. Hence, by tracking the position of tracking marker 3518 while sawblade 122 is inserted into slot 3506 and by tracking the position of tracking marker 110 attached to humerus 112, virtual guidance system 320 may determine the position of sawblade 122 relative to humerus 112. Virtual guidance system 320 may generate virtual guidance elements (e.g., virtual guidance elements 3200) by tracking the position of tracking marker 3518. In some examples, the use of tracking marker 3518 obviates the need for tracking structure 1500 and for calibrating the sawblade relative to tracking marker 124 of saw 120.
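Because slot 3506 sits at a fixed, known offset from tracking marker 3518, the cutting plane of a blade seated in the slot can be obtained by chaining rigid transforms, with no per-saw calibration. A minimal sketch, assuming each tracked marker pose arrives as a 4x4 homogeneous matrix in the tracking (physical) frame and that the slot-to-marker offset is known from the guide's design (all names are illustrative):

```python
# Sketch of the transform chaining implied above (assumed data layout).
import numpy as np


def blade_plane_in_humerus(T_humerus_marker, T_guide_marker, T_slot_in_guide):
    """Return (point, normal) of the cutting plane in the humeral marker frame.

    With the sawblade seated in the slot, the blade plane coincides with the
    slot plane, so no per-saw calibration is needed.
    """
    # Pose of the slot in the physical frame.
    T_slot = T_guide_marker @ T_slot_in_guide
    # Express the slot pose relative to the humeral marker.
    T_rel = np.linalg.inv(T_humerus_marker) @ T_slot
    point = T_rel[:3, 3]            # a point on the slot plane
    normal = T_rel[:3, 2]           # slot-frame z-axis taken as plane normal
    return point, normal
```

Expressing the result relative to tracking marker 110, as here, keeps the guidance valid even if the patient's arm moves during the resection.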
[0220] FIG. 36 is a flowchart illustrating an example operation for providing cut guidance, according to techniques of this disclosure. The operation of FIG. 36 is consistent with the virtual guidance element shown in FIGS. 32A-32E. In the example of FIG. 36, virtual guidance system 320 determines a current cutting plane of sawblade 122 of saw 120 (3600). Virtual guidance system 320 may determine the current cutting plane of sawblade 122 based on signals from tracking system 116 and the position identification data generated as described above with respect to FIG. 20. For instance, registration system 316 may receive first signals from one or more sensors of tracking system 116. Registration system 316 may determine, based on
the first signals, first points corresponding to a first tracking marker 110 of a tracking structure 106 that comprises the first tracking marker 110 and an attachment body 108 positioned at a bicipital groove of the humerus. Additionally, registration system 316 may receive second signals from the sensors of tracking system 116. Registration system 316 may determine, based on the second signals, second points corresponding to a second tracking marker 804 of a digitizer 800 while a tip of digitizer 800 palpates the bicipital groove via a slot defined in the attachment body 108 of tracking structure 106. Registration system 316 may determine, based on the second points and a predetermined spatial relationship between the second tracking marker and the tip of the digitizer, third points corresponding to the bicipital groove. Positions of the first, second and third points are defined in a physical coordinate system. Registration system 316 may generate, based on the third points and a virtual model of the humerus having points defined in the virtual coordinate system, registration data 310 defining a relationship between the physical coordinate system and the virtual coordinate system. Virtual guidance system 320 may receive third signals from the one or more sensors of tracking system 116. Virtual guidance system 320 may determine, based on the third signals, fourth points corresponding to a third tracking marker 124 attached to a body of saw 120. Virtual guidance system 320 may determine the current cutting plane based on the fourth points. For instance, virtual guidance system 320 may use the position identification data and the fourth points to determine fifth points that specify the current cutting plane in the physical coordinate system.
[0221] In some examples, virtual guidance system 320 determines the current cutting plane while sawblade 122 of saw 120 is positioned in a slot 3506 defined by a guide device 3502. In some such examples, virtual guidance system 320 receives fourth signals from the one or more sensors 118 of tracking system 116. Virtual guidance system 320 may determine, based on the fourth signals, fifth points corresponding to a fourth tracking marker 3518 attached to guide device 3502. Virtual guidance system 320 may determine the current cutting plane based on the fifth points. For instance, virtual guidance system 320 may determine, based on the fourth signals, a position of tracking marker 3518 relative to tracking marker 110. Because slot 3506 is a predetermined distance from tracking marker 3518 and because sawblade 122 is inserted into slot 3506, virtual guidance system 320 may determine the current cutting plane based on the fifth points.
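The step of turning digitizer marker poses into bicipital groove points (the "third points") is a direct application of the predetermined marker-to-tip relationship. A minimal sketch, assuming 4x4 homogeneous marker poses in the physical frame and an assumed calibration vector `tip_offset` expressed in the marker's own frame:

```python
# Sketch of recovering palpated surface points from digitizer marker poses
# (tip_offset is an assumed calibration value).
import numpy as np


def tip_points_from_poses(marker_poses, tip_offset):
    """Map each tracked marker pose to the digitizer tip's physical position."""
    tip_h = np.append(tip_offset, 1.0)              # homogeneous tip coordinate
    return np.array([T @ tip_h for T in marker_poses])[:, :3]
```

The recovered groove points are what the registration step then fits against the virtual humerus model, for example with the ICP-style refinement sketched earlier.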
[0222] Virtual guidance system 320 may output, for display by MR device 104, a virtual guidance element 3200 (3602). Virtual guidance element 3200 includes a divided ring element 3202 that includes an enclosed area circle bisected by a line. Virtual guidance element 3200 further includes an active element 3204 having an inner edge and an outer edge. A distance
between a center of the line and a center of the inner edge of active element 3204 is indicative of a distance between a resection level of the current cutting plane of sawblade 122 into a bone and a resection level of a planned cutting plane through the bone. An angle of the inner edge of active element 3204 relative to the line is indicative of a superior/inferior angle of the current cutting plane of sawblade 122 and the planned cutting plane. A length of a line 3206 perpendicular to the inner edge of active element 3204 from the center of the inner edge of active element 3204 to the outer edge of active element 3204 is indicative of an anterior/posterior angle of the current cutting plane of sawblade 122 and the planned cutting plane.
[0223] Virtual guidance system 320 may determine, based on registration data 310, the distance between the resection level of the current cutting plane of sawblade 122 into the humerus and the resection level of the planned cutting plane through the bone (e.g., humerus). For example, virtual guidance system 320 may use registration data 310 to convert coordinates of the planned cutting plane which are defined in the virtual coordinate system into the physical coordinate system. In this example, virtual guidance system 320 may compare the converted coordinates to coordinates of the current cutting plane to determine the distance. In another example, virtual guidance system 320 may use registration data 310 to convert coordinates of the current cutting plane, which are defined in the physical coordinate system, into the virtual coordinate system. In this example, virtual guidance system 320 may compare the converted coordinates to coordinates of the planned cutting plane to determine the distance. In a similar way, virtual guidance system 320 may determine, based on registration data 310, the superior/inferior angle of the current cutting plane of the sawblade and the planned cutting plane. Likewise, virtual guidance system 320 may determine, based on the registration data, the anterior/posterior angle of the current cutting plane of the sawblade and the planned cutting plane.
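The distance and the two angles can be computed by first mapping the planned plane into the physical coordinate system with the registration transform and then comparing the two planes. A minimal sketch, assuming registration data 310 is held as a rotation R and translation t mapping virtual into physical coordinates, that each plane is a (point, unit normal) pair, and that unit superior/inferior and anterior/posterior axes orthogonal to the planned normal are available in the physical frame; the angular decomposition shown is one of several possible conventions:

```python
# Sketch of the plane comparison described above (assumed conventions).
import numpy as np


def tilt_toward(axis, n_cur, n_plan):
    """Signed tilt (radians) of n_cur away from n_plan toward `axis`,
    assuming `axis` is unit length and orthogonal to n_plan."""
    return float(np.arctan2(np.dot(n_cur, axis), np.dot(n_cur, n_plan)))


def plane_deviation(planned_pt_v, planned_n_v, R, t,
                    current_pt, current_n, si_axis, ap_axis):
    # Convert the planned plane from the virtual into the physical frame.
    planned_pt = R @ planned_pt_v + t
    planned_n = R @ planned_n_v
    # Resection-level distance: signed offset of the current plane along
    # the planned normal, in the tracking system's length units.
    depth = float(np.dot(current_pt - planned_pt, planned_n))
    # Superior/inferior and anterior/posterior angulation of the blade.
    si_angle = tilt_toward(si_axis, current_n, planned_n)
    ap_angle = tilt_toward(ap_axis, current_n, planned_n)
    return depth, si_angle, ap_angle
```

The three returned quantities map directly onto the divided ring element: depth drives the spacing between the line and the inner edge of active element 3204, and the two angles drive its orientation and the length of line 3206.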
[0224] FIG. 37 is a flowchart illustrating an example operation for providing cut guidance with a guide instrument, according to techniques of this disclosure. In the example of FIG. 37, after a user has attached humeral tracking structure 106 at a bicipital groove of humerus 112, registration system 316 performs a registration process (3700). For example, registration system 316 may perform the registration process of FIG. 14 to generate registration data 310 that map positions in a virtual coordinate system to positions in a physical coordinate system. [0225] Additionally, registration system 316 may generate position identification data that indicate that the lower edge of sawblade 122 is at a specific position in space relative to one or more points on tracking marker 124 attached to saw 120 (3702). For example, registration
system 316 may perform the process of FIG. 20 to generate position identification data that indicate that the lower edge of sawblade 122 is at a specific position in space relative to one or more points on tracking marker 124 attached to saw 120.
[0226] Virtual guidance system 320 may determine, based on signals from tracking system 116 and the position identification data, a current cut plane of sawblade 122 (3704). For example, virtual guidance system 320 may use the signals from tracking system 116 to determine the current position of tracking marker 124 relative to tracking marker 110 and then use the current position of tracking marker 124 and the position identification data to determine the current cut plane of sawblade 122.
[0227] Additionally, virtual guidance system 320 may generate virtual elements based on the current cut plane, a planned cut plane, and the registration data (3706). In some examples, virtual guidance system 320 generates the virtual elements based on the current cut plane, a planned cut plane, and the registration data as described above with respect to FIG. 36. The virtual elements guide the user to resect the humeral head of humerus 112. Example virtual elements include those shown in the examples of FIGS. 23-32. Virtual guidance system 320 may cause MR device 104 to display the virtual elements while the tip of sawblade 122 is inserted into a slot in a handheld guide device, such as cutting guide 3502 (3708).
[0228] FIG. 38 is a flowchart illustrating an example operation for providing cut guidance in conjunction with a guide instrument including an optical marker, according to the techniques of this disclosure. In the example of FIG. 38, registration system 316 may receive a first plurality of signals (e.g., video signals) from one or more sensors 118 of tracking system 116 (3802). The first plurality of signals may include first signals, and processing system 102 may determine, based on the first signals, first points corresponding to a tracking marker 110 of humeral tracking structure 106. Humeral tracking structure 106 includes tracking marker 110 and attachment body 108, which is positioned at a bicipital groove of humerus 112 of a patient. The first plurality of signals may additionally include second signals (e.g., later video signals) from sensors 118 of tracking system 116. Registration system 316 may determine, based on the second signals, second points corresponding to tracking marker 804 of digitizer 800 while a tip of digitizer 800 palpates the bicipital groove via a slot defined in attachment body 108 of humeral tracking structure 106. Registration system 316 may determine, based on the second points and a predetermined spatial relationship between the second tracking marker and the tip of digitizer 800, third points corresponding to the bicipital groove. Positions of the first, second and third points may be defined in a physical coordinate system. Processing system 102 may then generate, based on the third points and a virtual model of the humerus having points
defined in a virtual coordinate system, registration data defining a relationship between the physical coordinate system and the virtual coordinate system (3804).
[0229] Registration system 316 may receive a second plurality of signals from one or more sensors 118 of tracking system 116 (3806). Processing system 102 may determine, based on the second plurality of signals, points corresponding to an optical marker (e.g., tracking marker 3518) of cutting guide 3502. Based on the points corresponding to tracking marker 3518, processing system 102 can determine position data that specifies a position of the optical marker of cutting guide 3502 in the physical coordinate system (3808). Processing system 102 may then generate, based on the position data and the registration data, virtual guidance for positioning sawblade 122 while the tip of sawblade 122 is inserted through a slot (e.g., slot 3506) of cutting guide 3502 with respect to humerus 112 (3810). Processing system 102 may cause MR device 104 to display the virtual guidance (e.g., entry line element 2104) such that the virtual guidance appears to a user to be located on the humerus at locations where a planned cutting plane intersects humerus 112 (3812).
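Read end to end, the FIG. 38 flow chains together the pieces sketched above. The following sketch wires those earlier helper functions together under assumed `tracker` and `display` interfaces (both hypothetical placeholders, not the disclosed system's API); for brevity it expresses everything directly in the tracking frame rather than relative to tracking marker 110.

```python
# Sketch of the FIG. 38 flow, reusing the earlier helper sketches
# (refine_registration, tip_points_from_poses, plane_deviation), which are
# assumed to be in scope. All interfaces and names are illustrative.
import numpy as np


def guide_resection(tracker, display, model_pts, tip_offset, T_slot_in_guide,
                    planned_pt_v, planned_n_v, si_axis, ap_axis):
    # (3802)-(3804): recover palpated groove points, then register. The
    # identity initialization assumes a rough pre-alignment already exists.
    poses = tracker.poses("digitizer_marker")            # assumed interface
    groove_pts = tip_points_from_poses(poses, tip_offset)
    R, t, rms = refine_registration(model_pts, groove_pts,
                                    np.eye(3), np.zeros(3))
    # rms could gate a "registration OK" check before guidance is shown.
    # (3806)-(3808): locate the cutting guide's optical marker; the blade
    # plane coincides with the slot plane while the blade is seated.
    T_slot = tracker.pose("guide_marker") @ T_slot_in_guide
    point, normal = T_slot[:3, 3], T_slot[:3, 2]
    # (3810)-(3812): compare against the planned plane and display guidance.
    R_vp, t_vp = R.T, -R.T @ t                  # invert: virtual -> physical
    depth, si, ap = plane_deviation(planned_pt_v, planned_n_v, R_vp, t_vp,
                                    point, normal, si_axis, ap_axis)
    display.show_entry_line(depth, si, ap)               # assumed interface
```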
[0230] The following is a non-limiting list of clauses in accordance with one or more techniques of this disclosure.
[0231] Clause 1A. A tracking structure comprising: an attachment body shaped for attachment at a bicipital groove of a humerus of a patient, the attachment body defining a slot having dimensions sufficient for palpation of the bicipital groove using a handheld digitizer; and a tracking marker connected to the attachment body, the tracking marker having a polyhedral shape that includes a plurality of faces that have different optical patterns.
[0232] Clause 2A. The tracking structure of clause 1A, wherein a distal end of the slot is open-ended.
[0233] Clause 3A. The tracking structure of clause 2A, wherein the attachment body includes a first prong on a first side of the slot and a second prong on a second side of the slot, each of the first prong and the second prong defining respective apertures sized to accommodate fixation members for attachment of the attachment body to the humerus.
[0234] Clause 4A. The tracking structure of clause 3A, wherein: the first prong and the second prong are angled relative to each other; or the first prong and the second prong are parallel and meet at a curved surface.
[0235] Clause 5A. The tracking structure of any of clauses 3A-4A, wherein at least one of the first prong or the second prong includes a textured area.
[0236] Clause 6A. The tracking structure of any of clauses 1A-5A, wherein the attachment body defines two or more apertures sized to accommodate fixation members for attachment of the attachment body to the humerus.
[0237] Clause 7A. The tracking structure of any of clauses 1A-6A, wherein the attachment body and the tracking marker are physically one unit.
[0238] Clause 8A. A computer-implemented method comprising: receiving, by a processing system, first signals from one or more sensors of a tracking system; determining, by the processing system, based on the first signals, first points corresponding to a first tracking marker of a tracking structure, the tracking structure comprising the first tracking marker and an attachment body positioned at a bicipital groove of a humerus of a patient; receiving, by the processing system, second signals from the one or more sensors of the tracking system; determining, by the processing system, based on the second signals, second points corresponding to a second tracking marker of a digitizer while a tip of the digitizer palpates the bicipital groove via a slot defined in the attachment body of the tracking structure; determining, by the processing system, based on the second points and a predetermined spatial relationship between the second tracking marker and the tip of the digitizer, third points corresponding to the bicipital groove, wherein positions of the first, second and third points are defined in a physical coordinate system; and generating, by the processing system, based on the third points and a virtual model of the humerus having points defined in a virtual coordinate system, registration data defining a relationship between the physical coordinate system and the virtual coordinate system.
[0239] Clause 9A. The computer-implemented method of clause 8A, wherein: points of a virtual element are defined in the virtual coordinate system; and the method further comprises: determining, by the processing system, based on the registration data, points in the physical coordinate system corresponding to the points of the virtual element; and causing, by the processing system, a mixed reality (MR) device to display the virtual element such that the virtual element appears to a user to be located at the determined points in the physical coordinate system.
[0240] Clause 10A. The computer-implemented method of clause 9A, wherein the virtual element provides guidance for resecting a portion of the humerus.
[0241] Clause 11 A. The computer-implemented method of any of clauses 8A-10A, further comprising causing, by the processing system, an MR device to display instructions to palpate the bicipital groove with the digitizer.
[0242] Clause 12A. The computer-implemented method of any of clauses 8A-11A, further comprising causing, by the processing system, an MR device to display instructions to position
the tracking structure at the bicipital groove of the humerus and to insert fixation elements through apertures of the attachment body into the humerus to attach the tracking structure to the humerus.
[0243] Clause 13A. The computer-implemented method of any of clauses 8A-12A, wherein: receiving the first signals comprises receiving, by the processing system, first video signals comprising images of optical patterns on two or more faces of the first tracking marker, wherein each of the faces has a different optical pattern, and determining the positions of the first points comprises determining, by the processing system, based on the images of the optical patterns, positions of vertices of the first tracking marker.
[0244] Clause 14A. The computer-implemented method of any of clauses 8A-13A, wherein: the method further comprises: receiving, by the processing system, third signals from the one or more sensors of the tracking system; determining, based on the third signals, positions of fourth points in the physical coordinate system corresponding to the second tracking marker while the tip of the digitizer palpates a humeral head of the humerus; receiving, by the processing system, fourth signals from the one or more sensors of the tracking system; and determining, based on the fourth signals, positions of fifth points in the physical coordinate system corresponding to the second tracking marker while the tip of the digitizer palpates a metaphysis of the humerus, and generating the registration data comprises generating the registration data based on the third points, the fourth points, the fifth points, and the virtual model of the humerus.
[0245] Clause 15A. The computer-implemented method of any of clauses 8A-14A, wherein the tracking system is included in an MR device.
[0246] Clause 16A. The computer-implemented method of any of clauses 8A-15A, further comprising: receiving, by the processing system, third signals from the sensors of the tracking system after generating the registration data; determining, by the processing system, based on the registration data and the third signals, a distance between a location on a surface of the humerus and the tip of the digitizer; causing, by the processing system, a mixed reality (MR) device to display a first virtual element at the location on the surface of the humerus and to display a second virtual element indicating the determined distance of a tip of the digitizer to the location on the surface of the humerus; and refining, by the processing system, the registration data based on the determined distance being greater than a threshold while the tip of the digitizer is positioned at the location on the surface of the humerus.
[0247] Clause 17A. A non-transitory computer-readable storage medium comprising instructions stored thereon that, when executed by one or more processors of a processing system, cause the processing system to perform the methods of any of clauses 8A-16A.
[0248] Clause 18A. A system comprising means for performing the methods of any of clauses 8A-16A.
[0249] Clause 1B. A computer-implemented method comprising: receiving, by a processing system, first signals from one or more sensors of a tracking system; determining, by the processing system, based on the first signals, first points corresponding to a first tracking marker attached to a body of a saw; determining, by the processing system, based on the first signals, second points corresponding to a second tracking marker of a tracking structure while a sawblade of the saw is positioned in a recess defined by a support body of the tracking structure; and generating, by the processing system, based on the first points and the second points, position identification data that specify a position of a lower edge of the sawblade relative to the first tracking marker.
[0250] Clause 2B. The computer-implemented method of clause 1B, further comprising: obtaining, by the processing system, second signals from the one or more sensors of the tracking system; determining, by the processing system, based on the second signals, third points corresponding to the first tracking marker; determining, by the processing system, based on the third points and the position identification data, guidance data that guide the user to position the lower edge of the sawblade at a location on a bone of a patient while cutting the bone; and causing, by the processing system, a mixed reality (MR) device to display the virtual guidance.
[0251] Clause 3B. The computer-implemented method of any of clauses 1B-2B, further comprising causing, by the processing system, a mixed reality (MR) device to display instructions to position the sawblade in the recess.
[0252] Clause 4B. The computer-implemented method of clause 3B, wherein the instructions instruct a user to position a tip of the sawblade in the recess.
[0253] Clause 5B. The computer-implemented method of any of clauses 3B-4B, wherein causing the MR device to display instructions to position the sawblade in the recess comprises displaying the instructions as a virtual representation of one or more surfaces of the support body at a position along a lateral side of the sawblade.
[0254] Clause 6B. The computer-implemented method of any of clauses 1B-5B, further comprising: determining, by the processing system, based on the second points, a planar orientation of the tracking structure; and causing, by the processing system, the MR device to
display virtual elements that indicate whether the planar orientation of the tracking structure is aligned with a cutting plane of the sawblade.
[0255] Clause 7B. A system comprising: a saw comprising a sawblade; a first tracking marker attached to the saw; a tracking structure that comprises a support body and a second tracking marker, wherein the support body defines a recess to accommodate the sawblade; and a processing system comprising one or more processors that are implemented in circuitry and configured to: receive first signals from one or more sensors of a tracking system; determine, based on the first signals, first points corresponding to the first tracking marker attached to a body of the saw; determine, based on the first signals, second points corresponding to the second tracking marker while the sawblade of the saw is positioned in the recess defined by the support body of the tracking structure; and generate, based on the first points and the second points, position identification data that specify a position of a lower edge of the sawblade relative to the first tracking marker.
[0256] Clause 8B. The system of clause 7B, wherein the recess is a slot having a width corresponding to a width of the sawblade.
[0257] Clause 9B. The system of any of clauses 7B-8B, wherein the processing system is further configured to: obtain second signals from the one or more sensors of the tracking system; determine, based on the second signals, third points corresponding to the first tracking marker; determine, based on the third points and the position identification data, guidance data that guide the user to position the lower edge of the sawblade at a location on a bone of a patient while cutting the bone; and cause a mixed reality (MR) device to display the virtual guidance.
[0258] Clause 10B. The system of any of clauses 7B-9B, wherein the processing system is further configured to cause a mixed reality (MR) device to display instructions to position the sawblade in the recess.
[0259] Clause 11B. The system of clause 10B, wherein the instructions instruct a user to position a tip of the sawblade in the recess.
[0260] Clause 12B. The system of any of clauses 10B-11B, wherein the processing system is configured to cause the MR device to display the instructions as a virtual representation of one or more surfaces of the support body at a position along a lateral side of the sawblade.
[0261] Clause 13B. The system of any of clauses 7B-12B, wherein the processing system is further configured to: determine, based on the second points, a planar orientation of the tracking structure; and cause the MR device to display virtual elements that indicate whether the planar orientation of the tracking structure is aligned with a cutting plane of the sawblade.
[0262] Clause 14B. A non-transitory computer-readable storage medium having instructions stored thereon that, when executed, cause a processing system to perform the methods of any of clauses 1B-6B.
[0263] Clause 15B. A system comprising means for performing the methods of any of clauses 1B-6B.
[0264] Clause 1C. A computer-implemented method comprising: receiving, by a processing system, first signals from one or more sensors of a tracking system; determining, by the processing system, based on the first signals, positions for a plurality of planned plane elements; determining, by the processing system, based on the first signals, positions for a plurality of current plane elements, wherein: each of the planned plane elements indicates a location on a planned cutting plane through a bone of a patient, the planned plane elements are not contiguous with each other; each of the current plane elements corresponds to one of the planned plane elements and indicates a location on a current cutting plane of a sawblade of a saw, and the current plane elements are not contiguous with each other; and causing, by the processing system, a mixed reality (MR) device to concurrently display the planned plane elements at the determined positions for the planned plane elements and the current plane elements at the determined positions for the current plane elements, wherein, for each pair of corresponding current plane elements and planned plane elements, at least one of the current plane element of the pair or the planned plane element of the pair has a visual property that is based on the position for the current plane element of the pair relative to the planned cutting plane.
[0265] Clause 2C. The computer-implemented method of clause 1C, further comprising, for each pair of corresponding current plane elements and planned plane elements, determining the visual property based on whether the location indicated by the current plane element of the pair is above or below the planned cutting plane.
[0266] Clause 3C. The computer-implemented method of any of clauses 1C-2C, further comprising: causing, by the processing system, the MR device to display a virtual line element such that the virtual line element appears to the user to be located on the bone at locations where the planned cutting plane intersects the bone.
[0267] Clause 4C. The computer-implemented method of any of clauses 1C-3C, wherein each of the planned plane elements and/or each of the current plane elements is crescent-shaped.
[0268] Clause 5C. The computer-implemented method of any of clauses 1C-4C, further comprising causing, by the processing system, the MR device to update the positions for the current plane elements based on changes to the current cutting plane of the sawblade.
[0269] Clause 6C. The computer-implemented method of any of clauses 1C-5C, wherein: the planned cutting plane is defined in a virtual coordinate system, the method further comprises, prior to receiving the first signals: receiving, by the processing system, second signals from the one or more sensors of the tracking system; determining, by the processing system, based on the second signals, first points corresponding to a first tracking marker of a tracking structure that comprises the first tracking marker and an attachment body positioned at a bicipital groove of a humerus of a patient; receiving, by the processing system, third signals from the sensors of the tracking system; determining, by the processing system, based on the third signals, second points corresponding to a second tracking marker of a digitizer while a tip of the digitizer palpates the bicipital groove via a slot defined in the attachment body of the tracking structure; determining, by the processing system, based on the second points and a predetermined spatial relationship between the second tracking marker and the tip of the digitizer, third points corresponding to the bicipital groove, wherein positions of the first, second and third points are defined in a physical coordinate system; and generating, by the processing system, based on the third points and a virtual model of the humerus having points defined in the virtual coordinate system, registration data defining a relationship between the physical coordinate system and the virtual coordinate system, and determining the positions for the planned plane elements comprises determining, by the processing system, the positions for the planned plane elements in the physical coordinate system based on the registration data and a position of the first tracking marker in the physical coordinate system.
[0270] Clause 7C. The computer-implemented method of any of clauses 1C-6C, wherein: the method further comprises, prior to receiving the first signals: receiving, by a processing system, second signals from the one or more sensors of the tracking system; determining, by the processing system, based on the second signals, first points corresponding to a first tracking marker attached to a body of the saw; and determining, by the processing system, based on the second signals, second points corresponding to a second tracking marker of a tracking structure while the sawblade of the saw is positioned in a recess defined by a support body of the tracking structure, wherein the first points and the second points are defined in a physical coordinate system; and generating, by the processing system, based on the first points and the second points, position identification data that specify a position of a lower edge of the sawblade relative to the first tracking marker, determining the positions for the current plane elements
comprises determining the positions for the current plane elements based on a position of the first tracking marker in the second signals and based on the position identification data.
[0271] Clause 8C. A system comprising: a mixed reality (MR) device; and a processing system that includes one or more processors implemented in circuitry, the processing system configured to: receive first signals from one or more sensors of a tracking system; determine, based on the first signals, positions for a plurality of planned plane elements; determine, based on the first signals, positions for a plurality of current plane elements, wherein: each of the planned plane elements indicates a location on a planned cutting plane through a bone of a patient, the planned plane elements are not contiguous with each other; each of the current plane elements corresponds to one of the planned plane elements and indicates a location on a current cutting plane of a sawblade of a saw, and the current plane elements are not contiguous with each other; and cause the mixed reality (MR) device to concurrently display the planned plane elements at the determined positions for the planned plane elements and the current plane elements at the determined positions for the current plane elements, wherein, for each pair of corresponding current plane elements and planned plane elements, at least one of the current plane element of the pair or the planned plane element of the pair has a visual property that is based on the position for the current plane element of the pair relative to the planned cutting plane.
[0272] Clause 9C. The system of clause 8C, wherein the processing system is further configured to, for each pair of corresponding current plane elements and planned plane elements, determine the visual property based on whether the location indicated by the current plane element of the pair is above or below the planned cutting plane.
[0273] Clause 10C. The system of any of clauses 8C-9C, wherein the processing system is further configured to cause the MR device to display a virtual line element such that the virtual line element appears to the user to be located on the bone at locations where the planned cutting plane intersects the bone.
[0274] Clause 11C. The system of any of clauses 8C-10C, wherein each of the planned plane elements and/or each of the current plane elements is crescent-shaped.
[0275] Clause 12C. The system of any of clauses 8C-11C, wherein the processing system is further configured to cause the MR device to update the positions for the current plane elements based on changes to the current cutting plane of the sawblade.
[0276] Clause 13C. The system of any of clauses 8C-12C, wherein: the planned cutting plane is defined in a virtual coordinate system, the processing system is further configured to, prior to receiving the first signals: receive second signals from the one or more sensors of the
tracking system; determine, based on the second signals, first points corresponding to a first tracking marker of a tracking structure that comprises the first tracking marker and an attachment body positioned at a bicipital groove of a humerus of a patient; receive third signals from the sensors of the tracking system; determine, based on the third signals, second points corresponding to a second tracking marker of a digitizer while a tip of the digitizer palpates the bicipital groove via a slot defined in the attachment body of the tracking structure; determine, based on the second points and a predetermined spatial relationship between the second tracking marker and the tip of the digitizer, third points corresponding to the bicipital groove, wherein positions of the first, second and third points are defined in a physical coordinate system; generate, based on the third points and a virtual model of the humerus having points defined in the virtual coordinate system, registration data defining a relationship between the physical coordinate system and the virtual coordinate system; and determine the positions for the planned plane elements in the physical coordinate system based on the registration data and a position of the first tracking marker in the physical coordinate system.
[0276] Clause 14C. The system of any of clauses 8C-13C, wherein the processing system is further configured to, prior to receiving the first signals: receive second signals from the one or more sensors of the tracking system; determine, based on the second signals, first points corresponding to a first tracking marker attached to a body of the saw; and determine, based on the second signals, second points corresponding to a second tracking marker of a tracking structure while the sawblade of the saw is positioned in a recess defined by a support body of the tracking structure, wherein the first points and the second points are defined in a physical coordinate system; generate, based on the first points and the second points, position identification data that specify a position of a lower edge of the sawblade relative to the first tracking marker; and determine the positions for the current plane elements based on a position of the first tracking marker in the second signals and based on the position identification data.
[0278] Clause 15C. The system of any of clauses 8C-14C, wherein one or more of the processors of the processing system are included in the MR device.
[0279] Clause 16C. A non-transitory computer-readable storage medium having instructions stored thereon that, when executed, cause a processing system to perform the methods of any of clauses 1C-7C.
[0280] Clause 17C. A system comprising means for performing the methods of any of clauses 1C-7C.
[0281] Clause 1D. A computer-implemented method comprising: determining, by a processing system that includes one or more processors implemented in circuitry, a current
cutting plane of a sawblade of a saw; and outputting, by the processing system, for display by a mixed reality (MR) device, a virtual guidance element that includes: a divided ring element that includes an enclosed area circle bisected by a line; and an active element having an inner edge and an outer edge, wherein: a distance between a center of the line and a center of the inner edge of the active element is indicative of a distance between a resection level of the current cutting plane of the sawblade into a bone and a resection level of a planned cutting plane through the bone, an angle of the inner edge of the active element relative to the line is indicative of a superior/inferior angle of the current cutting plane of the sawblade and the planned cutting plane, and a length of a line perpendicular to the inner edge of the active element from the center of the inner edge of the active element to the outer edge of the active element is indicative of an anterior/posterior angle of the current cutting plane of the sawblade and the planned cutting plane.
[0282] Clause 2D. The computer-implemented method of clause 1D, further comprising causing, by the processing system, the MR device to update the active element based on a change to the current cutting plane of the sawblade.
[0283] Clause 3D. The computer-implemented method of any of clauses 1D-2D, wherein outputting the virtual guidance element comprises outputting, by the processing system, for display by the MR device, the virtual guidance element so that the line of the divided ring element is aligned with the planned cutting plane.
[0284] Clause 4D. The computer-implemented method of any of clauses 1D-3D, wherein outputting the virtual guidance element comprises outputting, by the processing system, for display by the MR device, the virtual guidance element so that the virtual guidance element and the bone are simultaneously visible to a user of the MR device.
[0285] Clause 5D. The computer-implemented method of any of clauses 1D-4D, wherein: the bone is a humerus, the planned cutting plane is defined in a virtual coordinate system, the method further comprises: receiving, by the processing system, first signals from one or more sensors of a tracking system; determining, by the processing system, based on the first signals, first points corresponding to a first tracking marker of a tracking structure that comprises the first tracking marker and an attachment body positioned at a bicipital groove of the humerus; receiving, by the processing system, second signals from the sensors of the tracking system; determining, by the processing system, based on the second signals, second points corresponding to a second tracking marker of a digitizer while a tip of the digitizer palpates the bicipital groove via a slot defined in the attachment body of the tracking structure; determining, by the processing system, based on the second points and a predetermined spatial relationship
between the second tracking marker and the tip of the digitizer, third points corresponding to the bicipital groove, wherein positions of the first, second and third points are defined in a physical coordinate system; generating, by the processing system, based on the third points and a virtual model of the humerus having points defined in the virtual coordinate system, registration data defining a relationship between the physical coordinate system and the virtual coordinate system; receiving, by the processing system, third signals from the one or more sensors of the tracking system; determining, by the processing system, based on the third signals, fourth points corresponding to a third tracking marker attached to a body of the saw; and determining, by the processing system, the current cutting plane based on the fourth points; determining, by the processing system, based on the registration data, the distance between the resection level of the current cutting plane of the sawblade into the humerus and the resection level of the planned cutting plane through the humerus; determining, by the processing system, based on the registration data, the superior/inferior angle of the current cutting plane of the sawblade and the planned cutting plane; and determining, by the processing system, based on the registration data, the anterior/posterior angle of the current cutting plane of the sawblade and the planned cutting plane.
[0286] Clause 6D. The computer-implemented method of any of clauses 1D-5D, wherein determining the current cutting plane comprises determining the current cutting plane while the sawblade of the saw is positioned in a slot defined by a guide device.
[0287] Clause 7D. The computer-implemented method of clause 6D, wherein the guide device comprises a fourth tracking marker, the method further comprising: receiving, by the processing system, fourth signals from the one or more sensors of the tracking system; determining, by the processing system, based on the fourth signals, fifth points corresponding to a fourth tracking marker attached to the guide device; and determining, by the processing system, the current cutting plane based on the fifth points.
[0288] Clause 8D. A system comprising one or more processors implemented in circuitry, wherein the one or more processors are configured to: determine a current cutting plane of a sawblade of a saw; and output, for display by a mixed reality (MR) device, a virtual guidance element that includes: a divided ring element that includes an enclosed area circle bisected by a line; and an active element having an inner edge and an outer edge, wherein: a distance between a center of the line and a center of the inner edge of the active element is indicative of a distance between a resection level of the current cutting plane of the sawblade into a bone and a resection level of a planned cutting plane through the bone, an angle of the inner edge of the active element relative to the line is indicative of a superior/inferior angle of the current
cutting plane of the sawblade and the planned cutting plane, and a length of a line perpendicular to the inner edge of the active element from the center of the inner edge of the active element to the outer edge of the active element is indicative of an anterior/posterior angle of the current cutting plane of the sawblade and the planned cutting plane.
[0289] Clause 9D. The system of clause 8D, wherein the one or more processors are further configured to cause the MR device to update the active element based on a change to the current cutting plane of the sawblade.
[0290] Clause 10D. The system of any of clauses 8D-9D, wherein to output the virtual guidance element, the one or more processors are configured to output, for display by the MR device, the virtual guidance element so that the line of the divided ring element is aligned with the planned cutting plane.
[0291] Clause 11D. The system of any of clauses 8D-10D, wherein to output the virtual guidance element, the one or more processors are configured to output, for display by the MR device, the virtual guidance element so that the virtual guidance element and the bone are simultaneously visible to a user of the MR device.
[0292] Clause 12D. The system of any of clauses 8D-11D, wherein: the bone is a humerus, the planned cutting plane is defined in a virtual coordinate system, and the one or more processors are further configured to: receive first signals from one or more sensors of a tracking system; determine, based on the first signals, first points corresponding to a first tracking marker of a tracking structure that comprises the first tracking marker and an attachment body positioned at a bicipital groove of the humerus; receive second signals from the sensors of the tracking system; determine, based on the second signals, second points corresponding to a second tracking marker of a digitizer while a tip of the digitizer palpates the bicipital groove via a slot defined in the attachment body of the tracking structure; determine, based on the second points and a predetermined spatial relationship between the second tracking marker and the tip of the digitizer, third points corresponding to the bicipital groove, wherein positions of the first, second and third points are defined in a physical coordinate system; generate, based on the third points and a virtual model of the humerus having points defined in the virtual coordinate system, registration data defining a relationship between the physical coordinate system and the virtual coordinate system; receive third signals from the one or more sensors of the tracking system; determine, based on the third signals, fourth points corresponding to a third tracking marker attached to a body of the saw; and determine the current cutting plane based on the fourth points; determine, based on the registration data, the distance between the resection level of the current cutting plane of the sawblade into the humerus and the resection
level of the planned cutting plane through the humerus; determine, based on the registration data, the superior/inferior angle of the current cutting plane of the sawblade and the planned cutting plane; and determine, based on the registration data, the anterior/posterior angle of the current cutting plane of the sawblade and the planned cutting plane.
[0293] Clause 13D. The system of any of clauses 8D-12D, wherein the processing system is configured to determine the current cutting plane while the sawblade of the saw is positioned in a slot defined by a guide device.
[0294] Clause 14D. The system of clause 13D, wherein the processing system is further configured to: receive fourth signals from the one or more sensors of the tracking system; determine, based on the fourth signals, fifth points corresponding to a fourth tracking marker attached to the guide device; and determine the current cutting plane based on the fifth points.
[0295] Clause 15D. A non-transitory computer-readable storage medium comprising instructions stored thereon that, when executed by one or more processors of a processing system, cause the processing system to perform the methods of any of clauses 1D-7D.
[0296] Clause 1E. A computer-implemented method comprising: receiving, by a processing system, a first plurality of signals from one or more sensors of a tracking system; determining, by the processing system and based on a virtual model of a humerus of a patient and points determined based on the first plurality of signals, registration data defining a relationship between a physical coordinate system and a virtual coordinate system; receiving, by the processing system, a second plurality of signals from the one or more sensors of the tracking system; determining, by the processing system and based on points determined based on the second plurality of signals, position identification data that specify a position of a lower edge of a sawblade of a saw relative to a tracking marker attached to the saw; determining, by the processing system and based on the position identification data, a position of the sawblade in the physical coordinate system; generating, by the processing system and while a tip of the sawblade is inserted into a slot of a guide device configured to maintain a position of the sawblade relative to the bone during a resection procedure, virtual guidance based on the registration data and the position of the sawblade; and causing, by the processing system, a mixed reality (MR) device to display the virtual guidance.
[0297] Clause 2E. The computer-implemented method of clause 1E, further comprising: causing, by the processing system, the MR device to display instructions to insert the tip of the sawblade into the slot of the guide device.
[0298] Clause 3E. The computer-implemented method of any of clauses 1E-2E, wherein the points determined based on the first plurality of signals comprise first points corresponding
to a tracking marker of a first tracking structure that comprises the tracking marker and an attachment body positioned at a bicipital groove of the humerus of the patient, second points corresponding to a tracking marker of a digitizer while a tip of the digitizer palpates the bicipital groove via a slot defined in the attachment body of the first tracking structure, and third points corresponding to the bicipital groove and based on the second points corresponding to the tracking marker of the digitizer and a predetermined spatial relationship between the tracking marker of the digitizer and the tip of the digitizer.
[0299] Clause 4E. The computer-implemented method of any of clauses 1E-3E, wherein the points determined based on the second plurality of signals comprise fourth points corresponding to a tracking marker attached to a body of the saw and fifth points corresponding to a tracking marker of a second tracking structure while the sawblade of the saw is positioned in a recess defined by a support body of the second tracking structure.
[0300] Clause 5E. The computer-implemented method of any of clauses 1E-4E, wherein the points determined based on the second plurality of signals additionally comprise sixth points corresponding to a tracking marker attached to the guide device.
[0301] Clause 6E. The computer-implemented method of any of clauses 1E-3E, wherein the points determined based on the second plurality of signals comprise fourth points corresponding to a tracking marker attached to a body of the guide device.
[0302] Clause 7E. The computer-implemented method of any of clauses 1E-6E, wherein the guide device comprises a handle and a body, the body defining: a slot extending through the body and configured to accommodate the tip of the sawblade; and a bone contacting surface, wherein the bone contacting surface comprises a plurality of bump-like projections extending away from the bone contacting surface.
[0303] Clause 8E. The computer-implemented method of any of clauses 1E-7E, wherein the handle and the body of the guide device form an angle between about 90 degrees and 180 degrees.
[0304] Clause 9E. The computer-implemented method of any of clauses 1E-8E, wherein the virtual guidance comprises a virtual line element, and wherein the virtual line element appears to a user to be located on the humerus at locations where a planned cutting plane intersects the humerus of the patient.
[0305] Clause 10E. A system comprising: a mixed reality (MR) device; and a processing system that includes one or more processors implemented in circuitry, the processing system configured to: receive a first plurality of signals from one or more sensors of a tracking system; determine, based on a virtual model of a humerus of a patient and points determined based on
the first plurality of signals, registration data defining a relationship between a physical coordinate system and a virtual coordinate system; receive a second plurality of signals from the one or more sensors of the tracking system; determine, based on points determined based on the second plurality of signals, position identification data that specify a position of a lower edge of a sawblade of a saw; determine, based on the registration data and the position identification data, a position of the sawblade of the saw relative to the humerus of the patient; generate, while a tip of the sawblade is inserted into a slot of a guide device configured to maintain a position of the sawblade relative to the bone during a resection procedure, virtual guidance based on the registration data and the position of the sawblade; and cause the MR device to display the virtual guidance.
[0306] Clause 11E. The system of clause 10E, wherein the processing system is further configured to: cause the MR device to display instructions to slot the guide device onto the tip of the sawblade.
[0307] Clause 12E. The system of any of clauses 10E-11E, wherein the points determined based on the first plurality of signals comprise first points corresponding to a tracking marker of a first tracking structure that comprises the tracking marker and an attachment body positioned at a bicipital groove of the humerus of the patient, second points corresponding to a tracking marker of a digitizer while a tip of the digitizer palpates the bicipital groove via a slot defined in the attachment body of the first tracking structure, and third points corresponding to the bicipital groove and based on the second points corresponding to the tracking marker of the digitizer and a predetermined spatial relationship between the tracking marker of the digitizer and the tip of the digitizer.
[0308] Clause 13E. The system of any of clauses 10E-12E, wherein the points determined based on the second plurality of signals comprise fourth points corresponding to a tracking marker attached to a body of the saw and fifth points corresponding to a tracking marker of a second tracking structure while the sawblade of the saw is positioned in a recess defined by a support body of the second tracking structure.
[0309] Clause 14E. The system of any of clauses 10E-13E, wherein the points determined based on the second plurality of signals additionally comprise points corresponding to a tracking marker attached to the guide device.
[0310] Clause 15E. The system of any of clauses 10E-12E, wherein the points determined based on the second plurality of signals comprise points corresponding to a tracking marker attached to a body of the guide device.
[0311] Clause 16E. The system of any of clauses 10E-15E, wherein the guide device comprises a handle and a body, the body defining: a slot extending through the body and configured to accept the tip of the sawblade; and a bone contacting surface, wherein the bone contacting surface comprises a plurality of bump-like projections extending away from the bone contacting surface.
[0312] Clause 17E. The system of any of clauses 10E-16E, wherein the handle and the body of the guide device form an angle between 90 degrees and 180 degrees.
[0313] Clause 18E. The system of any of clauses 10E-17E, wherein the virtual guidance comprises a virtual line element, and wherein the virtual line element appears to a user to be located on the humerus at locations where a planned cutting plane intersects the humerus of the patient.
[0314] Clause 19E. A non-transitory computer-readable storage medium comprising instructions stored thereon that, when executed by one or more processors of a processing system, cause the processing system to: receive a first plurality of signals from one or more sensors of a tracking system; determine, based on a virtual model of a humerus of a patient and points determined based on the first plurality of signals, registration data defining a relationship between a physical coordinate system and a virtual coordinate system; receive a second plurality of signals from the one or more sensors of the tracking system; determine, based on points determined based on the second plurality of signals, position identification data that specify a position of a lower edge of a sawblade of a saw; determine, based on the registration data and the position identification data, a position of the sawblade of the saw relative to the humerus of the patient; generate, while a tip of the sawblade is inserted into a slot of a guide device configured to maintain a position of the sawblade relative to the humerus during a resection procedure, virtual guidance based on the registration data and the position of the sawblade; and cause a mixed reality (MR) device to display the virtual guidance.
[0315] Clause 20E. A surgical cut guide comprising: a handle; and a guide portion comprising: a bone interface portion configured to contact a bone of a patient, wherein the bone interface portion comprises a plurality of bump-like projections; and a slot configured to accommodate a sawblade, wherein the slot is sized so as to stabilize the sawblade during a cutting procedure.
[0316] Clause 21E. The surgical cut guide of clause 20E, wherein the surgical cut guide further comprises a tracking marker, the tracking marker having a polyhedral shape that includes a plurality of faces that have different optical patterns.
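By way of example, and not limitation, a tracking marker with a polyhedral shape whose faces carry different optical patterns can be localized by first identifying the visible face from its pattern and then solving a perspective-n-point problem against that face's known corner geometry; the distinct patterns are what tell the system which face's 3D corner model to use. The sketch below uses OpenCV's general-purpose solvePnP as one conventional choice and assumes pattern detection has already produced at least four 2D corner locations; it is not the disclosed tracking algorithm, and the function and parameter names are illustrative.

```python
import numpy as np
import cv2  # OpenCV, one possible toolkit for optical tracking

def face_pose(face_corners_3d, detected_corners_2d, camera_matrix, dist_coeffs=None):
    """Estimate marker pose from one identified face of the polyhedron.

    face_corners_3d:     (N, 3) corner positions of the identified face,
                         expressed in the marker's own coordinate frame
    detected_corners_2d: (N, 2) pixel locations of those corners (N >= 4)
    camera_matrix:       3x3 intrinsics from a prior camera calibration
    """
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(face_corners_3d, dtype=np.float64),
        np.asarray(detected_corners_2d, dtype=np.float64),
        camera_matrix, dist_coeffs)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)  # rotation matrix: marker frame -> camera frame
    return R, tvec.ravel()
```

Because every face's corner model is expressed in the same marker frame, any single visible face yields the full pose of the marker, which is one reason a multi-faced marker tolerates occlusion and oblique viewing angles.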
[0317] While the techniques have been disclosed with respect to a limited number of examples, those skilled in the art, having the benefit of this disclosure, will appreciate numerous modifications and variations therefrom. For instance, it is contemplated that any reasonable combination of the described examples may be performed. It is intended that the appended claims cover such modifications and variations as fall within the true spirit and scope of the invention.
[0318] It is to be recognized that depending on the example, certain acts or events of any of the techniques described herein can be performed in a different sequence, may be added, merged, or left out altogether (e.g., not all described acts or events are necessary for the practice of the techniques). Moreover, in certain examples, acts or events may be performed concurrently, e.g., through multi-threaded processing, interrupt processing, or multiple processors, rather than sequentially.
[0319] In one or more examples, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Computer-readable media may include computer-readable storage media, which corresponds to a tangible medium such as data storage media, or communication media including any medium that facilitates transfer of a computer program from one place to another, e.g., according to a communication protocol. In this manner, computer-readable media generally may correspond to (1) tangible computer-readable storage media which is non-transitory or (2) a communication medium such as a signal or carrier wave. Data storage media may be any available media that can be accessed by one or more computers or one or more processors to retrieve instructions, code and/or data structures for implementation of the techniques described in this disclosure. A computer program product may include a computer-readable medium.
[0320] By way of example, and not limitation, such computer-readable storage media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage, or other magnetic storage devices, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if instructions are transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. It should be understood, however, that computer-readable storage media and data storage media do not include connections, carrier waves, signals, or
other transitory media, but are instead directed to non-transitory, tangible storage media. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
[0321] Operations described in this disclosure may be performed by one or more processors, which may be implemented as fixed-function processing circuits, programmable circuits, or combinations thereof, such as one or more digital signal processors (DSPs), general purpose microprocessors, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Fixed-function circuits refer to circuits that provide particular functionality and are preset on the operations that can be performed. Programmable circuits refer to circuits that can be programmed to perform various tasks and provide flexible functionality in the operations that can be performed. For instance, programmable circuits may execute instructions specified by software or firmware that cause the programmable circuits to operate in the manner defined by the instructions of the software or firmware. Fixed-function circuits may execute software instructions (e.g., to receive parameters or output parameters), but the types of operations that the fixed-function circuits perform are generally immutable. Accordingly, the terms “processor” and “processing circuitry,” as used herein, may refer to any of the foregoing structures or any other structure suitable for implementation of the techniques described herein.
Claims
1. A tracking structure comprising: an attachment body shaped for attachment at a bicipital groove of a humerus of a patient, the attachment body defining a slot having dimensions sufficient for palpation of the bicipital groove using a handheld digitizer; and a tracking marker connected to the attachment body, the tracking marker having a polyhedral shape that includes a plurality of faces that have different optical patterns.
2. The tracking structure of claim 1, wherein a distal end of the slot is open-ended.
3. The tracking structure of claim 2, wherein the attachment body includes a first prong on a first side of the slot and a second prong on a second side of the slot, and each of the first prong and the second prong defines respective apertures sized to accommodate fixation members for attachment of the attachment body to the humerus.
4. The tracking structure of claim 3, wherein: the first prong and the second prong are angled relative to each other; or the first prong and the second prong are parallel and meet at a curved surface.
5. The tracking structure of any of claims 3-4, wherein at least one of the first prong or the second prong includes a textured area.
6. The tracking structure of any of claims 1-5, wherein the attachment body defines two or more apertures sized to accommodate fixation members for attachment of the attachment body to the humerus.
7. The tracking structure of any of claims 1-6, wherein the attachment body and the tracking marker are physically one unit.
8. A computer-implemented method comprising: receiving, by a processing system, first signals from one or more sensors of a tracking system;
determining, by the processing system, based on the first signals, first points corresponding to a first tracking marker of a tracking structure, the tracking structure comprising the first tracking marker and an attachment body positioned at a bicipital groove of a humerus of a patient; receiving, by the processing system, second signals from the one or more sensors of the tracking system; determining, by the processing system, based on the second signals, second points corresponding to a second tracking marker of a digitizer while a tip of the digitizer palpates the bicipital groove via a slot defined in the attachment body of the tracking structure; determining, by the processing system, based on the second points and a predetermined spatial relationship between the second tracking marker and the tip of the digitizer, third points corresponding to the bicipital groove, wherein positions of the first, second and third points are defined in a physical coordinate system; and generating, by the processing system, based on the third points and a virtual model of the humerus having points defined in a virtual coordinate system, registration data defining a relationship between the physical coordinate system and the virtual coordinate system.
9. The computer-implemented method of claim 8, wherein: points of a virtual element are defined in the virtual coordinate system; the method further comprises: determining, by the processing system, based on the registration data, points in the physical coordinate system corresponding to the points of the virtual element; and causing, by the processing system, a mixed reality (MR) device to display the virtual element such that the virtual element appears to a user to be located at the determined points in the physical coordinate system.
10. The computer-implemented method of any of claims 8-9, wherein the virtual element provides guidance for resecting a portion of the humerus.
11. The computer-implemented method of any of claims 8-10, further comprising causing, by the processing system, an MR device to display instructions to palpate the bicipital groove with the digitizer.
12. The computer-implemented method of any of claims 8-11, further comprising causing, by the processing system, an MR device to display instructions to position the tracking structure at the bicipital groove of the humerus and to insert fixation elements through apertures of the attachment body into the humerus to attach the tracking structure to the humerus.
13. The computer-implemented method of any of claims 8-12, wherein: receiving the first signals comprises receiving, by the processing system, first video signals comprising images of optical patterns on two or more faces of the first tracking marker, wherein each of the faces has a different optical pattern, and determining the positions of the first points comprises determining, by the processing system, based on the images of the optical patterns, positions of vertices of the first tracking marker.
14. The computer-implemented method of any of claims 8-13, wherein: the method further comprises: receiving, by the processing system, third signals from the one or more sensors of the tracking system; determining, based on the third signals, positions of fourth points in the physical coordinate system corresponding to the second tracking marker while the tip of the digitizer palpates a humeral head of the humerus; receiving, by the processing system, fourth signals from the one or more sensors of the tracking system; and determining, based on the fourth signals, positions of fifth points in the physical coordinate system corresponding to the second tracking marker while the tip of the digitizer palpates a metaphysis of the humerus, and generating the registration data comprises generating the registration data based on the third points, the fourth points, the fifth points, and the virtual model of the humerus.
15. The computer-implemented method of any of claims 8-14, wherein the tracking system is included in an MR device.
16. The computer-implemented method of any of claims 8-15, further comprising: receiving, by the processing system, third signals from the one or more sensors of the tracking system after generating the registration data; determining, by the processing system, based on the registration data and the third signals, a distance between a location on a surface of the humerus and the tip of the digitizer; causing, by the processing system, a mixed reality (MR) device to display a first virtual element at the location on the surface of the humerus and to display a second virtual element indicating the determined distance of the tip of the digitizer to the location on the surface of the humerus; and refining, by the processing system, the registration data based on the determined distance being greater than a threshold while the tip of the digitizer is positioned at the location on the surface of the humerus.
17. One or more non-transitory computer-readable storage media comprising instructions stored thereon that, when executed by one or more processors of a processing system, cause the processing system to perform the methods of any of claims 8-16.
18. A system comprising means for performing the methods of any of claims 8-16.
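By way of example, and not limitation, the distance check recited in claim 16 can be approximated with a nearest-point query: transform the virtual model into the physical coordinate system using the registration data, find the model point nearest the tracked digitizer tip, and trigger refinement when the distance exceeds a threshold while the tip rests on the marked location. The Python sketch below approximates point-to-surface distance with a vertex k-d tree (SciPy); the names and the refinement policy are illustrative assumptions only.

```python
import numpy as np
from scipy.spatial import cKDTree

def tip_to_model_distance(tip_physical, model_vertices, R, t):
    """Distance from the digitizer tip to the registered humerus model.

    tip_physical:   (3,) tracked tip position in the physical coordinate system
    model_vertices: (V, 3) virtual-model vertices in the virtual coordinate system
    R, t:           registration mapping virtual -> physical (physical = R v + t)
    """
    registered = model_vertices @ R.T + t  # model expressed in physical coordinates
    dist, idx = cKDTree(registered).query(tip_physical)
    return dist, registered[idx]  # distance and the nearest model point

# A system might display `dist` next to the virtual element and, when it
# exceeds a threshold with the tip held on the bone, add the contact point to
# the landmark set and re-run the rigid fit to refine the registration.
```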
Applications Claiming Priority (8)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202363510316P | 2023-06-26 | 2023-06-26 | |
| US202363510325P | 2023-06-26 | 2023-06-26 | |
| US202363510299P | 2023-06-26 | 2023-06-26 | |
| US63/510,325 | 2023-06-26 | | |
| US63/510,316 | 2023-06-26 | | |
| US63/510,299 | 2023-06-26 | | |
| US202463563002P | 2024-03-08 | 2024-03-08 | |
| US63/563,002 | 2024-03-08 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025003910A1 (en) | 2025-01-02 |
Family
ID=91898180
Family Applications (3)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2024/056216 (WO2025003910A1, pending) | Humeral marker for mixed reality surgical navigation | 2023-06-26 | 2024-06-26 |
| PCT/IB2024/056207 (WO2025003902A1, pending) | Mixed reality humeral cut navigation | 2023-06-26 | 2024-06-26 |
| PCT/IB2024/056213 (WO2025003908A1, pending) | Sawblade position identification process for mixed reality surgical navigation | 2023-06-26 | 2024-06-26 |
Family Applications After (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2024/056207 (WO2025003902A1, pending) | Mixed reality humeral cut navigation | 2023-06-26 | 2024-06-26 |
| PCT/IB2024/056213 (WO2025003908A1, pending) | Sawblade position identification process for mixed reality surgical navigation | 2023-06-26 | 2024-06-26 |
Country Status (1)
| Country | Link |
|---|---|
| WO (3) | WO2025003910A1 (en) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106413591B (en) * | 2014-05-27 | 2019-06-04 | Aesculap AG | Medical system |
| WO2022265983A1 (en) * | 2021-06-17 | 2022-12-22 | Howmedica Osteonics Corp. | Clamping tool mounted registration marker for orthopedic surgical procedures |
| US20230113383A1 (en) * | 2020-02-20 | 2023-04-13 | One Ortho | Augmented Reality Guidance System For Guiding Surgical Operations On An Articulating Portion Of A Bone |
| US11638613B2 (en) * | 2019-05-29 | 2023-05-02 | Stephen B. Murphy | Systems and methods for augmented reality based surgical navigation |
Family Cites Families (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE19842798C1 (en) * | 1998-09-18 | 2000-05-04 | Howmedica Leibinger Gmbh & Co | Calibration device |
| US7213598B2 (en) * | 2002-05-28 | 2007-05-08 | Brainlab Ag | Navigation-calibrating rotationally asymmetrical medical instruments or implants |
| EP1690503B1 (en) * | 2005-02-15 | 2013-07-24 | BrainLAB AG | User guidance for adjusting the cutting guides for the bones |
| US20060235290A1 (en) * | 2005-04-04 | 2006-10-19 | Aesculap Ag & Co. Kg | Method and apparatus for positioning a cutting tool for orthopedic surgery using a localization system |
| US20060271056A1 (en) * | 2005-05-10 | 2006-11-30 | Smith & Nephew, Inc. | System and method for modular navigated osteotome |
| EP1919390B1 (en) * | 2005-08-05 | 2012-12-19 | DePuy Orthopädie GmbH | Computer assisted surgery system |
| US8560047B2 (en) * | 2006-06-16 | 2013-10-15 | Board Of Regents Of The University Of Nebraska | Method and apparatus for computer aided surgery |
| US11911117B2 (en) * | 2011-06-27 | 2024-02-27 | Board Of Regents Of The University Of Nebraska | On-board tool tracking system and methods of computer assisted surgery |
| KR102378417B1 (en) * | 2018-06-20 | 2022-03-25 | 테크마 메디컬 엘엘씨 | METHODS AND DEVICES FOR KNEE SURGERY WITH INERTIAL SENSORS |
| US11553969B1 (en) * | 2019-02-14 | 2023-01-17 | Onpoint Medical, Inc. | System for computation of object coordinates accounting for movement of a surgical site for spinal and other procedures |
2024
- 2024-06-26 WO PCT/IB2024/056216 patent/WO2025003910A1/en active Pending
- 2024-06-26 WO PCT/IB2024/056207 patent/WO2025003902A1/en active Pending
- 2024-06-26 WO PCT/IB2024/056213 patent/WO2025003908A1/en active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| WO2025003908A1 (en) | 2025-01-02 |
| WO2025003902A1 (en) | 2025-01-02 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24740571; Country of ref document: EP; Kind code of ref document: A1 |