
US20220020219A1 - Augmented reality bone landmark display - Google Patents

Augmented reality bone landmark display

Info

Publication number
US20220020219A1
Authority
US
United States
Prior art keywords
location
landmark
augmented reality
virtual
bone
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/376,676
Inventor
Ramnada CHAV
Pierre Couture
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Orthosoft ULC
Original Assignee
Orthosoft ULC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Orthosoft ULC
Priority to US17/376,676
Assigned to ORTHOSOFT ULC. Assignors: COUTURE, PIERRE; CHAV, RAMNADA
Publication of US20220020219A1
Legal status: Pending

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25User interfaces for surgical systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30Surgical robots
    • A61B34/32Surgical robots operating autonomously
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B50/00Containers, covers, furniture or holders specially adapted for surgical or diagnostic appliances or instruments, e.g. sterile covers
    • A61B50/30Containers specially adapted for packaging, protecting, dispensing, collecting or disposing of surgical or diagnostic appliances or instruments
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61LMETHODS OR APPARATUS FOR STERILISING MATERIALS OR OBJECTS IN GENERAL; DISINFECTION, STERILISATION OR DEODORISATION OF AIR; CHEMICAL ASPECTS OF BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES; MATERIALS FOR BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES
    • A61L2/00Methods or apparatus for disinfecting or sterilising materials or objects other than foodstuffs or contact lenses; Accessories therefor
    • A61L2/24Apparatus using programmed or automatic operation
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/167Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods
    • A61B2017/00017Electrical control of surgical instruments
    • A61B2017/00203Electrical control of surgical instruments with speech control or speech recognition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods
    • A61B2017/00017Electrical control of surgical instruments
    • A61B2017/00207Electrical control of surgical instruments with hand gesture control or hand gesture recognition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00Surgical instruments, devices or methods
    • A61B2017/00017Electrical control of surgical instruments
    • A61B2017/00216Electrical control of surgical instruments with eye tracking or head position tracking control
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101Computer-aided simulation of surgical operations
    • A61B2034/105Modelling of the patient, e.g. for ligaments or bones
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2055Optical tracking systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2065Tracking using image or pattern recognition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2068Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25User interfaces for surgical systems
    • A61B2034/256User interfaces for surgical systems having a database of accessory information, e.g. including context sensitive help or scientific articles
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/371Surgical systems with images on a monitor during operation with simultaneous use of two cameras
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/372Details of monitor hardware
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/50Supports for surgical instruments, e.g. articulated arms
    • A61B2090/502Headgear, e.g. helmet, spectacles
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/90Identification means for patients or instruments, e.g. tags
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61LMETHODS OR APPARATUS FOR STERILISING MATERIALS OR OBJECTS IN GENERAL; DISINFECTION, STERILISATION OR DEODORISATION OF AIR; CHEMICAL ASPECTS OF BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES; MATERIALS FOR BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES
    • A61L2/00Methods or apparatus for disinfecting or sterilising materials or objects other than foodstuffs or contact lenses; Accessories therefor
    • A61L2/02Methods or apparatus for disinfecting or sterilising materials or objects other than foodstuffs or contact lenses; Accessories therefor using physical phenomena
    • A61L2/04Heat
    • A61L2/06Hot gas
    • A61L2/07Steam
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61LMETHODS OR APPARATUS FOR STERILISING MATERIALS OR OBJECTS IN GENERAL; DISINFECTION, STERILISATION OR DEODORISATION OF AIR; CHEMICAL ASPECTS OF BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES; MATERIALS FOR BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES
    • A61L2/00Methods or apparatus for disinfecting or sterilising materials or objects other than foodstuffs or contact lenses; Accessories therefor
    • A61L2/02Methods or apparatus for disinfecting or sterilising materials or objects other than foodstuffs or contact lenses; Accessories therefor using physical phenomena
    • A61L2/08Radiation
    • A61L2/10Ultraviolet radiation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61LMETHODS OR APPARATUS FOR STERILISING MATERIALS OR OBJECTS IN GENERAL; DISINFECTION, STERILISATION OR DEODORISATION OF AIR; CHEMICAL ASPECTS OF BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES; MATERIALS FOR BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES
    • A61L2/00Methods or apparatus for disinfecting or sterilising materials or objects other than foodstuffs or contact lenses; Accessories therefor
    • A61L2/16Methods or apparatus for disinfecting or sterilising materials or objects other than foodstuffs or contact lenses; Accessories therefor using chemical substances
    • A61L2/18Liquid substances or solutions comprising solids or dissolved gases
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61LMETHODS OR APPARATUS FOR STERILISING MATERIALS OR OBJECTS IN GENERAL; DISINFECTION, STERILISATION OR DEODORISATION OF AIR; CHEMICAL ASPECTS OF BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES; MATERIALS FOR BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES
    • A61L2/00Methods or apparatus for disinfecting or sterilising materials or objects other than foodstuffs or contact lenses; Accessories therefor
    • A61L2/16Methods or apparatus for disinfecting or sterilising materials or objects other than foodstuffs or contact lenses; Accessories therefor using chemical substances
    • A61L2/20Gaseous substances, e.g. vapours
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61LMETHODS OR APPARATUS FOR STERILISING MATERIALS OR OBJECTS IN GENERAL; DISINFECTION, STERILISATION OR DEODORISATION OF AIR; CHEMICAL ASPECTS OF BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES; MATERIALS FOR BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES
    • A61L2202/00Aspects relating to methods or apparatus for disinfecting or sterilising materials or objects
    • A61L2202/10Apparatus features
    • A61L2202/12Apparatus for isolating biocidal substances from the environment
    • A61L2202/122Chambers for sterilisation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61LMETHODS OR APPARATUS FOR STERILISING MATERIALS OR OBJECTS IN GENERAL; DISINFECTION, STERILISATION OR DEODORISATION OF AIR; CHEMICAL ASPECTS OF BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES; MATERIALS FOR BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES
    • A61L2202/00Aspects relating to methods or apparatus for disinfecting or sterilising materials or objects
    • A61L2202/10Apparatus features
    • A61L2202/14Means for controlling sterilisation processes, data processing, presentation and storage means, e.g. sensors, controllers, programs
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61LMETHODS OR APPARATUS FOR STERILISING MATERIALS OR OBJECTS IN GENERAL; DISINFECTION, STERILISATION OR DEODORISATION OF AIR; CHEMICAL ASPECTS OF BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES; MATERIALS FOR BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES
    • A61L2202/00Aspects relating to methods or apparatus for disinfecting or sterilising materials or objects
    • A61L2202/10Apparatus features
    • A61L2202/16Mobile applications, e.g. portable devices, trailers, devices mounted on vehicles
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61LMETHODS OR APPARATUS FOR STERILISING MATERIALS OR OBJECTS IN GENERAL; DISINFECTION, STERILISATION OR DEODORISATION OF AIR; CHEMICAL ASPECTS OF BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES; MATERIALS FOR BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES
    • A61L2202/00Aspects relating to methods or apparatus for disinfecting or sterilising materials or objects
    • A61L2202/10Apparatus features
    • A61L2202/18Aseptic storing means
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61LMETHODS OR APPARATUS FOR STERILISING MATERIALS OR OBJECTS IN GENERAL; DISINFECTION, STERILISATION OR DEODORISATION OF AIR; CHEMICAL ASPECTS OF BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES; MATERIALS FOR BANDAGES, DRESSINGS, ABSORBENT PADS OR SURGICAL ARTICLES
    • A61L2202/00Aspects relating to methods or apparatus for disinfecting or sterilising materials or objects
    • A61L2202/20Targets to be treated
    • A61L2202/24Medical instruments, e.g. endoscopes, catheters, sharps
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/04Indexing scheme for image data processing or generation, in general involving 3D image data
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/24Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]

Definitions

  • Surgical advancements have allowed surgeons to use preoperative planning, display devices within a surgical field, optical imaging, and guides to improve surgical outcomes and customize surgery for a patient. While these advances have allowed for quicker and more successful surgeries, they ultimately rely on physical objects, which have costs and time requirements for manufacturing and configuration. Physical objects and devices may also obstruct portions of a surgical field, detracting from their benefits.
  • Computer-assisted surgery is a growing field that encompasses a wide range of devices, uses, procedures, and computing techniques, such as surgical navigation, pre-operative planning, and various robotic techniques.
  • A robotic system may be used in some surgical procedures, such as orthopedic procedures, to aid a surgeon in completing the procedures more accurately, more quickly, or with less fatigue.
  • FIG. 1 illustrates a surgical field in accordance with some embodiments.
  • FIG. 2 illustrates an AR instrument identification display in accordance with some embodiments.
  • FIG. 3 illustrates a system for displaying virtual representations of a landmark in accordance with some embodiments.
  • FIG. 4 illustrates a flowchart showing a technique for displaying virtual representations of a landmark in accordance with some embodiments.
  • FIG. 5 illustrates a surgical field including a virtual representation of a remote surgical field, for example for use with an augmented reality display in accordance with some embodiments.
  • FIG. 6 illustrates a flowchart showing a technique for displaying a virtual representation of a remote surgical field within a local surgical field in accordance with some embodiments.
  • FIG. 7 illustrates a robot sterilization system in accordance with some embodiments.
  • FIG. 8 illustrates a flowchart showing a technique for storing a sterilized instrument using a robotic system in accordance with some embodiments.
  • FIG. 9 illustrates a system for surgical instrument identification using an augmented reality display in accordance with some embodiments.
  • FIG. 10 illustrates generally an example of a block diagram of a machine upon which any one or more of the techniques discussed herein may perform in accordance with some embodiments.
  • FIGS. 11A-11B illustrate user interface components for landmark planning and plan evaluation in accordance with some embodiments.
  • An augmented reality (AR) device allows a user to view displayed virtual objects that appear to be projected into the real environment, which is also visible.
  • AR devices typically include two display lenses or screens, including one for each eye of a user. Light is permitted to pass through the two display lenses such that aspects of the real environment are visible while also projecting light to make virtual elements visible to the user of the AR device.
  • FIG. 1 illustrates a surgical field 100 in accordance with some embodiments.
  • The surgical field 100 illustrated in FIG. 1 includes a surgeon 102 and a patient 108, and may include a camera 112.
  • The surgeon 102 is wearing an augmented reality (AR) device 104, which may be used to display a virtual object 110 to the surgeon 102.
  • The virtual object 110 may not be visible to others within the surgical field 100 (e.g., surgical assistant 114 or nurse 120), though they may wear AR devices 116 and 122, respectively.
  • A person other than the surgeon 102 may not be able to see the virtual object 110, may be able to see the virtual object 110 in a shared augmented reality with the surgeon 102, may be able to see a modified version of the virtual object 110 (e.g., according to customizations unique to the surgeon 102 or the person), or may see different virtual objects entirely.
  • Augmented reality is explained in more detail below.
  • Augmented reality is a technology for displaying virtual or “augmented” objects or visual effects overlaid on a real environment.
  • The real environment may include a room or specific area (e.g., the surgical field 100), or may be more general to include the world at large.
  • The virtual aspects overlaid on the real environment may be represented as anchored or in a set position relative to one or more aspects of the real environment.
  • The virtual object 110 may be configured to appear to be resting on a table.
  • An AR system may present virtual aspects that are fixed to a real object without regard to a perspective of a viewer or viewers of the AR system (e.g., the surgeon 102 ).
  • The virtual object 110 may exist in a room, visible to a viewer of the AR system within the room and not visible to a viewer of the AR system outside the room.
  • The virtual object 110 in the room may be displayed to the viewer outside the room when the viewer enters the room.
  • The room may act as a real object that the virtual object 110 is fixed to in the AR system.
  • The AR device 104 may include one or more screens, such as a single screen or two screens (e.g., one per eye of a user).
  • The screens may allow light to pass through such that aspects of the real environment are visible while displaying the virtual object 110.
  • The virtual object 110 may be made visible to the surgeon 102 by projecting light.
  • The virtual object 110 may appear to have a degree of transparency or may be opaque (i.e., blocking aspects of the real environment).
  • An AR system may be viewable to one or more viewers, and may include differences among views available for the one or more viewers while retaining some aspects as universal among the views.
  • A heads-up display may change between two views while virtual objects may be fixed to a real object or area in both views. Aspects such as a color of an object, lighting, or other changes may be made among the views without changing a fixed position of at least one virtual object.
  • A user may see the virtual object 110 presented in an AR system as opaque or as including some level of transparency.
  • The user may interact with the virtual object 110, such as by moving the virtual object 110 from a first position to a second position.
  • The user may move an object with his or her hand. This may be done in the AR system virtually by determining that the hand has moved into a position coincident with or adjacent to the object (e.g., using one or more cameras, which may be mounted on an AR device, such as AR device camera 106, or separate, and which may be static or may be controlled to move), and causing the object to move in response, as sketched below.
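  • As an illustration only, a minimal hit-test for this interaction might compare a tracked hand position against the virtual object's anchor. The coordinate frame, the grab radius value, and the function names below are assumptions for the sketch, not part of the disclosed system.

```python
import numpy as np

GRAB_RADIUS_M = 0.05  # assumed threshold for "coincident or adjacent", in meters

def hand_near_object(hand_pos, object_pos, radius=GRAB_RADIUS_M):
    """Return True when the tracked hand is within `radius` of the virtual object."""
    return np.linalg.norm(np.asarray(hand_pos, float) - np.asarray(object_pos, float)) <= radius

def update_virtual_object(object_pos, hand_pos, grabbing):
    """Move the virtual object with the hand while it is 'grabbed', otherwise keep it anchored."""
    if grabbing and hand_near_object(hand_pos, object_pos):
        return np.asarray(hand_pos, dtype=float)
    return np.asarray(object_pos, dtype=float)
```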
  • Virtual aspects may include virtual representations of real world objects or may include visual effects, such as lighting effects, etc.
  • The AR system may include rules to govern the behavior of virtual objects, such as subjecting a virtual object to gravity or friction, or may include other predefined rules that defy real world physical constraints (e.g., floating objects, perpetual motion, etc.).
  • An AR device 104 may include a camera 106 on the AR device 104 (not to be confused with the camera 112 , separate from the AR device 104 ).
  • The AR device camera 106 or the camera 112 may include an infrared camera, an infrared filter, a visible light filter, a plurality of cameras, a depth camera, etc.
  • The AR device 104 may project virtual items over a representation of a real environment, which may be viewed by a user.
  • Eye tracking may be used with an AR system to determine which instrument a surgeon wants next by tracking the surgeon's eye to the instrument.
  • A nurse or surgical assistant may then retrieve the determined instrument.
  • The determined instrument may be presented in AR to the nurse or surgical assistant.
  • The surgeon may speak the instrument (e.g., using a pre-selected code word, using speech processing and word recognition, via saying a number, or the like).
  • The voice command may be combined with eye tracking, in still another example, to find an instrument; a sketch of this combination follows.
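  • As a rough, hypothetical sketch only: gaze direction can be intersected with known instrument positions and reconciled with a recognized spoken keyword. The instrument table, scoring, and function names below are assumptions, not the patent's stated method.

```python
import numpy as np

# Assumed layout of instruments on the table: name -> 3D position (meters)
INSTRUMENTS = {"retractor": np.array([0.2, 0.0, 0.6]),
               "osteotome": np.array([0.4, 0.1, 0.6])}

def gaze_pick(eye_origin, gaze_dir, instruments=INSTRUMENTS):
    """Pick the instrument whose position lies closest to the gaze ray."""
    d = np.asarray(gaze_dir, float) / np.linalg.norm(gaze_dir)
    def ray_dist(p):
        v = np.asarray(p, float) - np.asarray(eye_origin, float)
        return np.linalg.norm(v - np.dot(v, d) * d)
    return min(instruments, key=lambda name: ray_dist(instruments[name]))

def select_instrument(eye_origin, gaze_dir, spoken_word=None):
    """Combine eye tracking with an optional recognized voice keyword."""
    if spoken_word in INSTRUMENTS:      # voice wins when it names a known instrument
        return spoken_word
    return gaze_pick(eye_origin, gaze_dir)
```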
  • The AR device 104 may be used in the surgical field 100 during a surgical procedure, for example performed by the surgeon 102 on the patient 108.
  • The AR device 104 may project or display virtual objects, such as the virtual object 110, during the surgical procedure to augment the surgeon's vision.
  • The surgeon 102 may control the virtual object 110 using the AR device 104, a remote controller for the AR device 104, or by interacting with the virtual object 110 (e.g., using a hand to “interact” with the virtual object 110 or a gesture recognized by the camera 106 of the AR device 104).
  • The virtual object 110 may augment a surgical tool.
  • The virtual object 110 may appear (to the surgeon 102 viewing the virtual object 110 through the AR device 104) as a representation of a landmark previously placed on a patient bone.
  • The virtual object 110 may be used to represent a planned location of a landmark (e.g., using a pre-operative image and a captured image of the bone in the real space).
  • The virtual object 110 may react to movements of other virtual or real-world objects in the surgical field.
  • The virtual object 110 may be altered by a user to move a landmark (e.g., a placed landmark). Further discussion of virtual landmarks is provided below with respect to FIGS. 3-4.
  • The virtual object 110 may be a virtual representation of a remote surgical field (e.g., an entire OR, a camera field of view of a room, a close-up view of a surgical theater, etc.).
  • The virtual object 110 may include a plurality of virtual objects. Further discussion of this example is provided below with respect to FIGS. 5-6.
  • FIG. 2 illustrates an augmented reality (AR) instrument identification display 200 in accordance with some embodiments.
  • Prior to any surgical procedure, the nursing staff unloads trays and prepares and places instrumentation for the procedure on a table. This process may be fastidious and error prone (e.g., a missing instrument, misplacement of an instrument, etc.).
  • A surgeon may have preferences for instrument placement, table location, or the like. For example, the table may be preferred in a particular setup, which may increase consistency and efficiency by removing the risk of the wrong tool being picked up, which may delay a surgery. Errors due to human choice, staff change, turnover, or the like may be responsible for decreases in efficiency.
  • The instrumentation placement process may include a checklist, which is time consuming and also error prone.
  • The present systems and methods may include a technological solution to errors in instrument placement by leveraging artificial intelligence or augmented reality (AR) to ensure correct placement of instruments.
  • The systems and methods described herein may tell staff which instrument to place in what location on a table, for example based on surgeon preference (e.g., using AR).
  • The systems and methods described herein may be used to verify that one or all instruments are correctly placed on the table, such as by using an automatic checklist verification.
  • Complicated instruments may be assembled using the systems and methods described herein.
  • The benefits of using the present systems and methods include faster preparation or setup of a procedure room (e.g., operating room), eliminating instrument misplacement (improving workflow, efficiency, etc.), and helping avoid the need for surgeon oversight in the process.
  • The AR instrument identification display 200 includes a surgical instrument 206, a virtual indicator 208, and may include additional information 210, such as patient or procedure information.
  • The virtual indicator 208 may be used to identify the surgical instrument 206 that corresponds to a procedure being performed.
  • The virtual indicator 208 may include moving lights, flashing lights, color or color-changing lights, or other virtual effects.
  • The additional information 210 may, for example, name or provide other information about the surgical instrument 206.
  • The virtual indicator 208 may be added to the AR display 200 in response to a surgeon selection identifying a need for the surgical instrument 206.
  • The virtual indicator 208 may be removed from the AR display 200.
  • A virtual indicator 212 may be used to identify an item, such as a correctly or incorrectly placed instrument, a verified instrument, or an unknown instrument.
  • A user of the AR device used to present the AR display 200 may interact with the virtual indicator 208, for example by placing a finger, hand, or item adjacent to or appearing to occupy the same space as the virtual indicator 208.
  • The virtual indicator 208 may perform an action, such as displaying information about the item represented by the virtual indicator 208 (e.g., a name of the item, whether the item is a one-time use item or can be re-sterilized, whether the item is fragile, whether the item is a patient-specific or personalized item, what procedure the item is to be used for, or the like).
  • A schedule for procedures during a day in an operating room may be obtained or retrieved by a device.
  • The device may provide AR capabilities to a user, including instructions for setting up a next procedure in the schedule.
  • The users, with the aid of the AR, may place the instruments in a correct position or orientation on a table in the operating room.
  • A verification process may be performed, and an output (e.g., correctly placed or incorrectly placed, such as with additional instructions for correct placement) may be provided to the user (e.g., via the AR).
  • A picture may be taken and a full verification process may be performed to validate the operating room for the given procedure.
  • The full verification process may include a second check of each instrument, a check of the instruments against needed instruments for the given procedure, timing verification based on the schedule, or the like; a sketch of such a check appears below.
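  • As a purely illustrative sketch (assuming the set of detected instrument names comes from an upstream image-recognition step, which the sketch does not implement), the check of placed instruments against the instruments needed for a procedure might look like the following.

```python
from dataclasses import dataclass

@dataclass
class VerificationResult:
    missing: set      # required but not detected on the table
    unexpected: set   # detected but not required for this procedure
    ok: bool

def verify_tray(detected: set, required: set) -> VerificationResult:
    """Compare instruments detected on the table with those required by the procedure."""
    missing = required - detected
    unexpected = detected - required
    return VerificationResult(missing, unexpected, ok=not missing)

# Hypothetical usage
result = verify_tray({"broach", "mallet"}, {"broach", "mallet", "reamer"})
print(result.missing)  # {'reamer'}
```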
  • Data may be collected about a surgical procedure, such as a time-series of data based on progression through the procedure, what steps occur at what times (e.g., when started or completed), locations of team members (e.g., surgeon, nurse, etc.) throughout the procedure, camera stills or video of the procedure at various moments, instrument tracking or use, or the like.
  • FIG. 3 illustrates a system for displaying virtual representations of a landmark in accordance with some embodiments.
  • A landmark may be obtained, such as on a bone of a patient.
  • An AR device may show a virtual representation of the landmark that was acquired in a display view 300.
  • The virtual representation may be displayed on a bone (e.g., a femur 306 or a tibia 308) of the patient (e.g., overlaid on the real bone).
  • The AR device may request confirmation (e.g., via a display) to confirm the landmark's location.
  • A voice command may be used to control the landmark confirmation or display with the AR device.
  • The virtual representations may include representations of surgeon generated (e.g., selected or registered) landmarks (e.g., 302 A, 302 B, and 302 C) or planned landmarks (e.g., 304 A, 304 B, and 304 C).
  • The AR display view 300 allows the femur 306 and the tibia 308 to be visible while also presenting virtual representations of landmarks.
  • Different bones (e.g., hip, shoulder, spine, etc.) may be used in other examples.
  • A virtual representation of a bone may be displayed with the virtual representations of landmarks (e.g., entirely virtual).
  • The surgeon generated landmarks may include a landmark 302 A, which is displayed on the femur 306 separated by some distance from a corresponding planned landmark 304 A.
  • The planned landmark 304 A may be generated based on pre-operative planning, for example using a 3D model, an image of the patient, or the like.
  • The planned landmark 304 A may be registered in the real space. For example, a known image or model coordinate system may be converted to a coordinate system in the real space using image processing.
  • The image processing may compare captured images of a bone (e.g., in real-time), the patient, a reference object, or the like in real space to previously captured images or a previously generated model.
  • A location of the planned landmark 304 A may be registered on the real femur 306. From this registration, further processing may be used to determine how to present a virtual representation of the planned landmark 304 A in the real space via an AR display device (e.g., overlaid virtually in the real space within the display view 300); a coordinate-transform sketch follows.
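  • As a minimal sketch of the coordinate conversion only (the rotation, translation, and point values are placeholders, and the 4×4 homogeneous-transform representation is an assumption rather than the patent's stated method), a planned landmark defined in the model coordinate system can be mapped into the real-space coordinate system like this.

```python
import numpy as np

def make_transform(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def model_to_real(point_model: np.ndarray, T_real_from_model: np.ndarray) -> np.ndarray:
    """Map a planned landmark from model coordinates into real-space coordinates."""
    p = np.append(point_model, 1.0)          # homogeneous coordinates
    return (T_real_from_model @ p)[:3]

# Placeholder registration: identity rotation, 10 cm offset along x
T = make_transform(np.eye(3), np.array([0.10, 0.0, 0.0]))
planned_landmark_model = np.array([0.02, 0.05, 0.00])
print(model_to_real(planned_landmark_model, T))  # position to overlay in the AR view
```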
  • The surgeon generated landmark 302 A may be registered based on an input device (e.g., a pointer that may be used to identify landmarks) or may be identified directly via the AR device (e.g., with visual processing of an indicated landmark).
  • The registration to the real space for display in augmented reality may be accomplished similarly to the planned landmarks.
  • The location relative to the real space is known from the registration process.
  • The display view 300 may display only virtual representations of surgeon generated landmarks in one example, only virtual representations of planned landmarks in another example, or both in a third example.
  • The AR device may query the surgeon to confirm the placements (e.g., audibly, visually, etc.).
  • The surgeon may select virtually represented planned landmarks in the real space as surgeon generated landmarks.
  • The planned landmark 304 A may be selected to be converted to a surgeon generated landmark, in an example.
  • The surgeon may be presented with an option, such as confirming the surgeon generated landmark 302 A (e.g., overriding a warning that the surgeon generated landmark 302 A is some distance from the planned landmark 304 A), changing the landmark location from the surgeon generated landmark 302 A to the planned landmark 304 A, re-doing the surgeon generated landmark 302 A based on the identified distance, moving the surgeon generated landmark 302 A in the direction of the planned landmark 304 A (e.g., along a line or plane, or via freehand movement, such as a gesture visible within the display view 300), or the like.
  • Landmarks such as 302 C and 304 C that are overlapping, at the same place, substantially co-located, or adjacent may be confirmed with a single entry on a virtual user interface, via a gesture, audibly, etc., or may be skipped (e.g., not asked to confirm) and assumed to be correct.
  • A threshold distance for different treatment may be used, and the threshold distance may be personalized, in an example.
  • The landmarks 302 B and 304 B may be separated by more than the threshold distance in some examples, but by less than the threshold distance in some other examples. In some examples, only landmarks for which the distance between the surgeon generated and planned locations exceeds the threshold may trigger a warning or require confirmation input from the surgeon; a sketch of this check follows.
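  • The following is an illustrative sketch only; the threshold value and function names are assumptions, and a real system would evaluate the distance in the registered real-space coordinates.

```python
import numpy as np

DEFAULT_THRESHOLD_MM = 3.0  # assumed value, possibly personalized per surgeon

def needs_confirmation(surgeon_landmark_mm, planned_landmark_mm,
                       threshold_mm=DEFAULT_THRESHOLD_MM) -> bool:
    """Flag a landmark for surgeon confirmation when it lies farther from the
    planned location than the (possibly personalized) threshold distance."""
    distance = np.linalg.norm(np.asarray(surgeon_landmark_mm, float)
                              - np.asarray(planned_landmark_mm, float))
    return distance > threshold_mm

# Hypothetical usage: a 5 mm gap triggers a warning, a 0.2 mm gap does not
print(needs_confirmation([10.0, 0.0, 0.0], [15.0, 0.0, 0.0]))  # True
print(needs_confirmation([10.0, 0.0, 0.0], [10.2, 0.0, 0.0]))  # False
```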
  • The surgeon generated landmarks may be obtained using a robotic arm, which may include an automated process, a force-assist process, a force-resist process, or the like. Even though these landmarks are referred to herein as surgeon generated, they may be obtained autonomously by the robotic arm.
  • The registration may leverage the coordinate system of the robotic arm to translate the landmarks to the display view 300 of the AR device (e.g., rather than or in addition to using image processing or some other technique).
  • A virtual navigation menu may be presented within the display view 300.
  • The virtual navigation menu may be used to operate aspects of the robotic arm, toggle display of landmarks, proceed to a next step in a surgical procedure, or the like.
  • The navigation menu may be moved or resized within the display view 300, in an example. Movement may occur in response to a gesture, audible instruction, or the like.
  • The virtual navigation menu may automatically and virtually follow the robotic arm moving in real space, such as within the display view 300.
  • FIG. 4 illustrates a flowchart showing a technique 400 for displaying virtual representations of a landmark in accordance with some embodiments.
  • The technique 400 may be performed by a processor, for example by executing instructions stored in memory.
  • The technique 400 includes an operation 402 to receive an indication of a location of a landmark on a bone of a patient.
  • The indication may be stored in a database or received directly from a landmark generation device (e.g., a pointer).
  • The technique 400 may include registering the bone using a 3D model before receiving the indication of the landmark.
  • A preliminary registration to the 3D model may be performed using a surface mapping technique via a camera on the AR device or robot arm, subject to confirmation by the landmark registration process.
  • The preliminary registration may be inaccurate, but helpful for use during landmark registration by orienting the AR device to display virtual landmarks.
  • A position or orientation of the bone may be determined using bone tracking, such as via a passive robotic arm.
  • The technique 400 includes an operation 404 to retrieve a planned location of the landmark on the bone of the patient.
  • The planned location may be retrieved based on a pre-operative image of the bone of the patient.
  • The pre-operative image may be registered to a current patient space, in an example.
  • The technique 400 includes an operation 406 to present, using an augmented reality display, a virtual indication of the landmark at the location or the planned location, or both.
  • The virtual indication may be presented within a surgical field while permitting the surgical field to be viewed through the augmented reality display.
  • The technique 400 includes an operation 408 to receive an input related to the landmark.
  • The input may include a response to a request for confirmation of the location of the landmark.
  • Operation 408 may include moving the location, confirming the location, indicating that the location is to be re-selected, validating the location, temporarily accepting or denying the location, an indication to remove the virtual indication (which may then be removed), or the like.
  • The technique 400 may include using earlier landmarks to iteratively update an expectation of where subsequent planned landmarks are to be identified in a 3D coordinate system of the patient. For example, once a few points are registered on the 3D model, the technique 400 may include determining that remaining landmarks will accurately match the corresponding points on the 3D model. Iterative updates to the orientation or position of the 3D model (e.g., in an AR view) may be performed based on received landmarks to improve accuracy of the 3D model; a rigid-fit sketch follows.
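  • The sketch below shows one conventional way such an update could be computed, a least-squares rigid fit (Kabsch algorithm) between the landmarks acquired so far and their model counterparts; the patent does not specify this algorithm, so treat it as an assumption.

```python
import numpy as np

def rigid_fit(model_pts: np.ndarray, real_pts: np.ndarray):
    """Least-squares rotation R and translation t mapping model points onto real points
    (Kabsch algorithm). Both inputs are N x 3 arrays of corresponding landmarks."""
    cm, cr = model_pts.mean(axis=0), real_pts.mean(axis=0)
    H = (model_pts - cm).T @ (real_pts - cr)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cr - R @ cm
    return R, t

def predict_next_landmark(model_point, R, t):
    """Expected real-space location of a not-yet-acquired planned landmark."""
    return R @ np.asarray(model_point, float) + t
```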
  • The technique 400 may include displaying a virtual navigation menu in the augmented reality display.
  • A user may virtually interact with the virtual navigation menu as if it were displayed on a screen.
  • An indication may be received to move the virtual navigation menu presented in the augmented reality display, for example to make the location more convenient.
  • The technique 400 may include displaying a live video, using the augmented reality display, of the bone using a camera affixed to an end effector of a robotic arm.
  • FIG. 5 illustrates a surgical field including a virtual representation of a remote surgical field, for example for use with an augmented reality display in accordance with some embodiments.
  • The surgical field may be viewable within a display view 500 of an AR device.
  • The AR device may show a virtual representation of the remote surgical field.
  • A voice command or gesture may be used to control whether the remote surgical field is viewable or not.
  • The display view 500 may be configured to display aspects of the remote surgical field, such as a remote patient 508 or a remote robotic arm 506, displayed in full or zoomed in, such as according to surgeon preference or control.
  • The display view 500 may include a close-up view of a leg or bone of the remote patient 508, for example during a surgical procedure.
  • The display view 500 presents a virtual representation of an aspect of the remote surgical field while permitting a local real surgical field to be displayed.
  • The real surgical field may include a patient 504 or a robotic arm 502, in some examples.
  • The virtual representation may be displayed adjacent to the patient 504, the robotic arm 502, or elsewhere within the local real surgical field. Adjacent in this context may include separated by an absolute distance within the surgical field, separated by a perceived distance (e.g., appearing in the display view 500 to be separated by a foot, a few feet, etc.), anchored in a location (e.g., virtually displayed at a real location within the local surgical field), or moved according to surgeon preference.
  • The virtual representation may move when zoomed in or out.
  • When only a leg of the remote patient 508 is virtually visible, the leg may be placed closer to the real leg of the patient 504, but when the patient 508 is viewed in full, this distance may be increased.
  • The virtual representation of the remote surgical field may be based on images (e.g., video) captured by a camera affixed to the remote robotic arm 506.
  • The camera on the remote robotic arm 506 may identify a feature, and another camera or an AR device in the remote surgical field may be used to see different points of view (e.g., camera views).
  • In an example, the remote patient 508 is a live surgical patient and the local patient 504 is a live surgical patient.
  • The remote patient 508 may be remotely operated on using the robotic arm 506 by a surgeon in the real space of the display view 500.
  • The surgeon may simultaneously operate on both the remote patient 508 and the local patient 504.
  • Simultaneously in this example may mean the surgeon switches between the patients at various operations of the surgery, such as at each step or after particular sequences of steps, or one surgery may be completed before the next is started, but both patients are available, viewable, or ready for surgery contemporaneously.
  • The surgeon may complete surgeries more quickly because multiple staff, operating rooms, and surgical equipment may be used in parallel rather than requiring serial surgeries.
  • The remote patient 508 may be operated on by a remote surgeon (e.g., with or without the use of the robotic arm 506), and the surgeon in the local space of the display view 500 may be called in to consult or provide assistance (e.g., with a portion of a procedure, such as operation of the remote robotic arm 506, for example when the remote surgeon is less experienced using a robotic arm).
  • The remote patient 508 is viewable for the consultation (e.g., in real-time) such that the surgeon in the local space may give direction or advice without needing to physically leave the local surgical field.
  • This version of the example may be particularly useful when the remote surgeon is a student, a newer surgeon, or the surgery is occurring in a remote building, city, country, etc.
  • In another example, the remote patient 508 is a live surgical patient and the local patient 504 is a cadaver.
  • A surgeon in the local space of the display view 500 may view a remote surgery, which may be occurring in real-time or may have already occurred and is viewed on replay.
  • This example allows for a student or newer surgeon to complete a procedure (e.g., a new type or particularly difficult type) on a cadaver while being able to view a similar or the same procedure virtually.
  • The virtual representation may be viewed at different angles, zoomed, or the like. When the virtual representation is a replay, the surgery may be reversed, sped up, paused, etc.
  • A remote surgeon may request advice or support from the local surgeon, who may attempt a portion of the surgery on the cadaver before the portion is attempted on the live remote patient 508. This allows the portion of the procedure to be tested without damage to the live remote patient 508.
  • In another example, the remote patient 508 is a cadaver and the local patient 504 is a live surgical patient.
  • A surgeon in the local space of the display view 500 may attempt a portion of a procedure on the remote cadaver before attempting the portion on the live local patient 504.
  • The local surgeon may control the remote robotic arm 506 while performing the portion on the cadaver.
  • The remote robotic arm 506 may save actions undertaken during the operation, which may be sent to the local robotic arm 502 and optionally edited. The saved actions may be repeated by the local robotic arm 502, for example to perform an autonomous portion of the procedure that has been tested on the cadaver.
  • Differences between the cadaver and the local live patient 504 may be used to alter the saved actions, for example by scaling, moving target points, or the like. Differences in the robotic arms may be accounted for based on a calibration step performed before starting the surgical procedure. In an example, a procedure may be tested on a cadaver using the remote robotic arm 506 , then successful actions may be transferred to the local robotic arm 502 for autonomous action or force-resist type movement by the local robotic arm 502 when performing the procedure on the local patient 504 .
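  • Purely as an illustration of altering saved actions (the uniform scale factor, offset, and data layout below are assumptions; a clinical system would rely on a validated registration between the two anatomies), transferred target points might be adapted like this.

```python
import numpy as np

def adapt_saved_actions(saved_targets, scale=1.0, offset=(0.0, 0.0, 0.0)):
    """Adapt target points recorded on the cadaver to the local patient's anatomy
    by scaling about the centroid and applying a translation offset."""
    pts = np.asarray(saved_targets, dtype=float)          # N x 3 recorded targets
    centroid = pts.mean(axis=0)
    return (pts - centroid) * scale + centroid + np.asarray(offset, float)

# Hypothetical usage: local bone ~5% larger, shifted 2 mm along x
local_targets = adapt_saved_actions([[0, 0, 0], [10, 0, 0]], scale=1.05, offset=(2, 0, 0))
```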
  • In another example, the remote patient 508 is a cadaver and the local patient 504 is a cadaver.
  • A surgeon may practice a procedure on two different cadavers contemporaneously to identify differences in results from changes to the procedure.
  • The surgeon may perform a procedure for a student or newer surgeon while the student or newer surgeon operates remotely on the cadaver.
  • The local surgeon may view and optionally critique the remote surgery.
  • The remote surgical field may have a similar setup, allowing the student or newer surgeon to view the teaching surgeon's operation in an augmented or virtual reality view.
  • More than one remote surgical field may be presented.
  • A teaching surgeon may view multiple remote student surgeries.
  • Two or more remote surgical fields may be scaled to fit in the display view 500.
  • A remote surgical field may be placed adjacent to another remote surgical field, in an example.
  • A local surgeon may provide assistance when requested for a remote procedure, such as in a collaborative mode with the remote surgical arm 506.
  • The collaborative mode may allow the local surgeon to move the remote surgical arm 506, while allowing the remote surgeon to stop the remote surgical arm 506.
  • The local surgeon may stop or take over control of the remote surgical arm 506 while monitoring the remote surgeon operating with the remote surgical arm 506.
  • The local surgeon may control the local robotic arm 502, which in turn may send information to control the remote robotic arm 506, or the local robotic arm 502 may move in response to information received from the remote robotic arm 506.
  • The robotic arms may move in concert, such that either the remote or local surgeon may control the procedure.
  • One of the surgeons may act to resist erroneous movements while the other of the surgeons performs the procedure, each using their respective robotic arm.
  • The remote surgical field may represent a surgical field in the same building as the local surgical field.
  • FIG. 6 illustrates a flowchart showing a technique 600 for displaying a virtual representation of a remote surgical field within a local surgical field in accordance with some embodiments.
  • The technique 600 may be performed by a processor, for example by executing instructions stored in memory.
  • The technique 600 includes an operation 602 to receive a video stream of a remote surgical subject.
  • The technique 600 includes an operation 604 to present, using an augmented reality display, within a surgical field, a virtual surgical field representing the remote surgical subject.
  • Operation 604 may include presenting the virtual surgical field while permitting a patient within the surgical field to be viewed through the augmented reality device.
  • The virtual surgical field may be presented adjacent to the patient, in an example. Adjacent may mean separated by a fixed distance in absolute space within the surgical field, for example a foot, a few feet, etc. In another example, adjacent may mean separated by a relative distance as perceived through the augmented reality device (e.g., appearing to be separated by a foot, a few feet, etc.). Adjacent may also mean touching or almost touching.
  • The remote surgical subject may include a patient in another operating room within a building also housing the surgical field, a cadaver, or the like.
  • The technique 600 may further include an operation to receive a voice instruction and send the voice instruction to a remote speaker (e.g., within a remote surgical field corresponding to and represented by the virtual surgical field).
  • The technique 600 may include receiving a request to present the virtual surgical field before presenting the virtual surgical field (e.g., from a colleague, student, etc.).
  • The virtual surgical field may be used for testing aspects of a technique (e.g., with a cadaver), for helping or consulting on a case, or to perform an entire procedure, in various examples.
  • A second virtual surgical field may be presented (e.g., adjacent to the patient, such as on an opposite side, or adjacent to the first virtual surgical field) for interaction with or observation of a second remote surgical subject.
  • The virtual surgical field may be displayed including a virtual representation of a remote surgical robot.
  • The remote surgical robot may be controlled by a command issued within the surgical field, for example via a voice command, a gesture, a user input on a device or touchscreen, a virtual indication, a written, typed, or haptic command, or the like.
  • The remote surgical robot may be guided via a gesture.
  • The virtual surgical field may be displayed based on output of a camera affixed to an end effector of the remote surgical robot.
  • FIG. 7 illustrates a robot sterilization system 700 in accordance with some embodiments.
  • the robot sterilization system 700 includes a robotic arm 704 , and a sterilization unit 706 , which may be embedded in a base 702 of the robotic arm 704 or may be separate from the robotic arm 704 .
  • the sterilization unit 706 may be mounted under the robotic arm 704 or affixed to a portion of the robotic arm 704 (e.g., the base 702 ).
  • the sterilization unit 706 may include an opening 708 that may be used to output an instrument (e.g., instrument 712 ).
  • an instrument may be output from the opening 708 , for example using a mechanism within the sterilization unit 706 .
  • the sterilization unit 706 may include a tray 710 , which may be output from the opening 708 , the tray 710 used to convey the instrument 712 .
  • a door of the sterilization unit 706 may open to allow a user to remove an instrument.
  • the robotic arm 704 may be used to retrieve an instrument from within the sterilization unit 706 .
  • the robotic arm 704 may retrieve an instrument from within the sterilization unit 706 based on known locations of instruments within the sterilization unit 706 .
  • a door may be used to reload the sterilization unit 706 in an example.
  • the sterilization unit 706 may include a sterile environment without the capability of sterilizing instruments.
  • the sterilization unit 706 is a passive sterile storage unit.
  • the sterilization unit 706 may be used to sterilize an instrument.
  • the sterilization unit 706 may use sterilization equipment to sterilize the instrument, such as by using ultraviolet light, steam, gas, an autoclave, alcohol, heat, pressure, glass beads, or the like.
  • the sterilization unit 706 may be controlled by a user interface or control mechanism, such as one incorporated in the base 702 or one also used to control the robotic arm 704 (e.g., an augmented reality user interface, a display screen, a microphone and algorithm for interpreting audible commands, the robotic arm 704 itself, or the like). Controls may include initiating sterilization of an instrument (or all instruments within the sterilization unit 706 ) or outputting an instrument (e.g., opening a door, outputting a specific selected instrument, outputting a next instrument in a procedure, or outputting a machine learning model identified instrument at a particular step in a procedure).
  • the instrument 712 may be output automatically, for example based on surgeon preferences, a machine learned model, or the like. For example, image processing may be used to determine a step of a procedure that is completed or almost completed, and an instrument for a next step may be output. In another example, movement of the robotic arm 704 may be used to determine that an instrument is needed and output that instrument. In this example, the movement may be a stored movement or a movement unique to a portion of a surgical procedure that identifies a next step.
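  • As a hedged illustration of the step-driven output described above (an assumption, not the disclosed implementation), a stored workflow could map each detected completed step to the instrument for the next step; the step names and instrument names below are hypothetical:

        # Hypothetical sketch: once a completed step is detected (e.g., via image
        # processing or arm movement), select the instrument for the next step.
        WORKFLOW = ["incision", "femoral_cut", "tibial_cut", "trialing"]   # assumed steps
        INSTRUMENT_FOR_STEP = {                                            # assumed mapping
            "femoral_cut": "femoral cutting guide",
            "tibial_cut": "tibial cutting guide",
            "trialing": "trial implant",
        }

        def next_instrument(completed_step):
            """Return the instrument needed for the step after completed_step, if any."""
            idx = WORKFLOW.index(completed_step)
            if idx + 1 < len(WORKFLOW):
                return INSTRUMENT_FOR_STEP.get(WORKFLOW[idx + 1])
            return None

        print(next_instrument("incision"))  # -> "femoral cutting guide"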
  • FIG. 8 illustrates a flowchart showing a technique 800 for storing a sterilized instrument using a robotic system in accordance with some embodiments.
  • the technique 800 may be implemented using a surgical robotic system, such as with a processor.
  • the technique 800 includes an operation 802 to provide a sterile environment.
  • the sterile environment may be housed by a sterilization unit to store an instrument.
  • the sterilization unit may be mounted under or form a base of a surgical robotic arm of the surgical robotic system.
  • the sterilization unit may be a portable sterilization unit.
  • the sterilization unit may be a sterile storage unit without sterilization capabilities itself.
  • the sterilization unit may be configured to actively sterilize the instrument, for example using ultraviolet light, steam, gas, an autoclave, alcohol, heat, pressure, glass beads, or the like.
  • the sterilization unit may store a plurality of instruments including the instrument.
  • the technique 800 includes an operation 804 to determine whether the instrument is needed. In response to a determination that the instrument is not needed, the technique 800 may return to operation 802 or 804.
  • Operation 804 may include using machine learning techniques to determine that the instrument is needed. For example, a trained model (which may include a binary classification, a regression model, a convolutional neural network, etc.) may be used to determine that the instrument is needed based on a surgical step having been reached, elapsed time, previously stored surgeon preferences, a probability, a selected workflow, or the like.
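  • A minimal sketch, assuming a simple scoring rule rather than any particular trained network, of how workflow progress, elapsed time, and a stored surgeon preference might be combined into a binary "instrument needed" decision (all weights and the threshold are illustrative assumptions):

        # Hypothetical sketch of operation 804's decision; not the disclosed model.
        def instrument_needed(step_reached, minutes_elapsed, surgeon_prefers_early):
            score = 0.0
            score += 0.6 if step_reached else 0.0             # surgical step detected
            score += min(minutes_elapsed / 30.0, 1.0) * 0.3   # time-based contribution
            score += 0.1 if surgeon_prefers_early else 0.0    # stored preference
            return score >= 0.5                               # illustrative threshold

        print(instrument_needed(step_reached=True, minutes_elapsed=10, surgeon_prefers_early=False))  # True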
  • a command such as a spoken command, a gesture, an interaction with a physical or virtual user interface, or other techniques may be used to determine that the instrument is needed (e.g., a request for the instrument).
  • the technique 800 includes an operation 806 to, in response to a determination that the instrument is needed, provide access to the instrument from the sterile environment.
  • Operation 806 may include displaying an indication of the instrument using an augmented reality display device.
  • Operation 806 may include causing an enclosure of the sterilization unit to open, exposing the sterile environment including the instrument.
  • Operation 806 may include causing the surgical robotic arm to retrieve the instrument.
  • Operation 806 may include causing the instrument to be output from the sterilization unit via a mechanical conveyance.
  • Operation 806 may include providing a set of sterile instruments, including the instrument, for a procedure.
  • FIG. 9 illustrates a system 900 for surgical instrument identification using an augmented reality display in accordance with some embodiments.
  • the system 900 may be used to perform any of the techniques 400 or 600 described in relation to FIG. 4 or 6 , for example, by using a processor 902 .
  • the system 900 includes an augmented reality device 901 that may be in communication with a database 916 .
  • the augmented reality device 901 includes a processor 902 , memory 904 , an AR display 908 , and a camera 906 .
  • the augmented reality device 901 may include a sensor 910 , a speaker 912 , or a haptic controller 914 .
  • the database 916 may include image storage 918 or preoperative plan storage 920 .
  • the augmented reality device 901 may be a HoloLens manufactured by Microsoft of Redmond, Wash.
  • the processor 902 of the augmented reality device 901 includes an augmented reality modeler 903 .
  • the augmented reality modeler 903 may be used by the processor 902 to create the augmented reality environment.
  • the augmented reality modeler 903 may receive dimensions of a room, such as from the camera 906 or sensor 910 , and create the augmented reality environment to fit within the physical structure of the room.
  • physical objects may be present in the room and the augmented reality modeler 903 may use the physical objects to present virtual objects in the augmented reality environment.
  • the augmented reality modeler 903 may use or detect a table present in the room and present a virtual object as resting on the table.
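  • As an illustrative sketch only (the detection and placement details are assumptions), resting a virtual object on a detected table can be reduced to snapping the object's base to the table's top surface while keeping its center over the table extent:

        # Hypothetical sketch: snap a virtual object onto the top of a detected
        # table, modeled here as an axis-aligned box; names are illustrative.
        def rest_on_table(obj_half_height, table_min, table_max, desired_xy):
            """Return an (x, y, z) pose whose base sits on the table top."""
            x = min(max(desired_xy[0], table_min[0]), table_max[0])
            y = min(max(desired_xy[1], table_min[1]), table_max[1])
            z = table_max[2] + obj_half_height   # base of the object rests on the top
            return (x, y, z)

        # Table detected spanning (0, 0, 0)-(1.8, 0.9, 0.9); object requested off the edge.
        print(rest_on_table(0.1, (0.0, 0.0, 0.0), (1.8, 0.9, 0.9), (2.0, 0.4)))  # (1.8, 0.4, 1.0)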
  • the AR display 908 may display the AR environment overlaid on a real environment.
  • the display 908 may show a virtual object, using the AR device 901 , such as in a fixed position in the AR environment.
  • the augmented reality modeler 903 may receive a video stream of a remote surgical field for virtually displaying within the room.
  • a dimension of a virtual object (e.g., a remote surgical field) may be adjusted, for example scaled or repositioned, for presentation within the room.
  • the augmented reality device 901 may provide a zoom function to allow a user to zoom in on a portion of a virtual object (e.g., within a virtual surgical field).
  • the augmented reality device 901 may include a sensor 910 , such as an infrared sensor.
  • the camera 906 or the sensor 910 may be used to detect movement, such as a gesture by a surgeon or other user, that may be interpreted by the processor 902 as attempted or intended interaction by the user with the virtual target.
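  • A minimal sketch, under the assumption that the gesture is reduced to a tracked hand position, of how proximity to a virtual target could be interpreted as an attempted interaction (the 3 cm radius is an illustrative value):

        # Hypothetical proximity test; not the disclosed gesture-recognition method.
        def is_interaction(hand_pos, target_pos, radius_m=0.03):
            """Treat the hand as interacting when it is within radius_m of the target."""
            dist = sum((h - t) ** 2 for h, t in zip(hand_pos, target_pos)) ** 0.5
            return dist <= radius_m

        print(is_interaction((0.10, 0.20, 0.50), (0.11, 0.20, 0.51)))  # True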
  • the processor 902 may identify an object in a real environment, such as through processing information received using the camera 906 .
  • the AR display 908 may present, such as within a surgical field while permitting the surgical field to be viewed through the augmented reality display, a virtual feature corresponding to a physical feature hidden by an anatomical aspect of a patient.
  • the virtual feature may have a virtual position or orientation corresponding to a first physical position or orientation of the physical feature.
  • the virtual position or orientation of the virtual feature may include an offset from the first physical position or orientation of the physical feature.
  • the offset may include a predetermined distance from the augmented reality display, a relative distance from the augmented reality display to the anatomical aspect, or the like.
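  • As an illustrative sketch of the offset described above (the direction convention and distances are assumptions), the virtual feature's position can be computed by shifting the physical feature's position a small distance along the line toward the augmented reality display:

        # Hypothetical sketch: draw the virtual feature offset_m toward the viewer.
        def offset_virtual_position(physical_pos, display_pos, offset_m):
            direction = tuple(d - p for d, p in zip(display_pos, physical_pos))
            length = sum(c * c for c in direction) ** 0.5 or 1.0   # avoid divide-by-zero
            unit = tuple(c / length for c in direction)
            return tuple(p + offset_m * u for p, u in zip(physical_pos, unit))

        print(offset_virtual_position((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 0.05))  # (0.0, 0.0, 0.05)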
  • FIG. 10 illustrates generally an example of a block diagram of a machine 1000 upon which any one or more of the techniques (e.g., methodologies) discussed herein may perform in accordance with some embodiments.
  • the machine 1000 may operate as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine 1000 may operate in the capacity of a server machine, a client machine, or both in server-client network environments.
  • the machine 1000 may be a personal computer (PC), a tablet PC, a personal digital assistant (PDA), a mobile telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • Further, the term "machine" shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein, such as cloud computing, software as a service (SaaS), or other computer cluster configurations.
  • Examples, as described herein, may include, or may operate on, logic or a number of components, modules, or like mechanisms.
  • Such mechanisms are tangible entities (e.g., hardware) capable of performing specified operations when operating.
  • the hardware may be specifically configured to carry out a specific operation (e.g., hardwired).
  • the hardware may include configurable execution units (e.g., transistors, circuits, etc.) and a computer readable medium containing instructions, where the instructions configure the execution units to carry out a specific operation when in operation.
  • the configuring may occur under the direction of the execution units or a loading mechanism.
  • the execution units are communicatively coupled to the computer readable medium when the device is operating.
  • the execution units may be configured by a first set of instructions to implement a first set of features at one point in time and reconfigured by a second set of instructions to implement a second set of features.
  • Machine 1000 may include a hardware processor 1002 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a hardware processor core, or any combination thereof), a main memory 1004 and a static memory 1006 , some or all of which may communicate with each other via an interlink (e.g., bus) 1008 .
  • the machine 1000 may further include a display unit 1010 , an alphanumeric input device 1012 (e.g., a keyboard), and a user interface (UI) navigation device 1014 (e.g., a mouse).
  • the display unit 1010 , alphanumeric input device 1012 and UI navigation device 1014 may be a touch screen display.
  • the display unit 1010 may include goggles, glasses, or other AR or VR display components.
  • the display unit may be worn on a head of a user and may provide a heads-up-display to the user.
  • the alphanumeric input device 1012 may include a virtual keyboard (e.g., a keyboard displayed virtually in a VR or AR setting).
  • the machine 1000 may additionally include a storage device (e.g., drive unit) 1016 , a signal generation device 1018 (e.g., a speaker), a network interface device 1020 , and one or more sensors 1021 , such as a global positioning system (GPS) sensor, compass, accelerometer, or other sensor.
  • the machine 1000 may include an output controller 1028 , such as a serial (e.g., universal serial bus (USB)), parallel, or other wired or wireless (e.g., infrared (IR), near field communication (NFC), etc.) connection to communicate with or control one or more peripheral devices.
  • the storage device 1016 may include a non-transitory machine readable medium 1022 on which is stored one or more sets of data structures or instructions 1024 (e.g., software) embodying or utilized by any one or more of the techniques or functions described herein.
  • the instructions 1024 may also reside, completely or at least partially, within the main memory 1004 , within static memory 1006 , or within the hardware processor 1002 during execution thereof by the machine 1000 .
  • one or any combination of the hardware processor 1002 , the main memory 1004 , the static memory 1006 , or the storage device 1016 may constitute machine readable media.
  • While the machine readable medium 1022 is illustrated as a single medium, the term "machine readable medium" may include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) configured to store the one or more instructions 1024 .
  • The term "machine readable medium" may include any medium that is capable of storing, encoding, or carrying instructions for execution by the machine 1000 and that cause the machine 1000 to perform any one or more of the techniques of the present disclosure, or that is capable of storing, encoding, or carrying data structures used by or associated with such instructions.
  • Non-limiting machine readable medium examples may include solid-state memories, and optical and magnetic media.
  • Specific examples of machine readable media may include: non-volatile memory, such as semiconductor memory devices (e.g., Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM)) and flash memory devices; magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
  • the instructions 1024 may further be transmitted or received over a communications network 1026 using a transmission medium via the network interface device 1020 utilizing any one of a number of transfer protocols (e.g., frame relay, internet protocol (IP), transmission control protocol (TCP), user datagram protocol (UDP), hypertext transfer protocol (HTTP), etc.).
  • Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone (POTS) networks, and wireless data networks (e.g., the Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards known as Wi-Fi®, the personal area network family of standards known as Bluetooth® promulgated by the Bluetooth Special Interest Group), peer-to-peer (P2P) networks, among others.
  • the network interface device 1020 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the communications network 1026 .
  • the network interface device 1020 may include a plurality of antennas to wirelessly communicate using at least one of single-input multiple-output (SIMO), multiple-input multiple-output (MIMO), or multiple-input single-output (MISO) techniques.
  • The term "transmission medium" shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine 1000 , and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.
  • FIGS. 11A-11B illustrate user interface components for landmark planning and plan evaluation in accordance with some embodiments.
  • FIG. 11A illustrates user interface feedback components 1102 - 1108 to provide feedback based on location of a patient, an instrument, or a placed or planned landmark.
  • the feedback components 1102 - 1108 may be presented on a user interface on a display, such as a display screen, an augmented reality view, or a virtual reality view. For example, when a landmark is placed by a surgeon, a feedback component may pop up in an augmented reality display so that the surgeon may view the feedback component.
  • Feedback component 1102 may indicate that movement is needed (e.g., of a bone or patient, of an instrument, of a camera, etc.), such as before starting a landmark placement process.
  • Feedback component 1104 may indicate that an acquisition point is too close to a previous landmark (e.g., virtual or placed).
  • Feedback component 1106 may indicate that an acquisition point was outside a target area (e.g., an error occurred, a stray landmark was placed, or a landmark was accidentally placed outside a bone or other target area).
  • Feedback component 1108 may indicate that a point that was placed is not aligned with a planned or virtual landmark point. The feedback component 1108 may appear instead of or in addition to a visual indication on the landmark itself (e.g., using an AR display).
  • the feedback component 1102 - 1108 may be interactable components, such as in an AR display.
  • a feedback component may be selected to provide further visual feedback, such as a video example, highlighting of a landmark (virtual, placed, or planned), an arrow or other indicator pointing out the issue, or the like.
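  • A minimal sketch, assuming simple distance checks and hypothetical tolerance values, of how one of the feedback components 1102 - 1108 might be selected after a landmark acquisition:

        # Hypothetical selection logic; the tolerances and the mapping to
        # components 1104/1106/1108 are illustrative assumptions.
        def select_feedback(acquired, previous_points, planned_point,
                            min_spacing_m=0.005, align_tol_m=0.003, in_target=True):
            """Return the feedback component to display, or None if no issue is found."""
            if not in_target:
                return 1106   # acquisition point outside the target area
            for prev in previous_points:
                d = sum((a - p) ** 2 for a, p in zip(acquired, prev)) ** 0.5
                if d < min_spacing_m:
                    return 1104   # too close to a previous landmark
            d_plan = sum((a - p) ** 2 for a, p in zip(acquired, planned_point)) ** 0.5
            if d_plan > align_tol_m:
                return 1108   # not aligned with the planned point
            return None

        print(select_feedback((0.0, 0.0, 0.0), [(0.002, 0.0, 0.0)], (0.0, 0.0, 0.0)))  # 1104
        print(select_feedback((0.0, 0.0, 0.0), [], (0.01, 0.0, 0.0)))                  # 1108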
  • FIG. 11B illustrates example virtual landmarks displayed for example in a user interface component of an AR display.
  • a component 1110 illustrates a portion of a knee bone with various virtual landmarks 1112 - 1116 displayed.
  • the virtual landmarks 1112 - 1116 may be visually distinct, such as by including color coding, an animation (e.g., flashing), a transparency, or the like.
  • virtual landmark 1112 may indicate an upcoming landmark point to be placed
  • virtual landmark 1114 may indicate a current landmark point being placed
  • virtual landmark 1116 may indicate a successful landmark point previously placed.
  • the virtual landmarks 1112 - 1116 may be labeled in some examples, such as with a name for the landmark (e.g., femoral canal entry, posterior condyles, etc.).
  • the labels may be displayed in the AR display virtually, and in some examples may be hidden or revealed according to user preference or selection.
  • the virtual landmarks 1112 - 1116 may be removed when completed (e.g., for successfully placed landmarks), such as automatically or after user confirmation.
  • a plurality of virtual landmarks may be displayed representing all successfully placed points or all points that still need to be completed.
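  • As an illustrative sketch only (the specific colors, transparency values, and hide-on-completion behavior are assumptions), the visual states described for virtual landmarks 1112 - 1116 could be driven by a small style table:

        # Hypothetical style mapping for upcoming / current / placed landmarks.
        LANDMARK_STYLE = {
            "upcoming": {"color": "yellow", "flashing": False, "alpha": 0.5},
            "current":  {"color": "blue",   "flashing": True,  "alpha": 0.9},
            "placed":   {"color": "green",  "flashing": False, "alpha": 0.7},
        }

        def style_for(landmark_state, hide_when_placed=False):
            """Return the display style for a virtual landmark, or None to hide it."""
            if hide_when_placed and landmark_state == "placed":
                return None   # successfully placed landmarks may be removed from view
            return LANDMARK_STYLE[landmark_state]

        print(style_for("current"))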
  • warnings, alerts, or information may be presented in an AR display during landmark selection.
  • an alert or warning may be displayed that some landmarks may have been placed incorrectly. These landmarks may be highlighted or otherwise indicated in the AR display (e.g., as a virtual landmark such as 1112 ).
  • Other information may be displayed during the landmarking process, such as surgeon notes (for example, notes captured during planning and placement of virtual landmarks), suggestions, steps, or the like.
  • Example 1 is a method for using an augmented reality device in a surgical field comprising: receiving, at a processor, an indication of a location of a landmark on a bone of a patient; retrieving, using the processor, a planned location of the landmark on the bone of the patient based on a pre-operative image of the bone of the patient; presenting, using an augmented reality display, within a surgical field, while permitting the surgical field to be viewed through the augmented reality display, a virtual indication of the landmark at the location and a virtual indication of the landmark at the planned location; and receiving, at the processor, a response to a request for confirmation of the location of the landmark.
  • Example 2 the subject matter of Example 1 includes, wherein the indication of the location of the landmark is stored in a database.
  • Example 3 the subject matter of Examples 1-2 includes, wherein the indication of the location of the landmark is received directly from a landmark generation device.
  • Example 4 the subject matter of Examples 1-3 includes, wherein the response confirms the location of the landmark.
  • Example 5 the subject matter of Examples 1-4 includes, wherein the response changes the location of the landmark to the planned location.
  • Example 6 the subject matter of Examples 1-5 includes, wherein the response includes a new location for the landmark.
  • Example 7 the subject matter of Examples 1-6 includes, removing the virtual indication in response to receiving the response.
  • Example 8 the subject matter of Examples 1-7 includes, registering the bone using a 3D model before receiving the indication of the landmark.
  • Example 9 the subject matter of Examples 1-8 includes, receiving an indication to move a virtual navigation menu presented in the augmented reality display.
  • Example 10 the subject matter of Examples 1-9 includes, wherein a position and orientation of the bone is determined using bone tracking via a passive robotic arm.
  • Example 11 the subject matter of Examples 1-10 includes, displaying a live video, using the augmented reality display, of the bone using a camera affixed to an end effector of a robotic arm.
  • Example 12 is a system configured to perform operations of any of the methods of Examples 1-11.
  • Example 13 is at least one machine-readable medium including instructions for operation of a computing system, which when executed by a machine, cause the machine to perform operations of any of the methods of Examples 1-11.
  • Example 14 is an apparatus comprising means for performing any of the methods of Examples 1-11.
  • Example 15 is a method for using an augmented reality device in a surgical field comprising: receiving a video stream of a remote surgical subject; and presenting, using an augmented reality display, within a surgical field, while permitting a patient within the surgical field to be viewed through the augmented reality display, a virtual surgical field adjacent to the patient, the virtual surgical field representing the remote surgical subject.
  • Example 16 the subject matter of Example 15 includes, receiving a voice instruction and sending the voice instruction to a remote speaker.
  • Example 17 the subject matter of Examples 15-16 includes, wherein the remote surgical subject includes a patient in another operating room within a building housing the surgical field.
  • Example 18 the subject matter of Examples 15-17 includes, wherein the remote surgical subject includes a cadaver.
  • Example 19 the subject matter of Examples 15-18 includes, wherein presenting the virtual surgical field includes displaying a virtual representation of a remote surgical robot.
  • Example 20 the subject matter of Example 19 includes, sending a command to the remote surgical robot.
  • Example 21 the subject matter of Example 20 includes, wherein the command includes a written, typed, touchscreen-selected, augmented reality selected, or spoken command.
  • Example 22 the subject matter of Examples 19-21 includes, guiding the remote surgical robot via a gesture.
  • Example 23 the subject matter of Examples 19-22 includes, displaying a view of the virtual surgical field using a camera affixed to an end effector of the remote surgical robot.
  • Example 24 the subject matter of Examples 15-23 includes, receiving a request to present the virtual surgical field before presenting the virtual surgical field.
  • Example 25 the subject matter of Examples 15-24 includes, presenting a second virtual surgical field adjacent to the patient or adjacent to the virtual surgical field, the second virtual surgical field representing a second remote surgical subject.
  • Example 26 is a system configured to perform operations of any of the methods of Examples 15-25.
  • Example 27 is at least one machine-readable medium including instructions for operation of a computing system, which when executed by a machine, cause the machine to perform operations of any of the methods of Examples 15-25.
  • Example 28 is an apparatus comprising means for performing any of the methods of Examples 15-25.
  • Example 29 is a surgical robotic system comprising: a surgical robotic arm; a sterilization unit enclosing a sterile environment and storing an instrument; a processor configured to: determine that the instrument is needed in an upcoming portion of a surgical procedure; and provide access to the instrument.
  • Example 30 the subject matter of Example 29 includes, wherein the sterilization unit is a base of the surgical robotic arm.
  • Example 31 the subject matter of Examples 29-30 includes, wherein the sterilization unit is a portable sterilization unit, and wherein the surgical robotic arm is configured to be mounted on the portable sterilization unit.
  • Example 32 the subject matter of Examples 29-31 includes, wherein the sterilization unit is a sterile storage unit without sterilization capabilities.
  • Example 33 the subject matter of Examples 29-32 includes, wherein the sterilization unit is configured to actively sterilize the instrument.
  • Example 34 the subject matter of Examples 29-33 includes, wherein the sterilization unit is configured to store a plurality of instruments including the instrument.
  • Example 35 the subject matter of Examples 29-34 includes, wherein the determination that the instrument is needed is based on machine learning.
  • Example 36 the subject matter of Examples 29-35 includes, wherein the determination that the instrument is needed is based on a previously stored surgeon preference.
  • Example 37 the subject matter of Examples 29-36 includes, wherein the determination that the instrument is needed is based on a probability using a selected workflow and a timer.
  • Example 38 the subject matter of Examples 29-37 includes, wherein the determination that the instrument is needed includes receiving a request for the instrument, including at least one of a spoken command, a touch on a touchscreen, an interaction with an augmented reality user interface, or a gesture.
  • Example 39 the subject matter of Examples 29-38 includes, wherein to provide access to the instrument, the processor is further configured to display an indication of the instrument using an augmented reality display device.
  • Example 40 the subject matter of Examples 29-39 includes, wherein to provide access to the instrument, the processor is further configured to cause an enclosure of the sterilization unit to open, exposing the sterile environment including the instrument.
  • Example 41 the subject matter of Examples 29-40 includes, wherein to provide access to the instrument, the processor is further configured to cause the surgical robotic arm to retrieve the instrument.
  • Example 42 the subject matter of Examples 29-41 includes, wherein to provide access to the instrument, the processor is further configured to cause the instrument to be output from the sterilization unit via a mechanical conveyance.
  • Example 43 the subject matter of Examples 29-42 includes, wherein to provide access to the instrument, the processor is further configured to provide a set of sterile instruments, including the instrument, for a procedure.
  • Example 44 is a system configured to perform operations of any of the methods of Examples 29-43.
  • Example 45 is at least one machine-readable medium including instructions for operation of a computing system, which when executed by a machine, cause the machine to perform operations of any of the methods of Examples 29-43.
  • Example 46 is an apparatus comprising means for performing any of the methods of Examples 29-43.
  • Example 47 is a method of using a surgical robotic system comprising: determining that an instrument, stored in a sterilization unit enclosing a sterile environment is needed in an upcoming portion of a surgical procedure; and providing access to the instrument from the sterilization unit that is mounted under or forms a base of a surgical robotic arm of the surgical robotic system.
  • Example 48 is a method for using an augmented reality device in a surgical field comprising: receiving, at a processor, an indication of a location of a landmark on a bone of a patient; retrieving, using the processor, a planned location of the landmark on the bone of the patient based on a pre-operative image of the bone of the patient; presenting within the surgical field, using an augmented reality display of the augmented reality device, while permitting the surgical field to be viewed through the augmented reality display, a virtual indication of the landmark at the location and a virtual indication of the landmark at the planned location; and receiving, at the processor, a confirmation of the location of the landmark.
  • Example 49 the subject matter of Example 48 includes, wherein the indication of the location of the landmark is stored in a database.
  • Example 50 the subject matter of Examples 48-49 includes, wherein the indication of the location of the landmark is received directly from a landmark generation device.
  • Example 51 the subject matter of Examples 48-50 includes, wherein the confirmation indicates the location of the landmark is correct.
  • Example 52 the subject matter of Examples 48-51 includes, for a second landmark having a second location and a second planned location, receiving a change for the second landmark from the second location to the second planned location, and outputting information corresponding to the change to the augmented reality display.
  • Example 53 the subject matter of Examples 48-52 includes, for a second landmark having a second location and a second planned location, receiving a change for the second landmark from the second location to a new location other than the second location and the second planned location, and outputting information corresponding to the new location to the augmented reality display for presenting.
  • Example 54 the subject matter of Examples 48-53 includes, removing the virtual indications in response to receiving the confirmation.
  • Example 55 the subject matter of Examples 48-54 includes, registering the bone using a 3D model before receiving the indication of the landmark.
  • Example 56 the subject matter of Examples 48-55 includes, receiving an indication to move a virtual navigation menu presented in the augmented reality display.
  • Example 57 the subject matter of Examples 48-56 includes, wherein a position and orientation of the bone is determined using bone tracking via a passive robotic arm.
  • Example 58 the subject matter of Examples 48-57 includes, displaying a live video, using the augmented reality display, of the bone using a camera affixed to an end effector of a robotic arm.
  • Example 59 is an augmented reality device in a surgical field comprising: a processor; memory including instructions, which when executed by the processor, cause the processor to perform operations to: receive an indication of a location of a landmark on a bone of a patient; retrieve a planned location of the landmark on the bone of the patient based on a pre-operative image of the bone of the patient; and receive a confirmation of the location of the landmark; and an augmented reality display to, before the processor receives the confirmation, present within the surgical field while permitting the surgical field to be viewed through the augmented reality display, a virtual indication of the landmark at the location and a virtual indication of the landmark at the planned location.
  • Example 60 the subject matter of Example 59 includes, wherein the indication of the location of the landmark is stored in a database.
  • Example 61 the subject matter of Examples 59-60 includes, wherein the indication of the location of the landmark is received directly from a landmark generation device.
  • Example 62 the subject matter of Examples 59-61 includes, wherein the confirmation indicates the location of the landmark is correct.
  • Example 63 the subject matter of Examples 59-62 includes, wherein the instructions further cause the processor to, for a second landmark having a second location and a second planned location, receive a change for the second landmark from the second location to the second planned location, and output information corresponding to the change to the augmented reality display for presenting.
  • Example 64 the subject matter of Examples 59-63 includes, wherein the instructions further cause the processor to, for a second landmark having a second location and a second planned location, receive a change for the second landmark from the second location to a new location other than the second location and the second planned location, and output information corresponding to the new location to the augmented reality display for presenting.
  • Example 65 the subject matter of Examples 59-64 includes, wherein the instructions further cause the processor to remove the virtual indications in response to receiving the confirmation.
  • Example 66 the subject matter of Examples 59-65 includes, wherein the instructions further cause the processor to register the bone using a 3D model before receiving the indication of the landmark.
  • Example 67 is at least one machine-readable medium including instructions for operating an augmented reality device in a surgical field, which when executed by a processor, cause the processor to perform operations to: retrieve a plurality of planned locations corresponding to each of a plurality of landmarks on a bone of a patient based on pre-operative imaging of the bone of the patient; present within the surgical field, using an augmented reality display of the augmented reality device, while permitting the surgical field to be viewed through the augmented reality display, virtual indications of the plurality of landmarks at the plurality of planned locations; receive a confirmation of a first planned location as presented using the augmented reality display for a first landmark of the plurality of landmarks; and receive a change to a second planned location as presented using the augmented reality display for a second landmark of the plurality of landmarks.
  • Example 68 is at least one machine-readable medium including instructions that, when executed by processing circuitry, cause the processing circuitry to perform operations to implement any of Examples 1-67.
  • Example 69 is an apparatus comprising means to implement any of Examples 1-67.
  • Example 70 is a system to implement any of Examples 1-67.
  • Example 71 is a method to implement any of Examples 1-67.
  • Method examples described herein may be machine or computer-implemented at least in part. Some examples may include a computer-readable medium or machine-readable medium encoded with instructions operable to configure an electronic device to perform methods as described in the above examples.
  • An implementation of such methods may include code, such as microcode, assembly language code, a higher-level language code, or the like. Such code may include computer readable instructions for performing various methods. The code may form portions of computer program products. Further, in an example, the code may be tangibly stored on one or more volatile, non-transitory, or non-volatile tangible computer-readable media, such as during execution or at other times.
  • Examples of these tangible computer-readable media may include, but are not limited to, hard disks, removable magnetic disks, removable optical disks (e.g., compact disks and digital video disks), magnetic cassettes, memory cards or sticks, random access memories (RAMs), read only memories (ROMs), and the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Epidemiology (AREA)
  • Robotics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Pathology (AREA)
  • General Chemical & Material Sciences (AREA)
  • Chemical Kinetics & Catalysis (AREA)
  • Chemical & Material Sciences (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Manipulator (AREA)

Abstract

A method or system for using an augmented reality device may include displaying information in a surgical field. A method may include receiving an indication of a location of a landmark on a bone of a patient, retrieving a planned location of the landmark on the bone of the patient and receiving information corresponding to the location or the planned location. The location or the planned location may be displayed as a virtual indication using an augmented reality display of the augmented reality device, for example while permitting the surgical field to be viewed through the augmented reality display.

Description

    CLAIM OF PRIORITY
  • This application claims the benefit of priority to U.S. Provisional Application No. 63/052,137 filed Jul. 15, 2020, titled “INSTRUMENT PREPARATION AND VALIDATION,” which is hereby incorporated herein by reference in its entirety.
  • BACKGROUND
  • Surgical advancements have allowed surgeons to use preoperative planning, display devices within a surgical field, optical imaging, and guides to improve surgical outcomes and customize surgery for a patient. While these advances have allowed for quicker and more successful surgeries, they ultimately rely on physical objects, which have costs and time requirements for manufacturing and configuration. Physical objects and devices may also obstruct portions of a surgical field, detracting from their benefits.
  • Computer-assisted surgery is a growing field that encompasses a wide range of devices, uses, procedures, and computing techniques, such as surgical navigation, pre-operative planning, and various robotic techniques. In computer-assisted surgery procedures, a robotic system may be used in some surgical procedures, such as orthopedic procedures, to aid a surgeon in completing the procedures more accurately, quicker, or with less fatigue.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. Like numerals having different letter suffixes may represent different instances of similar components. The drawings illustrate generally, by way of example, but not by way of limitation, various embodiments discussed in the present document.
  • FIG. 1 illustrates a surgical field in accordance with some embodiments.
  • FIG. 2 illustrates an AR instrument identification display in accordance with some embodiments.
  • FIG. 3 illustrates a system for displaying virtual representations of a landmark in accordance with some embodiments.
  • FIG. 4 illustrates a flowchart showing a technique for displaying virtual representations of a landmark in accordance with some embodiments.
  • FIG. 5 illustrates a surgical field including a virtual representation of a remote surgical field, for example for use with an augmented reality display in accordance with some embodiments.
  • FIG. 6 illustrates a flowchart showing a technique for displaying a virtual representation of a remote surgical field within a local surgical field in accordance with some embodiments.
  • FIG. 7 illustrates a robot sterilization system in accordance with some embodiments.
  • FIG. 8 illustrates a flowchart showing a technique for storing a sterilized instrument using a robotic system in accordance with some embodiments.
  • FIG. 9 illustrates a system for surgical instrument identification using an augmented reality display in accordance with some embodiments.
  • FIG. 10 illustrates generally an example of a block diagram of a machine upon which any one or more of the techniques discussed herein may perform in accordance with some embodiments.
  • FIGS. 11A-11B illustrate user interface components for landmark planning and plan evaluation in accordance with some embodiments.
  • DETAILED DESCRIPTION
  • Systems and methods for using an augmented reality device during a surgical procedure are described herein. The systems and methods herein describe uses for the augmented reality device, such as to display a landmark or representations of real objects overlaid on a real environment. An augmented reality (AR) device allows a user to view displayed virtual objects that appear to be projected into the real environment, which is also visible. AR devices typically include two display lenses or screens, including one for each eye of a user. Light is permitted to pass through the two display lenses such that aspects of the real environment are visible while also projecting light to make virtual elements visible to the user of the AR device.
  • FIG. 1 illustrates a surgical field 100 in accordance with some embodiments. The surgical field 100 illustrated in FIG. 1 includes a surgeon 102 and a patient 108, and may include a camera 112. The surgeon 102 is wearing an augmented reality (AR) device 104 which may be used to display a virtual object 110 to the surgeon 102. The virtual object 110 may not be visible to others within the surgical field 100 (e.g., surgical assistant 114 or nurse 120), though they may wear AR devices 116 and 122, respectively. Even if another person is viewing the surgical field 100 with an AR device, the person may not be able to see the virtual object 110, may see the virtual object 110 in a shared augmented reality with the surgeon 102, may see a modified version of the virtual object 110 (e.g., according to customizations unique to the surgeon 102 or the person), or may see different virtual objects entirely. Augmented reality is explained in more detail below.
  • Augmented reality is a technology for displaying virtual or “augmented” objects or visual effects overlaid on a real environment. The real environment may include a room or specific area (e.g., the surgical field 100), or may be more general to include the world at large. The virtual aspects overlaid on the real environment may be represented as anchored or in a set position relative to one or more aspects of the real environment. For example, the virtual object 110 may be configured to appear to be resting on a table. An AR system may present virtual aspects that are fixed to a real object without regard to a perspective of a viewer or viewers of the AR system (e.g., the surgeon 102). For example, the virtual object 110 may exist in a room, visible to a viewer of the AR system within the room and not visible to a viewer of the AR system outside the room. The virtual object 110 in the room may be displayed to the viewer outside the room when the viewer enters the room. In this example, the room may act as a real object that the virtual object 110 is fixed to in the AR system.
  • The AR device 104 may include one or more screens, such as a single screen or two screens (e.g., one per eye of a user). The screens may allow light to pass through the screens such that aspects of the real environment are visible while displaying the virtual object 110. The virtual object 110 may be made visible to the surgeon 102 by projecting light. The virtual object 110 may appear to have a degree of transparency or may be opaque (i.e., blocking aspects of the real environment).
  • An AR system may be viewable to one or more viewers, and may include differences among views available for the one or more viewers while retaining some aspects as universal among the views. For example, a heads-up display may change between two views while virtual objects may be fixed to a real object or area in both views. Aspects such as a color of an object, lighting, or other changes may be made among the views without changing a fixed position of at least one virtual object.
  • A user may see the virtual object 110 presented in an AR system as opaque or as including some level of transparency. In an example, the user may interact with the virtual object 110, such as by moving the virtual object 110 from a first position to a second position. For example, the user may move an object with his or her hand. This may be done in the AR system virtually by determining that the hand has moved into a position coincident or adjacent to the object (e.g., using one or more cameras, which may be mounted on an AR device, such as AR device camera 106 or separate, and which may be static or may be controlled to move), and causing the object to move in response. Virtual aspects may include virtual representations of real world objects or may include visual effects, such as lighting effects, etc. The AR system may include rules to govern the behavior of virtual objects, such as subjecting a virtual object to gravity or friction, or may include other predefined rules that defy real world physical constraints (e.g., floating objects, perpetual motion, etc.). An AR device 104 may include a camera 106 on the AR device 104 (not to be confused with the camera 112, separate from the AR device 104). The AR device camera 106 or the camera 112 may include an infrared camera, an infrared filter, a visible light filter, a plurality of cameras, a depth camera, etc. The AR device 104 may project virtual items over a representation of a real environment, which may be viewed by a user.
  • Eye tracking may be used with an AR system to determine which instrument a surgeon wants next by tracking the surgeon's eye to the instrument. In an example, a nurse or surgical assistant may then retrieve the determined instrument. The determined instrument may be presented in AR to the nurse or surgical assistant. In another example, the surgeon may speak to identify the instrument (e.g., using a pre-selected code word, using speech processing and word recognition, via saying a number, or the like). The voice command may be combined with eye tracking, in still another example, to find an instrument.
  • The AR device 104 may be used in the surgical field 100 during a surgical procedure, for example performed by the surgeon 102 on the patient 108. The AR device 104 may project or display virtual objects, such as the virtual object 110, during the surgical procedure to augment the surgeon's vision. The surgeon 102 may control the virtual object 110 using the AR device 104, a remote controller for the AR device 104, or by interacting with the virtual object 110 (e.g., using a hand to "interact" with the virtual object 110 or a gesture recognized by the camera 106 of the AR device 104). The virtual object 110 may augment a surgical tool. For example, the virtual object 110 may appear (to the surgeon 102 viewing the virtual object 110 through the AR device 104) as a representation of a landmark previously placed on a patient bone. In another example, the virtual object 110 may be used to represent a planned location of a landmark (e.g., using a pre-operative image and a captured image of the bone in the real space). In certain examples, the virtual object 110 may react to movements of other virtual or real-world objects in the surgical field. For example, the virtual object 110 may be altered by an input (e.g., a gesture) to move a landmark (e.g., a placed landmark). Virtual landmarks are discussed further below with respect to FIGS. 3-4.
  • In other examples, the virtual object 110 may be a virtual representation of a remote surgical field (e.g., an entire OR, a camera field of view of a room, a close-up view of a surgical theater, etc.). In this example, the virtual object 110 may include a plurality of virtual objects. Further discussion of this example is provided below with respect to FIGS. 5-6.
  • FIG. 2 illustrates an augmented reality (AR) instrument identification display 200 in accordance with some embodiments. Prior to any surgical procedure, the nursing staff unloads trays, and prepares and places instrumentation for the procedure on a table. This process may be fastidious and error prone (e.g., missing instrument, misplacement of instrument, etc.). A surgeon may have preferences for instrument placement, table location, or the like. For example, the table may be preferred in a particular setup, which may increase consistency and efficiency by removing risks of the wrong tool being picked up, which may delay a surgery. Errors due to human choice, staff change, turnover, or the like may be responsible for decreases in efficiency. The instrumentation placement process may include a check-list, which is time consuming and also error prone.
  • The present systems and methods may include a technological solution to errors in instrument placement by leveraging artificial intelligence or augmented reality (AR) to ensure correct placement of instruments. The systems and methods described herein may tell staff which instrument to place in what location on a table, for example based on surgeon preference (e.g., using AR). The systems and methods described herein may be used to verify that one or all instruments are correctly placed on the table, such as using an automatic check list verification. In an example, complicated instruments may be assembled using the systems and methods described herein.
  • The benefits of using the present systems and methods include a faster preparation or setup of a procedure room (e.g., operating room), eliminating instrument misplacement (improving workflow, efficiency, etc.), and helping avoid the need for surgeon oversight in the process.
  • The AR instrument identification display 200 includes a surgical instrument 206, a virtual indicator 208, and may include additional information 210, such as patient or procedure information. The virtual indicator 208 may be used to identify the surgical instrument 206 that corresponds to a procedure being performed. The virtual indicator 208 may include moving lights, flashing lights, color or changing color lights, or other virtual effects. The additional information 210 may, for example, name or provide other information about the surgical instrument 206. The virtual indicator 208 may be added to the AR display 200 in response to a surgeon selection identifying a need for the surgical instrument 206. In an example, when the surgical instrument 206 is or has been moved, selected, or the surgical assistant otherwise indicates that it has been located or identified (or if the surgeon indicates it is no longer needed), the virtual indicator 208 may be removed from the AR display 200. In an example, a virtual indicator 212 may be used to identify an item, such as a correctly or an incorrectly placed instrument, a verified instrument, or an unknown instrument. A user of the AR device used to present the AR display 200 may interact with the virtual indicator 208, for example by placing a finger, hand, or item adjacent to or appearing to occupy the same space as the virtual indicator 208. In response, the virtual indicator 208 may perform an action, such as displaying information about the item represented by the virtual indicator 208 (e.g., a name of the item, whether the item is a one-time use item or can be re-sterilized, whether the item is fragile, whether the item is a patient-specific or personalized item, what procedure the item is to be used for, or the like).
  • In an example, a schedule for procedures during a day in an operating room may be obtained or retrieved by a device. The device may provide AR capabilities to a user, including instructions for setting up a next procedure in the schedule. The users, with the aid of the AR, may place the instruments in correct position or orientation on a table in the operating room. After placement of an instrument, a verification process may be performed, and an output (e.g., correctly placed or incorrectly placed, such as with additional instructions for correct placement) may be provided to the user (e.g., via the AR). When the process is complete, and all instruments have been checked as correctly placed by the verification process, a picture may be taken and a full verification process may be performed to validate the operating room for the given procedure. The full verification process may include a second check of each instrument, a check of the instruments against needed instruments for the given procedure, timing verification based on the schedule, or the like. Data may be collected about a surgical procedure, such as a time-series of data based on progression through the procedure, what steps occur at what times (e.g., when started or completed), locations of team members (e.g., surgeon, nurse, etc.) throughout the procedure, camera stills or video of the procedure at various moments, instrument tracking or use, or the like.
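  • A minimal sketch, assuming instruments are reduced to detected table positions and using hypothetical names and a hypothetical 5 cm tolerance, of the placement verification described above:

        # Hypothetical check of detected instruments against an expected layout.
        def verify_layout(expected, detected, tol_m=0.05):
            """Return (missing, misplaced) instrument names.

            expected and detected map instrument name -> (x, y) table position."""
            missing, misplaced = [], []
            for name, (ex, ey) in expected.items():
                if name not in detected:
                    missing.append(name)
                    continue
                dx, dy = detected[name][0] - ex, detected[name][1] - ey
                if (dx * dx + dy * dy) ** 0.5 > tol_m:
                    misplaced.append(name)
            return missing, misplaced

        expected = {"retractor": (0.1, 0.2), "mallet": (0.3, 0.2)}
        detected = {"retractor": (0.11, 0.21)}
        print(verify_layout(expected, detected))  # (['mallet'], [])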
  • FIG. 3 illustrates a system for displaying virtual representations of a landmark in accordance with some embodiments.
  • In an example, a landmark may be obtained, such as on a bone of a patient. An AR device may show a virtual representation of the landmark that was acquired in a display view 300. The virtual representation may be displayed on a bone (e.g., a femur 306 or a tibia 308) of the patient (e.g., overlaid on the real bone). The AR device may request confirmation (e.g., via a display) to confirm the landmark's location. In an example, a voice command may be used to control the landmark confirmation or display with the AR device.
  • The virtual representations may include representations of surgeon generated (e.g., selected or registered) landmarks (e.g., 302A, 302B, and 302C) or planned landmarks (e.g., 304A, 304B, and 304C). The AR display view 300 allows the femur 306 and the tibia 308 to be visible while also presenting virtual representations of landmarks. In other examples, different bones (e.g., hip, shoulder, spine, etc.) may be viewable. In still other examples, a virtual representation of a bone may be displayed with the virtual representations of landmarks (e.g., entirely virtual).
  • The surgeon generated landmarks may include a landmark 302A, which is displayed on the femur 306 separated by some distance from a corresponding planned landmark 304A. The planned landmark 304A may be generated based on pre-operative planning, for example using a 3D model, an image of the patient, or the like. The planned landmark 304A may be registered in the real space. For example, a known image or model coordinate system may be converted to a coordinate system in the real space using image processing. The image processing may compare captured images of a bone (e.g., in real-time), the patient, a reference object, or the like in real space to previously captured images or a previously generated model. Based on the comparison, a location of the planned landmark 304A may be registered on the real femur 306. From this registration, further processing may be used to determine how to present a virtual representation of the planned landmark 304A in the real space via an AR display device (e.g., overlaid virtually in the real space within the display view 300).
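  • As an illustrative sketch of the coordinate-system conversion described above (the rigid-transform formulation and values are assumptions, not the disclosed registration method), a planned landmark can be mapped from model coordinates into real space once a rotation and translation are known:

        # Hypothetical sketch: apply a rigid transform (3x3 rotation, translation)
        # obtained from registration to a planned landmark in model coordinates.
        def register_landmark(p_model, rotation, translation):
            x = sum(rotation[0][j] * p_model[j] for j in range(3)) + translation[0]
            y = sum(rotation[1][j] * p_model[j] for j in range(3)) + translation[1]
            z = sum(rotation[2][j] * p_model[j] for j in range(3)) + translation[2]
            return (x, y, z)

        IDENTITY = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
        print(register_landmark((0.02, 0.00, 0.01), IDENTITY, (0.10, 0.0, 0.0)))  # approx. (0.12, 0.0, 0.01)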
  • The surgeon generated landmark 302A may be registered based on an input device (e.g., a pointer that may be used to identify landmarks) or may be identified directly via the AR device (e.g., with visual processing of an indicated landmark). When using an input device, the registration to the real space for display in augmented reality may be accomplished similarly to the planned landmarks. In the case where the AR device is used to capture landmark locations directly, the location relative to the real space is known from the registration process.
  • The display view 300 may display only virtual representations of surgeon generated landmarks in one example, only virtual representations of planned landmarks in another example, or both in a third example. In the first example, the AR device may query the surgeon to confirm the placements (e.g., audibly, visually, etc.). In the second example, the surgeon may select virtually represented planned landmarks in the real space as surgeon generated landmarks. Said another way, the planned landmark 304A may be selected to be converted to a surgeon generated landmark, in an example. In the third example, the surgeon may be presented with an option, such as confirming the surgeon generated landmark 302A (e.g., overriding a warning that the surgeon generated landmark 302A is some distance from the planned landmark 304A), changing the landmark location from the surgeon generated landmark 302A to the planned landmark 304A, re-doing the surgeon generated landmark 302A based on the identified distance, moving the surgeon generated landmark 302A in the direction of the planned landmark 304A (e.g., along a line or plane, or via freehand movement, such as a gesture visible within the display view 300), or the like.
  • Landmarks that are overlapping, at the same place, substantially co-located, or adjacent, such as 302C and 304C, may be confirmed with a single entry on a virtual user interface, via a gesture, audibly, etc., or may be skipped (e.g., not asked to confirm) and assumed to be correct. A threshold distance for different treatment may be used, and the threshold distance may be personalized, in an example. The landmarks 302B and 304B may be separated by more than the threshold distance in some examples, but by less than the threshold distance in other examples. In some examples, only landmarks for which the distance between the surgeon generated and planned locations exceeds the threshold may trigger a warning or require confirmation input from the surgeon.
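  • A minimal sketch of this threshold-based treatment follows; the threshold value and coordinates are illustrative assumptions.

```python
# Illustrative sketch of the threshold-based treatment: co-located pairs may be
# auto-confirmed or skipped, while separated pairs trigger a warning. The
# 3 mm threshold and coordinates are assumptions.
import numpy as np

def landmark_action(surgeon_xyz, planned_xyz, threshold_mm=3.0):
    """Decide how the display might treat a surgeon generated / planned pair."""
    distance = np.linalg.norm(np.asarray(surgeon_xyz, float) - np.asarray(planned_xyz, float))
    if distance <= threshold_mm:
        return "auto-confirm"  # e.g., 302C and 304C, substantially co-located
    return "warn-and-request-confirmation"  # e.g., 302A and 304A

print(landmark_action([10.0, 20.0, 5.0], [10.5, 20.2, 5.1]))
print(landmark_action([10.0, 20.0, 5.0], [18.0, 26.0, 9.0]))
```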
  • In an example, the surgeon generated landmarks may be obtained using a robotic arm, which may include an automated process, a force-assist process, a force-resist process, or the like. Even though these landmarks are referred to herein as surgeon generated, they may be obtained autonomously by the robotic arm. When using the robotic arm, the registration may leverage the coordinate system of the robotic arm to translate the landmarks to the display view 300 of the AR device (e.g., rather than or in addition to using image processing or some other technique).
  • A virtual navigation menu may be presented within the display view 300. The virtual navigation menu may be used to operate aspects of the robotic arm, toggle display of landmarks, proceed to a next step in a surgical procedure, or the like. The navigation menu may be moved or resized within the display view 300, in an example. Movement may occur in response to a gesture, audible instruction, or the like. In an example, the virtual navigation menu may automatically and virtually follow the robotic arm moving in real space, such as within the display view 300.
  • FIG. 4 illustrates a flowchart showing a technique 400 for displaying virtual representations of a landmark in accordance with some embodiments. The technique 400 may be performed by a processor, for example by executing instructions stored in memory.
  • The technique 400 includes an operation 402 to receive an indication of a location of a landmark on a bone of a patient. The indication may be stored in a database or received directly from a landmark generation device (e.g., a pointer). The technique 400 may include registering the bone using a 3D model before receiving the indication of the landmark. For example, a preliminary registration to the 3D model may be performed using a surface mapping technique via a camera on the AR device or robot arm, subject to confirmation by the landmark registration process. The preliminary registration may be inaccurate, but helpful for use during landmark registration by orienting the AR device to display virtual landmarks. A position or orientation of the bone may be determined using bone tracking, such as via a passive robotic arm.
  • The technique 400 includes an operation 404 to retrieve a planned location of the landmark on the bone of the patient. The planned location may be retrieved based on a pre-operative image of the bone of the patient. The pre-operative image may be registered to a current patient space, in an example.
  • The technique 400 includes an operation 406 to present, using an augmented reality display, a virtual indication of the landmark at the location or the planned location, or both. The virtual indication may be presented within a surgical field while permitting the surgical field to be viewed through the augmented reality display.
  • The technique 400 includes an operation 408 to receive an input related to the landmark. The input may include a response to a request for confirmation of the location of the landmark. Operation 408 may include moving the location, confirming the location, indicating that the location is to be re-selected, validating the location, temporarily accepting or denying the location, an indication to remove the virtual indication (which may then be removed), or the like.
  • In an example, the technique 400 may include using earlier landmarks to iteratively update an expectation of where subsequent planned landmarks are to be identified in a 3D coordinate system of the patient. For example, once a few points are registered on the 3D model, the technique 400 may include determining that remaining landmarks will accurately match the corresponding points on the 3D model. Iterative updates to orientation or position of the 3D model (e.g., in an AR view) may be performed based on received landmarks to improve accuracy of the 3D model.
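  • One way such an iterative update could be sketched, assuming a least-squares rigid fit (Kabsch algorithm) rather than any particular method of the present disclosure, is shown below; all coordinates are illustrative.

```python
# Illustrative sketch (not the disclosed method): refine the model-to-patient
# alignment from landmarks acquired so far using a least-squares rigid fit
# (Kabsch algorithm), then predict where remaining planned landmarks should
# appear. Coordinates are assumptions.
import numpy as np

def rigid_fit(model_pts, patient_pts):
    """Best-fit rotation R and translation t so that patient ~= R @ model + t."""
    m_centroid = model_pts.mean(axis=0)
    p_centroid = patient_pts.mean(axis=0)
    h = (model_pts - m_centroid).T @ (patient_pts - p_centroid)
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = p_centroid - r @ m_centroid
    return r, t

def predict_remaining(remaining_model_pts, r, t):
    """Expected patient-space locations of planned landmarks not yet acquired."""
    return (r @ remaining_model_pts.T).T + t

model = np.array([[0.0, 0, 0], [10, 0, 0], [0, 10, 0]])
patient = np.array([[1.0, 2, 3], [11, 2, 3], [1, 12, 3]])
r, t = rigid_fit(model, patient)
print(predict_remaining(np.array([[5.0, 5, 0]]), r, t))  # approximately [6, 7, 3]
```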
  • The technique 400 may include displaying a virtual navigation menu in the augmented reality display. A user may virtually interact with the virtual navigation menu as if it was displayed on a screen. An indication may be received to move the virtual navigation menu presented in the augmented reality display, for example to make the location more convenient. The technique 400 may include displaying a live video, using the augmented reality display, of the bone using a camera affixed to an end effector of a robotic arm.
  • FIG. 5 illustrates a surgical field including a virtual representation of a remote surgical field, for example for use with an augmented reality display in accordance with some embodiments. The surgical field may be viewable within a display view 500 of an AR device. The AR device may show a virtual representation of the remote surgical field. In an example, a voice command or gesture may be used to control whether the remote surgical field is viewable or not.
  • The display view 500 may be configured to display aspects of the remote surgical field, such as a remote patient 508 or a remote robotic arm 506, displayed in full or zoomed in, such as according to surgeon preference or control. For example, the display view 500 may include a close-up view of a leg or bone of the remote patient 508, for example during a surgical procedure.
  • The display view 500 presents a virtual representation of an aspect of the remote surgical field while permitting a local real surgical field to be displayed. The real surgical field may include a patient 504 or a robotic arm 502, in some examples. The virtual representation may be displayed adjacent to the patient 504, the robotic arm 502, or elsewhere within the local real surgical field. Adjacent in this context may include separated by an absolute distance within the surgical field, separated by a perceived distance (e.g., appearing in the display view 500 to be separated by a foot, a few feet, etc.), anchored in a location (e.g., virtually displayed at a real location within the local surgical field), or moved according to surgeon preference. In some examples, the virtual representation may move when zoomed in or out. For example, when only a leg of the remote patient 508 is virtually visible, the leg may be placed closer to the real leg of the patient 504, but when the patient 508 is viewed in full, this distance may be increased. The virtual representation of the remote surgical field may be based on images (e.g., video) captured by a camera affixed to the remote robotic arm 506. For example, the camera on the remote robotic arm 506 may identify a feature, and another camera or an AR device in the remote surgical field may be used to see different points of view (e.g., camera views).
  • In an example, the remote patient 508 is a live surgical patient and the local patient 504 is a live surgical patient. In this example, the remote patient 508 may be remotely operated on using the robotic arm 506 by a surgeon in the real space of the display view 500. For example, the surgeon may simultaneously operate on both the remote patient 508 and the local patient 504. Simultaneously in this example may mean the surgeon switches between the patients at various operations of the surgery, such as at each step or after particular sequences of steps, or one surgery may be completed before the next is started, but both patients are available, viewable, or ready for surgery contemporaneously. In this version of this example, the surgeon may complete surgeries more quickly because multiple staff, operating rooms, and surgical equipment may be used in parallel rather than requiring serial surgeries. In another version of this example, the remote patient 508 may be operated on by a remote surgeon (e.g., with or without the use of the robotic arm 506), and the surgeon in the local space of the display view 500 may be called in to consult or provide assistance (e.g., with a portion of a procedure, such as operation of the remote robotic arm 506, for example when the remote surgeon is less experienced using a robotic arm). The remote patient 508 is viewable for the consultation (e.g., in real-time) such that the surgeon in the local space may give direction or advice without needing to physically leave the local surgical field. This version of the example may be particularly useful when the remote surgeon is a student, a newer surgeon, or when the surgery is occurring in a remote building, city, country, etc.
  • In an example, the remote patient 508 is a live surgical patient and the local patient 504 is a cadaver. In this example, a surgeon in local space 500 may view a remote surgery, which may be occurring in real-time or may have already occurred and is viewed on replay. This example allows for a student or newer surgeon to complete a procedure (e.g., a new type or particularly difficult type) on a cadaver while being able to view a similar or the same procedure virtually. The virtual representation may be viewed at different angles, zoomed, or the like. When the virtual representation is a replay, the surgery may be reversed, sped up, paused, etc. In another version of this example, a remote surgeon may request advice or support from the local surgeon, who may attempt a portion of the surgery on the cadaver before the portion is attempted on the live remote patient 508. This allows for the portion of the procedure to be tested without damage to the live remote patient 508.
  • In an example, the remote patient 508 is a cadaver and the local patient 504 is a live surgical patient. In this example, a surgeon in the local space of the display view 500 may attempt a portion of a procedure on the remote cadaver before attempting the portion on the live local patient 504. The local surgeon may control the remote robotic arm 506 while performing the portion on the cadaver. The remote robotic arm 506 may save actions undertaken during the operation, which may be sent to the local robotic arm 502, and optionally edited. The saved actions may be repeated by the local robotic arm 502, for example to perform an autonomous portion of the procedure that has been tested on the cadaver. Differences between the cadaver and the local live patient 504 may be used to alter the saved actions, for example by scaling, moving target points, or the like. Differences in the robotic arms may be accounted for based on a calibration step performed before starting the surgical procedure. In an example, a procedure may be tested on a cadaver using the remote robotic arm 506, then successful actions may be transferred to the local robotic arm 502 for autonomous action or force-resist type movement by the local robotic arm 502 when performing the procedure on the local patient 504.
  • In an example, the remote patient 508 is a cadaver and the local patient 504 is a cadaver. In this example, a surgeon may practice a procedure on two different cadavers contemporaneously to identify differences in results from changes to the procedure. In another version of this example, the surgeon may perform a procedure for a student or newer surgeon while the student or newer surgeon operates remotely on the cadaver. In this version, the local surgeon may view and optionally critique the remote surgery. The remote surgical field may have a similar setup, allowing the student or newer surgeon to view the teaching surgeon's operation in an augmented or virtual reality view.
  • In any of the above examples, more than one remote surgical field may be presented. For example, a teaching surgeon may view multiple remote student surgeries. When two or more remote surgical fields are presented, they may be scaled to fit in the display view 500. A remote surgical field may be placed adjacent to another remote surgical field, in an example.
  • A local surgeon may provide assistance when requested for a remote procedure, such as in a collaborative mode with the remote surgical arm 506. The collaborative mode may allow the local surgeon to move the remote surgical arm 506, while allowing the remote surgeon to stop the remote surgical arm 506. In another example, the local surgeon may stop or take over control of the remote surgical arm 506 while monitoring the remote surgeon operating with the remote surgical arm 506. In yet another example, the local surgeon may control the local robotic arm 502, which in turn may send information to control the remote robotic arm 506, or the local robotic arm 502 may move in response to information received from the remote robotic arm 506. For example, the robotic arms may move in concert, such that either the remote or local surgeon may control the procedure. One of the surgeons may act to resist erroneous movements while the other of the surgeons performs the procedure, each using their respective robotic arm. In an example, the remote surgical field may represent a surgical field in a same building as the local surgical field.
  • FIG. 6 illustrates a flowchart showing a technique 600 for displaying a virtual representation of a remote surgical field within a local surgical field in accordance with some embodiments. The technique 600 may be performed by a processor, for example by executing instructions stored in memory. The technique 600 includes an operation 602 to receive a video stream of a remote surgical subject.
  • The technique 600 includes an operation 604 to present, using an augmented reality display, within a surgical field, a virtual surgical field representing the remote surgical subject. Operation 604 may include presenting the virtual surgical field while permitting a patient within the surgical field to be viewed through the augmented reality device. The virtual surgical field may be presented adjacent to the patient, in an example. Adjacent may mean separated by a fixed distance in absolute space within the surgical field, for example a foot, a few feet, etc. In another example, adjacent may mean separated by a relative distance as perceived through the augmented reality device (e.g., appearing to be separated by a foot, a few feet, etc.). Adjacent may mean touching, or almost touching.
  • The remote surgical subject may include a patient in another operating room within a building also housing the surgical field, a cadaver, or the like. The technique 600 may further include an operation to receive a voice instruction and send the voice instruction to a remote speaker (e.g., within a remote surgical field corresponding to and represented by the virtual surgical field). The technique 600 may include receiving a request to present the virtual surgical field before presenting the virtual surgical field (e.g., from a colleague, student, etc.). The virtual surgical field may be used for testing aspects of a technique (e.g., with a cadaver), for helping or consulting on a case, or to perform an entire procedure, in various examples. A second virtual surgical field may be presented (e.g., adjacent to the patient, such as on an opposite side, or adjacent to the first surgical field) for interaction or observation of a second remote surgical subject.
  • The virtual surgical field may be displayed including a virtual representation of a remote surgical robot. The remote surgical robot may be controlled by a command issued within the surgical field, for example via a voice command, a gesture, a user input on a device or touchscreen, a virtual indication, a written, typed, or haptic command, or the like. The remote surgical robot may be guided via a gesture. In an example, the virtual surgical field may be displayed based on output of a camera affixed to an end effector of the remote surgical robot.
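  • A minimal, assumed sketch of relaying such a command to the remote surgical robot follows; the message format and gesture-to-command mapping are illustrative and not an actual device API.

```python
# Illustrative sketch only: serialize a command issued in the local surgical
# field (spoken, gestured, or selected on a virtual menu) for a remote surgical
# robot. The message format and gesture mapping are assumptions, not a real API.
import json

GESTURE_COMMANDS = {
    "swipe_left": {"action": "pause_arm", "parameters": {}},
    "pinch_drag": {"action": "guide_end_effector", "parameters": {"mode": "follow"}},
}

def interpret_gesture(gesture):
    """Map a recognized gesture to a guidance command (illustrative mapping)."""
    return GESTURE_COMMANDS.get(gesture, {"action": "noop", "parameters": {}})

def build_remote_command(source, command):
    """Serialize the command for transmission to the remote robot controller."""
    return json.dumps({"source": source, **command}).encode("utf-8")

print(build_remote_command("gesture", interpret_gesture("pinch_drag")))
```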
  • FIG. 7 illustrates a robot sterilization system 700 in accordance with some embodiments. The robot sterilization system 700 includes a robotic arm 704, and a sterilization unit 706, which may be embedded in a base 702 of the robotic arm 704 or may be separate from the robotic arm 704. When separate, the sterilization unit 706 may be mounted under the robotic arm 704 or affixed to a portion of the robotic arm 704 (e.g., the base 702).
  • The sterilization unit 706 may include an opening 708 that may be used to output an instrument (e.g., instrument 712). In an example, an instrument may be output from the opening 708, for example using a mechanism within the sterilization unit 706. In another example, the sterilization unit 706 may include a tray 710, which may be output from the opening 708, the tray 710 used to convey the instrument 712. In yet another example, a door of the sterilization unit 706 may open to allow a user to remove an instrument. In still another example, the robotic arm 704 may be used to retrieve an instrument from within the sterilization unit 706. For example, the robotic arm 704 may retrieve an instrument from within the sterilization unit 706 based on known locations of instruments within the sterilization unit 706.
  • A door may be used to reload the sterilization unit 706 in an example. The sterilization unit 706 may include a sterile environment without the capability of sterilizing instruments. In this example, the sterilization unit 706 is a passive sterile storage unit. In another example, the sterilization unit 706 may be used to sterilize an instrument. In this example, the sterilization unit 706 may use sterilization equipment to sterilize the instrument, such as by using ultraviolet light, steam, gas, an autoclave, alcohol, heat, pressure, glass beads, or the like.
  • The sterilization unit 706 may be controlled by a user interface or control mechanism, such as one incorporated in the base 702 or one also used to control the robotic arm 704 (e.g., an augmented reality user interface, a display screen, a microphone and algorithm for interpreting audible commands, the robotic arm 704 itself, or the like). Controls may include initiating sterilization of an instrument (or all instruments within the sterilization unit 706) or outputting an instrument (e.g., opening a door, outputting a specific selected instrument, outputting a next instrument in a procedure, or outputting a machine learning model identified instrument at a particular step in a procedure).
  • The instrument 712 may be output automatically, for example based on surgeon preferences, a machine learned model, or the like. For example, image processing may be used to determine a step of a procedure that is completed or almost completed, and an instrument for a next step may be output. In another example, movement of the robotic arm 704 may be used to determine that an instrument is needed and output that instrument. In this example, the movement may be a stored movement or a movement unique to a portion of a surgical procedure that identifies a next step.
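  • As an illustrative sketch of this automatic output, a detected procedure step may be mapped to the next instrument to present; the step names and instrument names below are assumptions.

```python
# Illustrative sketch: once a procedure step is detected as complete (e.g., by
# image processing or a recognized robotic arm movement), look up the next
# instrument to output. Step and instrument names are assumptions.
from typing import Optional

NEXT_INSTRUMENT = {
    "exposure_complete": "pointer",
    "landmarking_complete": "cut guide",
    "distal_cut_complete": "trial component",
}

def instrument_to_output(detected_step: str) -> Optional[str]:
    """Return the instrument the sterilization unit 706 could present next."""
    return NEXT_INSTRUMENT.get(detected_step)

print(instrument_to_output("landmarking_complete"))  # cut guide
```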
  • FIG. 8 illustrates a flowchart showing a technique 800 for storing a sterilized instrument using a robotic system in accordance with some embodiments. The technique 800 may be implemented using a surgical robotic system, such as with a processor.
  • The technique 800 includes an operation 802 to provide a sterile environment. The sterile environment may be housed by a sterilization unit to store an instrument. The sterilization unit may be mounted under or form a base of a surgical robotic arm of the surgical robotic system. In the example where the sterilization unit is mounted under the surgical robotic arm, the sterilization unit may be a portable sterilization unit. In an example, the sterilization unit may be a sterile storage unit without sterilization capabilities itself. In another example, the sterilization unit may be configured to actively sterilize the instrument, for example using ultraviolet light, steam, gas, an autoclave, alcohol, heat, pressure, glass beads, or the like. The sterilization unit may store a plurality of instruments including the instrument.
  • The technique 800 includes an operation 804 to determine whether the instrument is needed. In response to a determination that the instrument is not needed, the technique 800 may return to operation 802 or repeat operation 804. Operation 804 may include using machine learning techniques to determine that the instrument is needed. For example, a trained model (which may include a binary classifier, a regression model, a convolutional neural network, etc.) may determine that the instrument is needed based on a surgical step having been reached, a time having elapsed, previously stored surgeon preferences, a probability, a selected workflow, or the like. In other examples, a command, such as a spoken command, a gesture, an interaction with a physical or virtual user interface, or another technique may be used to determine that the instrument is needed (e.g., a request for the instrument).
  • The technique 800 includes an operation 806 to, in response to a determination that the instrument is needed, provide access to the instrument from the sterile environment. Operation 806 may include displaying an indication of the instrument using an augmented reality display device. Operation 806 may include causing an enclosure of the sterilization unit to open, exposing the sterile environment including the instrument. Operation 806 may include causing the surgical robotic arm to retrieve the instrument. Operation 806 may include causing the instrument to be output from the sterilization unit via a mechanical conveyance. Operation 806 may include providing a set of sterile instruments, including the instrument, for a procedure.
  • FIG. 9 illustrates a system 900 for surgical instrument identification using an augmented reality display in accordance with some embodiments. The system 900 may be used to perform any of the techniques 400 or 600 described in relation to FIG. 4 or 6, for example, by using a processor 902. The system 900 includes an augmented reality device 901 that may be in communication with a database 916. The augmented reality device 901 includes a processor 902, memory 904, an AR display 908, and a camera 906. The augmented reality device 901 may include a sensor 910, a speaker 912, or a haptic controller 914. The database 916 may include image storage 918 or preoperative plan storage 920. In an example, the augmented reality device 901 may be a HoloLens manufactured by Microsoft of Redmond, Wash.
  • The processor 902 of the augmented reality device 901 includes an augmented reality modeler 903. The augmented reality modeler 903 may be used by the processor 902 to create the augmented reality environment. For example, the augmented reality modeler 903 may receive dimensions of a room, such as from the camera 906 or sensor 910, and create the augmented reality environment to fit within the physical structure of the room. In another example, physical objects may be present in the room and the augmented reality modeler 903 may use the physical objects to present virtual objects in the augmented reality environment. For example, the augmented reality modeler 903 may use or detect a table present in the room and present a virtual object as resting on the table. The AR display 908 may display the AR environment overlaid on a real environment. The display 908 may show a virtual object, using the AR device 901, such as in a fixed position in the AR environment. The augmented reality modeler 903 may receive a video stream of a remote surgical field for virtually displaying within the room. In an example, a dimension of a virtual object (e.g., a remote surgical field) may be modified (e.g., shrunk) to be virtually displayed within the room. In an example, the augmented reality device 901 may provide a zoom function to allow a user to zoom in on a portion of a virtual object (e.g., within a virtual surgical field).
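  • Two behaviors ascribed to the augmented reality modeler 903 may be sketched as follows, under assumed geometry and without reference to any actual HoloLens or device API: scaling a virtual surgical field to fit room dimensions, and resting a virtual object on a detected table.

```python
# Illustrative sketch of two behaviors described for the augmented reality
# modeler 903: shrinking a virtual object to fit room dimensions and resting a
# virtual object on a detected table. Dimensions are assumptions; this is not
# an actual HoloLens or device API.
import numpy as np

def scale_to_fit(virtual_dims_m, room_dims_m, margin=0.9):
    """Uniform scale factor so the virtual object fits within the room."""
    ratios = np.asarray(room_dims_m, float) * margin / np.asarray(virtual_dims_m, float)
    return float(min(ratios.min(), 1.0))  # shrink only, never enlarge

def rest_on_table(table_center_xyz, table_height_m, object_height_m):
    """Anchor position that places a virtual object's base on the table top."""
    x, y, _ = table_center_xyz
    return (x, y, table_height_m + object_height_m / 2.0)

print(scale_to_fit([8.0, 8.0, 3.0], [6.0, 5.0, 3.0]))   # about 0.56
print(rest_on_table((1.2, 0.4, 0.0), 0.75, 0.3))        # (1.2, 0.4, 0.9)
```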
  • The augmented reality device 901 may include a sensor 910, such as an infrared sensor. The camera 906 or the sensor 910 may be used to detect movement, such as a gesture by a surgeon or other user, that may be interpreted by the processor 902 as attempted or intended interaction by the user with the virtual target. The processor 902 may identify an object in a real environment, such as through processing information received using the camera 906.
  • The AR display 908, for example during a surgical procedure, may present, such as within a surgical field while permitting the surgical field to be viewed through the augmented reality display, a virtual feature corresponding to a physical feature hidden by an anatomical aspect of a patient. The virtual feature may have a virtual position or orientation corresponding to a first physical position or orientation of the physical feature. In an example, the virtual position or orientation of the virtual feature may include an offset from the first physical position or orientation of the physical feature. The offset may include a predetermined distance from the augmented reality display, a relative distance from the augmented reality display to the anatomical aspect, or the like.
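  • A minimal sketch of such an offset, assuming the offset is taken as a fraction of the distance from the physical feature toward the augmented reality display, is shown below; values are illustrative.

```python
# Illustrative sketch: present the virtual feature at an offset from the hidden
# physical feature, taking the offset as a fraction of the distance from the
# physical feature toward the augmented reality display. Values are assumptions.
import numpy as np

def offset_virtual_position(physical_xyz, display_xyz, fraction=0.1):
    """Offset the virtual feature toward the display by a fraction of the distance."""
    physical = np.asarray(physical_xyz, float)
    display = np.asarray(display_xyz, float)
    direction = display - physical
    if np.linalg.norm(direction) == 0.0:
        return physical
    return physical + direction * fraction

print(offset_virtual_position([0.0, 0.0, 0.0], [0.0, 0.0, 500.0]))  # [0, 0, 50]
```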
  • FIG. 10 illustrates generally an example of a block diagram of a machine 1000 upon which any one or more of the techniques (e.g., methodologies) discussed herein may perform in accordance with some embodiments. In alternative embodiments, the machine 1000 may operate as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine 1000 may operate in the capacity of a server machine, a client machine, or both in server-client network environments. The machine 1000 may be a personal computer (PC), a tablet PC, a personal digital assistant (PDA), a mobile telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term “machine” shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein, such as cloud computing, software as a service (SaaS), other computer cluster configurations.
  • Examples, as described herein, may include, or may operate on, logic or a number of components, modules, or like mechanisms. Such mechanisms are tangible entities (e.g., hardware) capable of performing specified operations when operating. In an example, the hardware may be specifically configured to carry out a specific operation (e.g., hardwired). In an example, the hardware may include configurable execution units (e.g., transistors, circuits, etc.) and a computer readable medium containing instructions, where the instructions configure the execution units to carry out a specific operation when in operation. The configuring may occur under the direction of the execution units or a loading mechanism. Accordingly, the execution units are communicatively coupled to the computer readable medium when the device is operating. For example, under operation, the execution units may be configured by a first set of instructions to implement a first set of features at one point in time and reconfigured by a second set of instructions to implement a second set of features.
  • Machine (e.g., computer system) 1000 may include a hardware processor 1002 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a hardware processor core, or any combination thereof), a main memory 1004 and a static memory 1006, some or all of which may communicate with each other via an interlink (e.g., bus) 1008. The machine 1000 may further include a display unit 1010, an alphanumeric input device 1012 (e.g., a keyboard), and a user interface (UI) navigation device 1014 (e.g., a mouse). In an example, the display unit 1010, alphanumeric input device 1012, and UI navigation device 1014 may be a touch screen display. The display unit 1010 may include goggles, glasses, or other AR or VR display components. For example, the display unit may be worn on a head of a user and may provide a heads-up-display to the user. The alphanumeric input device 1012 may include a virtual keyboard (e.g., a keyboard displayed virtually in a VR or AR setting).
  • The machine 1000 may additionally include a storage device (e.g., drive unit) 1016, a signal generation device 1018 (e.g., a speaker), a network interface device 1020, and one or more sensors 1021, such as a global positioning system (GPS) sensor, compass, accelerometer, or other sensor. The machine 1000 may include an output controller 1028, such as a serial (e.g., universal serial bus (USB), parallel, or other wired or wireless (e.g., infrared (IR), near field communication (NFC), etc.) connection to communicate or control one or more peripheral devices.
  • The storage device 1016 may include a non-transitory machine readable medium 1022 on which are stored one or more sets of data structures or instructions 1024 (e.g., software) embodying or utilized by any one or more of the techniques or functions described herein. The instructions 1024 may also reside, completely or at least partially, within the main memory 1004, within static memory 1006, or within the hardware processor 1002 during execution thereof by the machine 1000. In an example, one or any combination of the hardware processor 1002, the main memory 1004, the static memory 1006, or the storage device 1016 may constitute machine readable media.
  • While the machine readable medium 1022 is illustrated as a single medium, the term “machine readable medium” may include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) configured to store the one or more instructions 1024.
  • The term “machine readable medium” may include any medium that is capable of storing, encoding, or carrying instructions for execution by the machine 1000 and that cause the machine 1000 to perform any one or more of the techniques of the present disclosure, or that is capable of storing, encoding or carrying data structures used by or associated with such instructions. Non-limiting machine readable medium examples may include solid-state memories, and optical and magnetic media. Specific examples of machine readable media may include: non-volatile memory, such as semiconductor memory devices (e.g., Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM)) and flash memory devices; magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
  • The instructions 1024 may further be transmitted or received over a communications network 1026 using a transmission medium via the network interface device 1020 utilizing any one of a number of transfer protocols (e.g., frame relay, internet protocol (IP), transmission control protocol (TCP), user datagram protocol (UDP), hypertext transfer protocol (HTTP), etc.). Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone (POTS) networks, and wireless data networks (e.g., the Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards known as Wi-Fi® and the personal area network family of standards known as Bluetooth® that is promulgated by the Bluetooth Special Interest Group), peer-to-peer (P2P) networks, among others. In an example, the network interface device 1020 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the communications network 1026. In an example, the network interface device 1020 may include a plurality of antennas to wirelessly communicate using at least one of single-input multiple-output (SIMO), multiple-input multiple-output (MIMO), or multiple-input single-output (MISO) techniques. The term “transmission medium” shall be taken to include any intangible medium that is capable of storing, encoding or carrying instructions for execution by the machine 1000, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.
  • FIGS. 11A-11B illustrate user interface components for landmark planning and plan evaluation in accordance with some embodiments.
  • FIG. 11A illustrates user interface feedback components 1102-1108 to provide feedback based on location of a patient, an instrument, or a placed or planned landmark. The feedback components 1102-1108 may be presented on a user interface on a display, such as a display screen, an augmented reality view, or a virtual reality view. For example, when a landmark is placed by a surgeon, a feedback component may pop up in an augmented reality display so that the surgeon may view the feedback component.
  • Feedback component 1102 may indicate that movement is needed (e.g., of a bone or patient, of an instrument, of a camera, etc.), such as before starting a landmark placement process. Feedback component 1104 may indicate that an acquisition point is too close to a previous landmark (e.g., virtual or placed). Feedback component 1106 may indicate that an acquisition point was outside a target area (e.g., an error occurred, a stray landmark was placed, or a landmark was accidentally placed outside a bone or other target area). Feedback component 1108 may indicate that a point that was placed is not aligned with a planned or virtual landmark point. The feedback component 1108 may appear instead of or in addition to a visual indication on the landmark itself (e.g., using an AR display).
  • The feedback components 1102-1108 may be interactable components, such as in an AR display. In this example, a feedback component may be selected to provide further visual feedback, such as a video example, highlighting of a landmark (virtual, placed, or planned), an arrow or other indicator pointing out the issue, or the like.
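  • A minimal sketch of selecting among the feedback components 1104-1108 after a landmark acquisition follows; the conditions, thresholds, and coordinates are illustrative assumptions.

```python
# Illustrative sketch of choosing which feedback component (1104, 1106, or
# 1108) to present after an acquisition; conditions, thresholds, and
# coordinates are assumptions.
import numpy as np

def select_feedback(point, previous_points, target_bounds, planned_point,
                    min_separation_mm=2.0, alignment_tol_mm=3.0):
    """Return the feedback component to present, or "ok" if none is needed."""
    point = np.asarray(point, float)
    lo, hi = (np.asarray(b, float) for b in target_bounds)
    if not (np.all(point >= lo) and np.all(point <= hi)):
        return "1106: acquisition point outside target area"
    for prev in previous_points:
        if np.linalg.norm(point - np.asarray(prev, float)) < min_separation_mm:
            return "1104: acquisition point too close to a previous landmark"
    if np.linalg.norm(point - np.asarray(planned_point, float)) > alignment_tol_mm:
        return "1108: point not aligned with planned landmark"
    return "ok"

print(select_feedback([5, 5, 5], [[5.5, 5, 5]], ([0, 0, 0], [50, 50, 50]), [5, 5, 5]))
```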
  • FIG. 11B illustrates example virtual landmarks displayed for example in a user interface component of an AR display. In this example, a component 1110 illustrates a portion of a knee bone with various virtual landmarks 1112-1116 displayed. In an example, the virtual landmarks 1112-1116 may be visually distinct, such as including a color coding, an animation (e.g., flashing), have a transparency, or the like. In an example, virtual landmark 1112 may indicate an upcoming landmark point to be placed, virtual landmark 1114 may indicate a current landmark point being placed, and virtual landmark 1116 may indicate a successful landmark point previously placed.
  • The virtual landmarks 1112-1116 may be labeled in some examples, such as with a name for the landmark (e.g., femoral canal entry, posterior condyles, etc.). The labels may be displayed in the AR display virtually, and in some examples may be hidden or revealed according to user preference or selection. The virtual landmarks 1112-1116 may be removed when completed (e.g., for successfully placed landmarks), such as automatically or after user confirmation. In some examples, a plurality of virtual landmarks may be displayed representing all successfully placed points or all points that still need to be completed.
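  • The visual distinction and label handling described above may be sketched, with assumed styling values, as a simple mapping from landmark state to display attributes.

```python
# Illustrative sketch (assumed styling only) of the visual distinction between
# upcoming, current, and completed virtual landmarks such as 1112-1116.
LANDMARK_STYLE = {
    "upcoming":  {"color": "gray",   "animation": None,       "label_visible": True},
    "current":   {"color": "yellow", "animation": "flashing", "label_visible": True},
    "completed": {"color": "green",  "animation": None,       "label_visible": False},
}

def style_for(state, hide_completed=True):
    """Return display attributes, or None if the landmark should not be drawn."""
    if state == "completed" and hide_completed:
        return None  # removed automatically or after user confirmation
    return LANDMARK_STYLE[state]

print(style_for("current"))
print(style_for("completed"))  # None -> removed from the AR display
```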
  • In the examples of FIGS. 11A-11B, warnings, alerts, or information may be presented in an AR display during landmark selection. For example, when bone registration is not successful, an alert or warning may be displayed that some landmarks may have been placed incorrectly. These landmarks may be highlighted or otherwise indicated in the AR display (e.g., as a virtual landmark such as 1112). Other information may be displayed during the landmarking process, such as surgeon notes (for example, notes captured during planning and placement of virtual landmarks), suggestions, steps, or the like.
  • Example 1 is a method for using an augmented reality device in a surgical field comprising: receiving, at a processor, an indication of a location of a landmark on a bone of a patient; retrieving, using the processor, a planned location of the landmark on the bone of the patient based on a pre-operative image of the bone of the patient; presenting, using an augmented reality display, within a surgical field, while permitting the surgical field to be viewed through the augmented reality display, a virtual indication of the landmark at the location and a virtual indication of the landmark at the planned location; and receiving, at the processor, a response to a request for confirmation of the location of the landmark.
  • In Example 2, the subject matter of Example 1 includes, wherein the indication of the location of the landmark is stored in a database.
  • In Example 3, the subject matter of Examples 1-2 includes, wherein the indication of the location of the landmark is received directly from a landmark generation device.
  • In Example 4, the subject matter of Examples 1-3 includes, wherein the response confirms the location of the landmark.
  • In Example 5, the subject matter of Examples 1-4 includes, wherein the response changes the location of the landmark to the planned location.
  • In Example 6, the subject matter of Examples 1-5 includes, wherein the response includes a new location for the landmark.
  • In Example 7, the subject matter of Examples 1-6 includes, removing the virtual indication in response to receiving the response.
  • In Example 8, the subject matter of Examples 1-7 includes, registering the bone using a 3D model before receiving the indication of the landmark.
  • In Example 9, the subject matter of Examples 1-8 includes, receiving an indication to move a virtual navigation menu presented in the augmented reality display.
  • In Example 10, the subject matter of Examples 1-9 includes, wherein a position and orientation of the bone is determined using bone tracking via a passive robotic arm.
  • In Example 11, the subject matter of Examples 1-10 includes, displaying a live video, using the augmented reality display, of the bone using a camera affixed to an end effector of a robotic arm.
  • Example 12 is a system configured to perform operations of any of the methods of Examples 1-11.
  • Example 13 is at least one machine-readable medium including instructions for operation of a computing system, which when executed by a machine, cause the machine to perform operations of any of the methods of Examples 1-11.
  • Example 14 is an apparatus comprising means for performing any of the methods of Examples 1-11.
  • Example 15 is a method for using an augmented reality device in a surgical field comprising: receiving a video stream of a remote surgical subject; and presenting, using an augmented reality display, within a surgical field, while permitting a patient within the surgical field to be viewed through the augmented reality display, a virtual surgical field adjacent to the patient, the virtual surgical field representing the remote surgical subject.
  • In Example 16, the subject matter of Example 15 includes, receiving a voice instruction and sending the voice instruction to a remote speaker.
  • In Example 17, the subject matter of Examples 15-16 includes, wherein the remote surgical subject includes a patient in another operating room within a building housing the surgical field.
  • In Example 18, the subject matter of Examples 15-17 includes, wherein the remote surgical subject includes a cadaver.
  • In Example 19, the subject matter of Examples 15-18 includes, wherein presenting the virtual surgical field includes displaying a virtual representation of a remote surgical robot.
  • In Example 20, the subject matter of Example 19 includes, sending a command to the remote surgical robot.
  • In Example 21, the subject matter of Example 20 includes, wherein the command includes a written, typed, touchscreen-selected, augmented reality selected, or spoken command.
  • In Example 22, the subject matter of Examples 19-21 includes, guiding the remote surgical robot via a gesture.
  • In Example 23, the subject matter of Examples 19-22 includes, displaying a view of the virtual surgical field using a camera affixed to an end effector of the remote surgical robot.
  • In Example 24, the subject matter of Examples 15-23 includes, receiving a request to present the virtual surgical field before presenting the virtual surgical field.
  • In Example 25, the subject matter of Examples 15-24 includes, presenting a second virtual surgical field adjacent to the patient or adjacent to the virtual surgical field, the second virtual surgical field representing a second remote surgical subject.
  • Example 26 is a system configured to perform operations of any of the methods of Examples 15-25.
  • Example 27 is at least one machine-readable medium including instructions for operation of a computing system, which when executed by a machine, cause the machine to perform operations of any of the methods of Examples 15-25.
  • Example 28 is an apparatus comprising means for performing any of the methods of Examples 15-25.
  • Example 29 is a surgical robotic system comprising: a surgical robotic arm; a sterilization unit enclosing a sterile environment and storing an instrument; a processor configured to: determine that the instrument is needed in an upcoming portion of a surgical procedure; and provide access to the instrument.
  • In Example 30, the subject matter of Example 29 includes, wherein the sterilization unit is a base of the surgical robotic arm.
  • In Example 31, the subject matter of Examples 29-30 includes, wherein the sterilization unit is a portable sterilization unit, and wherein the surgical robotic arm is configured to be mounted on the portable sterilization unit.
  • In Example 32, the subject matter of Examples 29-31 includes, wherein the sterilization unit is a sterile storage unit without sterilization capabilities.
  • In Example 33, the subject matter of Examples 29-32 includes, wherein the sterilization unit is configured to actively sterilize the instrument.
  • In Example 34, the subject matter of Examples 29-33 includes, wherein the sterilization unit is configured to store a plurality of instruments including the instrument.
  • In Example 35, the subject matter of Examples 29-34 includes, wherein the determination that the instrument is needed is based on machine learning.
  • In Example 36, the subject matter of Examples 29-35 includes, wherein the determination that the instrument is needed is based on a previously stored surgeon preference.
  • In Example 37, the subject matter of Examples 29-36 includes, wherein the determination that the instrument is needed is based on a probability using a selected workflow and a timer.
  • In Example 38, the subject matter of Examples 29-37 includes, wherein the determination that the instrument is needed includes receiving a request for the instrument, including at least one of a spoken command, a touch on a touchscreen, an interaction with an augmented reality user interface, or a gesture.
  • In Example 39, the subject matter of Examples 29-38 includes, wherein to provide access to the instrument, the processor is further configured to display an indication of the instrument using an augmented reality display device.
  • In Example 40, the subject matter of Examples 29-39 includes, wherein to provide access to the instrument, the processor is further configured to cause an enclosure of the sterilization unit to open, exposing the sterile environment including the instrument.
  • In Example 41, the subject matter of Examples 29-40 includes, wherein to provide access to the instrument, the processor is further configured to cause the surgical robotic arm to retrieve the instrument.
  • In Example 42, the subject matter of Examples 29-41 includes, wherein to provide access to the instrument, the processor is further configured to cause the instrument to be output from the sterilization unit via a mechanical conveyance.
  • In Example 43, the subject matter of Examples 29-42 includes, wherein to provide access to the instrument, the processor is further configured to provide a set of sterile instruments, including the instrument, for a procedure.
  • Example 44 is a system configured to perform operations of any of the methods of Examples 29-43.
  • Example 45 is at least one machine-readable medium including instructions for operation of a computing system, which when executed by a machine, cause the machine to perform operations of any of the methods of Examples 29-43.
  • Example 46 is an apparatus comprising means for performing any of the methods of Examples 29-43.
  • Example 47 is a method of using a surgical robotic system comprising: determining that an instrument, stored in a sterilization unit enclosing a sterile environment, is needed in an upcoming portion of a surgical procedure; and providing access to the instrument from the sterilization unit, which is mounted under or forms a base of a surgical robotic arm of the surgical robotic system.
  • Example 48 is a method for using an augmented reality device in a surgical field comprising: receiving, at a processor, an indication of a location of a landmark on a bone of a patient; retrieving, using the processor, a planned location of the landmark on the bone of the patient based on a pre-operative image of the bone of the patient; presenting within the surgical field, using an augmented reality display of the augmented reality device, while permitting the surgical field to be viewed through the augmented reality display, a virtual indication of the landmark at the location and a virtual indication of the landmark at the planned location; and receiving, at the processor, a confirmation of the location of the landmark.
  • In Example 49, the subject matter of Example 48 includes, wherein the indication of the location of the landmark is stored in a database.
  • In Example 50, the subject matter of Examples 48-49 includes, wherein the indication of the location of the landmark is received directly from a landmark generation device.
  • In Example 51, the subject matter of Examples 48-50 includes, wherein the confirmation indicates the location of the landmark is correct.
  • In Example 52, the subject matter of Examples 48-51 includes, for a second landmark having a second location and a second planned location, receiving a change for the second landmark from the second location to the second planned location, and outputting information corresponding to the change to the augmented reality display.
  • In Example 53, the subject matter of Examples 48-52 includes, for a second landmark having a second location and a second planned location, receiving a change for the second landmark from the second location to a new location other than the second location and the second planned location, and outputting information corresponding to the new location to the augmented reality display for presenting.
  • In Example 54, the subject matter of Examples 48-53 includes, removing the virtual indications in response to receiving the confirmation.
  • In Example 55, the subject matter of Examples 48-54 includes, registering the bone using a 3D model before receiving the indication of the landmark.
  • In Example 56, the subject matter of Examples 48-55 includes, receiving an indication to move a virtual navigation menu presented in the augmented reality display.
  • In Example 57, the subject matter of Examples 48-56 includes, wherein a position and orientation of the bone is determined using bone tracking via a passive robotic arm.
  • In Example 58, the subject matter of Examples 48-57 includes, displaying a live video, using the augmented reality display, of the bone using a camera affixed to an end effector of a robotic arm.
  • Example 59 is an augmented reality device in a surgical field comprising: a processor; memory including instructions, which when executed by the processor, cause the processor to perform operations to: receive an indication of a location of a landmark on a bone of a patient; retrieve a planned location of the landmark on the bone of the patient based on a pre-operative image of the bone of the patient; and receive a confirmation of the location of the landmark; and an augmented reality display to, before the processor receives the confirmation, present within the surgical field while permitting the surgical field to be viewed through the augmented reality display, a virtual indication of the landmark at the location and a virtual indication of the landmark at the planned location.
  • In Example 60, the subject matter of Example 59 includes, wherein the indication of the location of the landmark is stored in a database.
  • In Example 61, the subject matter of Examples 59-60 includes, wherein the indication of the location of the landmark is received directly from a landmark generation device.
  • In Example 62, the subject matter of Examples 59-61 includes, wherein the confirmation indicates the location of the landmark is correct.
  • In Example 63, the subject matter of Examples 59-62 includes, wherein the instructions further cause the processor to, for a second landmark having a second location and a second planned location, receive a change for the second landmark from the second location to the second planned location, and output information corresponding to the change to the augmented reality display for presenting.
  • In Example 64, the subject matter of Examples 59-63 includes, wherein the instructions further cause the processor to, for a second landmark having a second location and a second planned location, receive a change for the second landmark from the second location to a new location other than the second location and the second planned location, and output information corresponding to the new location to the augmented reality display for presenting.
  • In Example 65, the subject matter of Examples 59-64 includes, wherein the instructions further cause the processor to remove the virtual indications in response to receiving the confirmation.
  • In Example 66, the subject matter of Examples 59-65 includes, wherein the instructions further cause the processor to register the bone using a 3D model before receiving the indication of the landmark.
  • Example 67 is at least one machine-readable medium including instructions for operating an augmented reality device in a surgical field, which when executed by a processor, cause the processor to perform operations to: retrieving a plurality of planned locations corresponding to each of a plurality of landmarks on a bone of a patient based on pre-operative imaging of the bone of the patient; presenting within the surgical field, using an augmented reality display of the augmented reality device, while permitting the surgical field to be viewed through the augmented reality display, virtual indications of the plurality of landmarks at the plurality of planned locations; and receiving a confirmation of a first planned location as presented using the augmented reality display for a first landmark of the plurality of landmarks; and receiving a change to a second planned location as presented using the augmented reality display for a second landmark of the plurality of landmarks.
  • Example 68 is at least one machine-readable medium including instructions that, when executed by processing circuitry, cause the processing circuitry to perform operations to implement any of Examples 1-67.
  • Example 69 is an apparatus comprising means to implement any of Examples 1-67.
  • Example 70 is a system to implement any of Examples 1-67.
  • Example 71 is a method to implement any of Examples 1-67.
  • Method examples described herein may be machine or computer-implemented at least in part. Some examples may include a computer-readable medium or machine-readable medium encoded with instructions operable to configure an electronic device to perform methods as described in the above examples. An implementation of such methods may include code, such as microcode, assembly language code, a higher-level language code, or the like. Such code may include computer readable instructions for performing various methods. The code may form portions of computer program products. Further, in an example, the code may be tangibly stored on one or more volatile, non-transitory, or non-volatile tangible computer-readable media, such as during execution or at other times. Examples of these tangible computer-readable media may include, but are not limited to, hard disks, removable magnetic disks, removable optical disks (e.g., compact disks and digital video disks), magnetic cassettes, memory cards or sticks, random access memories (RAMs), read only memories (ROMs), and the like.

Claims (20)

What is claimed is:
1. A method for using an augmented reality device in a surgical field comprising:
receiving, at a processor, an indication of a location of a landmark on a bone of a patient;
retrieving, using the processor, a planned location of the landmark on the bone of the patient based on a pre-operative image of the bone of the patient;
presenting within the surgical field, using an augmented reality display of the augmented reality device, while permitting the surgical field to be viewed through the augmented reality display, a virtual indication of the landmark at the location and a virtual indication of the landmark at the planned location; and
receiving, at the processor, a confirmation of the location of the landmark.
2. The method of claim 1, wherein the indication of the location of the landmark is stored in a database.
3. The method of claim 1, wherein the indication of the location of the landmark is received directly from a landmark generation device.
4. The method of claim 1, wherein the confirmation indicates the location of the landmark is correct.
5. The method of claim 1, further comprising, for a second landmark having a second location and a second planned location, receiving a change for the second landmark from the second location to the second planned location, and outputting information corresponding to the change to the augmented reality display.
6. The method of claim 1, further comprising, for a second landmark having a second location and a second planned location, receiving a change for the second landmark from the second location to a new location other than the second location and the second planned location, and outputting information corresponding to the new location to the augmented reality display for presenting.
7. The method of claim 1, further comprising removing the virtual indications in response to receiving the confirmation.
8. The method of claim 1, further comprising registering the bone using a 3D model before receiving the indication of the landmark.
9. The method of claim 1, further comprising receiving an indication to move a virtual navigation menu presented in the augmented reality display.
10. The method of claim 1, wherein a position and orientation of the bone is determined using bone tracking via a passive robotic arm.
11. The method of claim 1, further comprising displaying a live video, using the augmented reality display, of the bone using a camera affixed to an end effector of a robotic arm.
12. An augmented reality device in a surgical field comprising:
a processor;
memory including instructions, which when executed by the processor, cause the processor to perform operations to:
receive an indication of a location of a landmark on a bone of a patient;
retrieve a planned location of the landmark on the bone of the patient based on a pre-operative image of the bone of the patient; and
receive a confirmation of the location of the landmark; and
an augmented reality display to, before the processor receives the confirmation, present within the surgical field while permitting the surgical field to be viewed through the augmented reality display, a virtual indication of the landmark at the location and a virtual indication of the landmark at the planned location.
13. The augmented reality device of claim 12, wherein the indication of the location of the landmark is stored in a database.
14. The augmented reality device of claim 12, wherein the indication of the location of the landmark is received directly from a landmark generation device.
15. The augmented reality device of claim 12, wherein the confirmation indicates the location of the landmark is correct.
16. The augmented reality device of claim 12, wherein the instructions further cause the processor to, for a second landmark having a second location and a second planned location, receive a change for the second landmark from the second location to the second planned location, and output information corresponding to the change to the augmented reality display for presenting.
17. The augmented reality device of claim 12, wherein the instructions further cause the processor to, for a second landmark having a second location and a second planned location, receive a change for the second landmark from the second location to a new location other than the second location and the second planned location, and output information corresponding to the new location to the augmented reality display for presenting.
18. The augmented reality device of claim 12, wherein the instructions further cause the processor to remove the virtual indications in response to receiving the confirmation.
19. The augmented reality device of claim 12, wherein the instructions further cause the processor to register the bone using a 3D model before receiving the indication of the landmark.
20. At least one machine-readable medium including instructions for operating an augmented reality device in a surgical field, which when executed by a processor, cause the processor to perform operations to:
retrieve a plurality of planned locations corresponding to each of a plurality of landmarks on a bone of a patient based on pre-operative imaging of the bone of the patient;
present within the surgical field, using an augmented reality display of the augmented reality device, while permitting the surgical field to be viewed through the augmented reality display, virtual indications of the plurality of landmarks at the plurality of planned locations;
receive a confirmation of a first planned location as presented using the augmented reality display for a first landmark of the plurality of landmarks; and
receive a change to a second planned location as presented using the augmented reality display for a second landmark of the plurality of landmarks.
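Read as a procedure, independent claim 1 together with dependent claims 4-7 describes a per-landmark review: present virtual indications at both the received location and the planned location, then accept a confirmation or a change. The Python sketch below is one possible reading under stated assumptions, not the claimed method itself: the display object and its draw_marker, await_surgeon_input, and clear_markers calls are invented placeholders, and claims 5 and 6 recite the change behavior for a second landmark, which the sketch treats as the same per-landmark decision.

```python
from typing import Tuple

Point = Tuple[float, float, float]


def review_landmark(display, measured: Point, planned: Point) -> Point:
    """Hypothetical walk-through of claims 1 and 4-7 for a single landmark."""
    # Claim 1: present a virtual indication of the landmark at the received
    # (measured) location and another at the planned location from the
    # pre-operative image, while the surgical field stays visible.
    display.draw_marker("received location", measured)
    display.draw_marker("planned location", planned)

    # Assumed interaction: the surgeon either confirms the location or supplies
    # a change; await_surgeon_input() is an invented placeholder call.
    action, new_point = display.await_surgeon_input()

    if action == "confirm":
        # Claim 4: the confirmation indicates the received location is correct.
        final = measured
    elif action == "use_planned":
        # Claims 5/16: change the landmark to its planned location.
        final = planned
    else:
        # Claims 6/17: change the landmark to a new location other than either.
        final = new_point

    # Claims 7/18: remove the virtual indications once the input is received.
    display.clear_markers()
    return final
```

A session covering several landmarks, as recited in claim 20, would call this review once per landmark and retain the returned locations.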
US17/376,676 2020-07-15 2021-07-15 Augmented reality bone landmark display Pending US20220020219A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/376,676 US20220020219A1 (en) 2020-07-15 2021-07-15 Augmented reality bone landmark display

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202063052137P 2020-07-15 2020-07-15
US17/376,676 US20220020219A1 (en) 2020-07-15 2021-07-15 Augmented reality bone landmark display

Publications (1)

Publication Number Publication Date
US20220020219A1 true US20220020219A1 (en) 2022-01-20

Family

ID=79291809

Family Applications (3)

Application Number Title Priority Date Filing Date
US17/376,696 Active 2044-06-05 US12376919B2 (en) 2020-07-15 2021-07-15 Robotic device and sterilization unit for surgical instrument
US17/376,676 Pending US20220020219A1 (en) 2020-07-15 2021-07-15 Augmented reality bone landmark display
US19/268,312 Pending US20250339217A1 (en) 2020-07-15 2025-07-14 Robotic device and sterilization unit for surgical instrument

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US17/376,696 Active 2044-06-05 US12376919B2 (en) 2020-07-15 2021-07-15 Robotic device and sterilization unit for surgical instrument

Family Applications After (1)

Application Number Title Priority Date Filing Date
US19/268,312 Pending US20250339217A1 (en) 2020-07-15 2025-07-14 Robotic device and sterilization unit for surgical instrument

Country Status (1)

Country Link
US (3) US12376919B2 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4512356A1 (en) * 2024-01-23 2025-02-26 Siemens Healthineers AG Medical intervention assistance robot

Citations (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070249967A1 (en) * 2006-03-21 2007-10-25 Perception Raisonnement Action En Medecine Computer-aided osteoplasty surgery system
US20090088634A1 (en) * 2007-09-30 2009-04-02 Intuitive Surgical, Inc. Tool tracking systems and methods for image guided surgery
US20140058407A1 (en) * 2012-08-27 2014-02-27 Nikolaos V. Tsekos Robotic Device and System Software, Hardware and Methods of Use for Image-Guided and Robot-Assisted Surgery
US20140078175A1 (en) * 2012-09-18 2014-03-20 Qualcomm Incorporated Methods and systems for making the use of head-mounted displays less obvious to non-users
US20140275760A1 (en) * 2013-03-13 2014-09-18 Samsung Electronics Co., Ltd. Augmented reality image display system and surgical robot system comprising the same
US20150005785A1 (en) * 2011-12-30 2015-01-01 St. Jude Medical, Atrial Fibrillation Division, Inc. System and method for detection and avoidance of collisions of robotically-controlled medical devices
US20150265368A1 (en) * 2014-03-24 2015-09-24 Intuitive Surgical Operations, Inc. Systems and Methods for Anatomic Motion Compensation
US20150366628A1 (en) * 2014-06-18 2015-12-24 Covidien Lp Augmented surgical reality environment system
US20160128654A1 (en) * 2014-02-25 2016-05-12 JointPoint, Inc. Systems and Methods for Intra-Operative Image Analysis
US20160239963A1 (en) * 2015-02-13 2016-08-18 St. Jude Medical International Holding S.À.R.L. Tracking-based 3d model enhancement
US20160331262A1 (en) * 2015-05-13 2016-11-17 Ep Solutions Sa Combined Electrophysiological Mapping and Cardiac Ablation Methods, Systems, Components and Devices
US20160331474A1 (en) * 2015-05-15 2016-11-17 Mako Surgical Corp. Systems and methods for providing guidance for a robotic medical procedure
US20160354155A1 (en) * 2013-03-15 2016-12-08 Wes Hodges System and method for health imaging informatics
US20170258526A1 (en) * 2016-03-12 2017-09-14 Philipp K. Lang Devices and methods for surgery
US20170360512A1 (en) * 2016-06-16 2017-12-21 Zimmer, Inc. Soft tissue balancing in articular surgery
US20180082480A1 (en) * 2016-09-16 2018-03-22 John R. White Augmented reality surgical technique guidance
US20180256258A1 (en) * 2017-03-13 2018-09-13 Seth Anderson Nash Augmented reality diagnosis guidance
US10621436B2 (en) * 2016-11-03 2020-04-14 Zimmer Us, Inc. Augmented reality therapeutic movement display and gesture analyzer
US20200337789A1 (en) * 2018-01-10 2020-10-29 Covidien Lp Guidance for positioning a patient and surgical robot
US20200367990A1 (en) * 2019-05-24 2020-11-26 Karl Storz Imaging, Inc. Augmented Reality System For Medical Procedures
US20210176383A1 (en) * 2019-12-05 2021-06-10 Synaptics Incorporated Under-display image sensor for eye tracking
US20210282887A1 (en) * 2020-03-13 2021-09-16 Trumpf Medizin Systeme Gmbh + Co. Kg Surgical augmented reality
US20210290310A1 (en) * 2018-12-11 2021-09-23 Project Moray, Inc. Hybrid-dimensional, augmented reality, and/or registration of user interface and simulation systems for robotic catheters and other uses
US20220000562A1 (en) * 2017-03-14 2022-01-06 Stephen B. Murphy, M.D. Systems and methods for determining leg length change during hip surgery
US20220125519A1 (en) * 2019-07-09 2022-04-28 Materialise N.V. Augmented reality assisted joint arthroplasty
US20220313375A1 (en) * 2019-12-19 2022-10-06 Noah Medical Corporation Systems and methods for robotic bronchoscopy
US11638613B2 (en) * 2019-05-29 2023-05-02 Stephen B. Murphy Systems and methods for augmented reality based surgical navigation

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2005209197A1 (en) 2004-01-16 2005-08-11 Smith & Nephew, Inc. Computer-assisted ligament balancing in total knee arthroplasty
US7885705B2 (en) 2006-02-10 2011-02-08 Murphy Stephen B System and method for facilitating hip surgery
US8986309B1 (en) 2007-11-01 2015-03-24 Stephen B. Murphy Acetabular template component and method of using same during hip arthroplasty
US8267938B2 (en) 2007-11-01 2012-09-18 Murphy Stephen B Method and apparatus for determining acetabular component positioning
WO2014197451A1 (en) 2013-06-03 2014-12-11 Murphy Stephen B Method and apparatus for performing anterior hip surgery
US10537399B2 (en) * 2016-08-16 2020-01-21 Ethicon Llc Surgical tool positioning based on sensed parameters
US12376919B2 (en) 2020-07-15 2025-08-05 Orthosoft Ulc Robotic device and sterilization unit for surgical instrument

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11911277B2 (en) 2009-11-24 2024-02-27 Tornier Sas Determining implantation configuration for a prosthetic component or application of a resurfacing tool
US12070272B2 (en) 2013-10-10 2024-08-27 Stryker European Operations Limited Methods, systems and devices for pre-operatively planned shoulder surgery guides and implants
US12133691B2 (en) 2013-10-10 2024-11-05 Stryker European Operations Limited Methods, systems and devices for pre-operatively planned shoulder surgery guides and implants
US12262951B2 (en) * 2014-03-28 2025-04-01 Intuitive Surgical Operations, Inc. Quantitative three-dimensional visualization of instruments in a field of view
US20220241013A1 (en) * 2014-03-28 2022-08-04 Intuitive Surgical Operations, Inc. Quantitative three-dimensional visualization of instruments in a field of view
US12170139B2 (en) 2018-06-19 2024-12-17 Howmedica Osteonics Corp. Virtual checklists for orthopedic surgery
US12020801B2 (en) 2018-06-19 2024-06-25 Howmedica Osteonics Corp. Virtual guidance for orthopedic surgical procedures
US12125577B2 (en) 2018-06-19 2024-10-22 Howmedica Osteonics Corp. Mixed reality-aided education using virtual models or virtual representations for orthopedic surgical procedures
US12380986B2 (en) 2018-06-19 2025-08-05 Howmedica Osteonics Corp. Virtual guidance for orthopedic surgical procedures
US12050999B2 (en) 2018-06-19 2024-07-30 Howmedica Osteonics Corp. Virtual guidance for orthopedic surgical procedures
US12046349B2 (en) 2018-06-19 2024-07-23 Howmedica Osteonics Corp. Visualization of intraoperatively modified surgical plans
US12237066B2 (en) 2018-06-19 2025-02-25 Howmedica Osteonics Corp. Multi-user collaboration and workflow techniques for orthopedic surgical procedures using mixed reality
US12112269B2 (en) 2018-06-19 2024-10-08 Howmedica Osteonics Corp. Mixed reality-aided surgical assistance in orthopedic surgical procedures
US12266440B2 (en) 2018-06-19 2025-04-01 Howmedica Osteonics Corp. Automated instrument or component assistance using mixed reality in orthopedic surgical procedures
US12362057B2 (en) 2018-06-19 2025-07-15 Howmedica Osteonics Corp. Virtual guidance for orthopedic surgical procedures
US12458446B2 (en) 2019-05-14 2025-11-04 Howmedica Osteonics Corp. Bone wall tracking and guidance for orthopedic implant placement
US12472013B2 (en) 2019-11-26 2025-11-18 Howmedica Osteonics Corp. Virtual guidance for correcting surgical pin installation
US12465374B2 (en) 2019-12-18 2025-11-11 Howmedica Osteonics Corp. Surgical guidance for surgical tools
US12376919B2 (en) 2020-07-15 2025-08-05 Orthosoft Ulc Robotic device and sterilization unit for surgical instrument
US12496135B2 (en) 2021-02-02 2025-12-16 Howmedica Osteonics Corp. Mixed-reality humeral-head sizing and placement
WO2024220894A1 (en) * 2023-04-20 2024-10-24 Andromeda Surgical, Inc. Universal surgical robotic platform

Also Published As

Publication number Publication date
US12376919B2 (en) 2025-08-05
US20250339217A1 (en) 2025-11-06
US20220015841A1 (en) 2022-01-20

Similar Documents

Publication Publication Date Title
US20250339217A1 (en) Robotic device and sterilization unit for surgical instrument
US12444141B2 (en) Augmented reality surgical technique guidance
AU2022201300B2 (en) Augmented reality therapeutic movement display and gesture analyzer
US11672611B2 (en) Automatic identification of instruments
KR102728830B1 (en) Systems and methods for providing guidance for a robotic medical procedure
CN109475385B (en) System and method for intraoperative surgical planning
US10575905B2 (en) Augmented reality diagnosis guidance
US20190038362A1 (en) Surgical field camera system
WO2019139931A1 (en) Guidance for placement of surgical ports
CN112566579A (en) Multi-user collaboration and workflow techniques for orthopedic surgery using mixed reality
JP2016503676A (en) Positioning and navigation using 3D tracking sensors
US20230093342A1 (en) Method and system for facilitating remote presentation or interaction
CN113033526A (en) Computer-implemented method, electronic device and computer program product
US20200334998A1 (en) Wearable image display device for surgery and surgery information real-time display system
US12288614B2 (en) Platform for handling of medical devices associated with a medical device kit
US12400750B1 (en) Automatic content tagging in videos of minimally invasive surgeries
US11232861B1 (en) Systems and methods for displaying predetermined information for clinical use
US20240268892A1 (en) Virtual Reality Surgical Systems And Methods Including Virtual Navigation

Legal Events

Date Code Title Description
AS Assignment

Owner name: ORTHOSOFT ULC, CANADA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHAV, RAMNADA;COUTURE, PIERRE;SIGNING DATES FROM 20210716 TO 20210809;REEL/FRAME:057373/0697

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCV Information on status: appeal procedure

Free format text: NOTICE OF APPEAL FILED

STCV Information on status: appeal procedure

Free format text: APPEAL BRIEF (OR SUPPLEMENTAL BRIEF) ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: TC RETURN OF APPEAL

STCV Information on status: appeal procedure

Free format text: EXAMINER'S ANSWER TO APPEAL BRIEF MAILED

STCV Information on status: appeal procedure

Free format text: ON APPEAL -- AWAITING DECISION BY THE BOARD OF APPEALS

STCV Information on status: appeal procedure

Free format text: BOARD OF APPEALS DECISION RENDERED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION COUNTED, NOT YET MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED