WO2025041183A1 - Optical device with an augmented reality management system
- Publication number
- WO2025041183A1 (PCT/IT2024/050170)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- optical device
- images
- command interface
- additional components
- command
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0334—Foot operated pointing devices
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/20—Surgical microscopes characterised by non-optical aspects
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B2090/364—Correlation of different images or relation of image positions in respect to the body
- A61B2090/365—Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/372—Details of monitor hardware
- A61B90/50—Supports for surgical instruments, e.g. articulated arms
- A61B2090/502—Headgear, e.g. helmet, spectacles
Abstract
An optical device (100) with an augmented reality management system comprising an optical instrument for producing and displaying real images, virtual images and computer-generated additional components, and one or more of the following command interfaces: a visual command interface, a toggle command interface, a manual command interface, a touch-free manual command interface, a foot command interface and an acoustic command interface. The command interfaces are suitable for the management of and interaction with the real images, the virtual images and the additional components.
Description
OPTICAL DEVICE WITH AN AUGMENTED REALITY MANAGEMENT SYSTEM
The present invention concerns an optical device with an augmented reality management system.
Field of the invention
The invention belongs to the technical field of machinery and technology in hospital and non-hospital operating rooms, particularly in the technical field of operating microscopes and similar.
Prior art
The technological evolution of vision systems for augmented reality allows this technology to be increasingly used in various fields and sectors, whether industrial, mechanical, healthcare-related and/or typical of the everyday life of every individual.
The technological sector related to virtual and augmented reality is gaining in importance, and is being used in a multitude of different applications, starting with video games.
One particular field of application in which virtual and augmented reality and related technologies are gaining ground, and will be even more widely used in the near future, is the surgical field, which utilises instruments such as endoscopes, laparoscopes, operating stereo-microscopes and similar technologies.
Virtual and augmented reality is an emerging technology and is also increasingly being used in other devices such as helmet or goggle visors.
The surgical world, in recent decades, has increasingly turned to the use of visual systems that allow a magnified and improved view of the surgical field in order to have a better view of anatomical structures where the type of intervention requires it. This development enables a reduction in iatrogenic damage, postoperative complications and healthcare costs.
New technologies to aid surgery include the possibility of using virtual and/or augmented reality as an additional visual aid for better surgery through, for example, the use of virtual helmets or visors.
The possibility of integrating images from the computer with those from the operating field is opening up a new research frontier for surgery.
In fact, the possibility of observing virtual images and other additional components in the context of the real world certainly brings a number of advantages for users, in this case surgeons as an example.
However, these images need to be handled appropriately, especially if the operator lacks, or is unable to use, a keyboard or mouse, as in the case of a surgeon performing surgery on a patient.
Furthermore, unlike the images displayed on a computer, which are two-dimensional and positioned in an X-Y plane, those viewed within an optical surgical instrument such as a stereo-microscope, or within any other instrument for augmented reality, wearable or not, are stereoscopic images, and as such have depth and are therefore visible in an X-Y-Z space.
Therefore, in addition to the X and Y axes, it will also be useful and necessary to handle them in the Z axis.
Currently, there are no systems for the management of virtual and augmented reality within instruments such as operating microscopes or similar and, in fact, the management of the images is entrusted to an external operator with a mouse or keyboard.
This results in the inability of the surgeon who is viewing the images to manage these images himself, and of course the need for a person to do this for him.
The management of X-Y axes in general and the Z-axis in particular for interaction with virtual images or additional components will become of paramount importance, not only for its use through other medical devices such as the stereomicroscope, stereo-endoscope, laparoscope and helmet or goggle visors for augmented reality in surgery, but also for all other non-medical applications such as architecture, education, automotive and aeronautics.
The relevant prior art also comprises patent applications US2005206583A1 and WO2021226134A1.
Aim of the invention
It is therefore an aim of the present invention to provide an optical device with an augmented reality management system capable of solving the above-mentioned drawbacks and critical issues.
A further aim of the present invention is to provide an optical device with an augmented reality management system capable of handling the augmented and/or virtual reality applicable in a multitude of fields and sectors.
A further aim of the present invention is to provide an optical device with an augmented reality management system that can be used in a surgical and operative context by an operator or surgeon.
A further aim of the present invention is to provide an optical device with an augmented reality management system that allows images visible through the eyepieces of the optical device to be managed while keeping the hands free.
Further aim of the present invention is to provide an optical device with an augmented reality management system that can manage augmented reality images in both two dimensions and three dimensions.
A further aim of the present invention is to realise an optical device with an augmented reality management system for handling a plurality of additional components superimposed on or combined with the real or virtual images visible to the operator or surgeon.
These and other aims are achieved by an optical device according to the attached independent claim.
Further detailed technical features are set out in the attached dependent claims.
Brief description of the figures
The present invention will now be described, by way of example but not limitation, according to some preferred embodiments thereof, and with the aid of the attached figures, in which:
- figures 1 and 2 are two schematic representations of vision in a three- dimensional X-Y-Z plane by an operator looking through eyepieces;
- figure 3 is a perspective view of an optical device with a manual touchpad command interface according to the present invention;
- figure 4 is a perspective view of an optical device with a joystick hand command interface according to the present invention;
- figures 5A, 5B and 5C are perspective views of an optical device with a touch-free manual command interface according to the present invention;
- figure 6 is a perspective view of two pedals for using a foot command interface according to the present invention;
- figure 7A illustrates the rotation movement on the Z axis of one of the pedals of figure 6;
- figure 7B illustrates the rotation movement on the X axis of one of the pedals of figure 6;
- figure 7C illustrates the movement of rotation on the Y axis of one of the pedals of figure 6;
- figure 7D illustrates the sliding movement along the X axis of one of the pedals of figure 6;
- figure 7E illustrates the sliding movement along the Y axis of one of the pedals of figure 6;
- figure 7F illustrates the free sliding movement of one of the pedals of figure 6.
Detailed description
Referring to the above-mentioned figures, an embodiment of an optical device with an augmented reality management system according to the invention is illustrated.
Referring to the above-mentioned figures, the optical device with an augmented reality management system object of the invention, in a first embodiment thereof according to the present invention, is shown with numerical reference 100 and comprises an optical instrument for producing and displaying real images, images belonging to virtual and/or augmented reality and additional computer- generated components.
Advantageously, the optical device 100 can be used in different sectors and for different applications. The optical device 100 may be a traditional or innovative device and may also be a helmet or goggle viewer or any other related technology.
By way of example, the optical device 100 may be used in the medical field and may be an endoscope, a laparoscope, a stereo-microscope, an operating microscope, etc.
In the following of the present description, for convenience of exposition, reference will be made by way of example to an operating microscope used in medicine and surgery by a surgeon, but it is understood that other optical devices also fall within the inventive concept of the present invention.
The optical device 100, in the case of the surgical microscope, traditionally comprises eyepieces, an optical body, a binocular body and a supporting mechanical body or arm.
The surgeon or operator positions himself with his eyes on the eyepieces, which allow the surgeon or operator to look at the area where the optical instrument is directed and pointed.
Advantageously, the surgeon, looking into the eyepieces, can have a two- dimensional field of view, for example on an X-Y plane, or three-dimensional, for example on an X-Y-Z space.
Still advantageously, the optical device 100 comprises integrated instrumentation for creating and viewing virtual and/or augmented reality images that allows the surgeon to take advantage of additional components, such as for example other images or digital elements, to be facilitated in operations or other delicate or precision actions on patients.
Even more advantageously, the optical device 100 also comprises elements and/or devices for handling virtual images and supplementary components.
The virtual images and additional components can be integrated and/or superimposed on the real images detected by the optical device in either two- or three-dimensional form.
More in detail, virtual images and additional components can be present either in the X-Y plane, in the case of a two-dimensional field of view, or in the X-Y- Z space in the case of a three-dimensional field of view. In the three-dimensional case, the additional components can be superimposed or positioned more or less deeply than the real images.
In particular, the additional components are generated and managed by a computer and are controlled by the surgeon by means of one or more command interfaces that allow convenient and easy interaction between the surgeon and the additional components themselves.
Advantageously, such command interfaces allow the additional components to be activated or deactivated, a cursor to be controlled for their selection, and a whole series of operations similar to actions on a computer.
More advantageously, through the use of the command interfaces, a surgeon will be able to simultaneously look at the actual images taken by the optical device 100, manage additional components and operate the functions of a computer while
keeping his or her hands free to carry out operations or actions on the patient on whom he is performing the operation.
More in detail, the optical device 100 may comprise one or more of the following command interfaces:
- a visual command interface;
- a toggle command interface;
- a manual command interface;
- a touch-free manual command interface;
- a foot command interface;
- an acoustic command interface.
Advantageously, each command interface may be used individually or in combination with one or more of the other command interfaces in order to optimise the possibility of controlling the additional components from the computer and seen within the eyepieces of the optical device 100.
In a first embodiment of the present invention, the management of the augmented reality and additional components takes place through a visual command interface capable of performing a simultaneous check of the angle of convergence of the two pupils of the surgeon.
In particular, the visual interface comprises a system of prisms, mirrors, one or more cameras, lateral and/or coaxial sensors integrated with the optical device 100 that allow instantaneous verification of the displacement of the pupils and the eyes of the surgeon positioned on the eyepieces (if present), determining the angle of convergence of the pupils.
In a particular embodiment, the control of pupil displacement occurs through a camera capable of filming and recording its movements. In particular, a possible filming of eye movements can be carried out through the same optical path of the microscope, by inserting beam-splitters in this optical path to intercept the image coming from the eyepieces of the optical device rather than from the operating field.
In this way, the eyepiece acts as a lens pointing towards the eyes of the surgeon and thus towards his pupil.
In another embodiment, the pupil is framed through micro-cameras positioned anterior and lateral to the eyepieces and pointed directly at the pupil. In this way, the shot will not be taken within the optical path of the microscope but will still be able to intercept the eye movements of the surgeon.
More in detail, in this embodiment, the eyepieces are equipped with spacer rings on which the surgeon positions his eyes.
Advantageously, these spacer rings prevent external ambient light from penetrating into the eyepieces.
In a further embodiment, as an alternative or in combination with the microcameras on the lateral portions of the eyepieces directed towards the pupil of the surgeon, it is possible to place a beam-splitter in front of the eyepieces, which, placed in front of the eye, allows additional cameras to view the eye while naturally allowing images from the surgical field and augmented reality devices to pass through. Specifically, the cameras point at the beam-splitter, which intercepts the surgeon's eye movements and records the movements of the pupil.
With reference to figures 1 and 2, a specific software translates the detected angle of convergence into a focal distance representing where in the Z-axis (depth) the surgeon is looking.
The determination of the focal distance by the software enables a specific interaction between the real images, the virtual images and additional components positioned in the volume of the visual field of the surgeon.
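By way of a non-limiting illustration, the conversion can be modelled as simple triangulation: with a symmetric vergence angle θ and an inter-pupillary baseline b, the fixation depth is approximately Z = b / (2·tan(θ/2)). The minimal Python sketch below shows the idea; the function name, the default 63 mm baseline and the symmetric-vergence assumption are illustrative, not taken from the patent.

```python
import math

def focal_distance_mm(convergence_angle_deg: float, ipd_mm: float = 63.0) -> float:
    """Estimate the fixation depth (Z axis) from the convergence angle of
    the two pupils, assuming symmetric vergence: the two visual axes and
    the inter-pupillary baseline form an isosceles triangle, so
    Z = ipd / (2 * tan(theta / 2))."""
    theta = math.radians(convergence_angle_deg)
    if theta <= 0:
        return float("inf")  # parallel gaze: fixation at optical infinity
    return ipd_mm / (2.0 * math.tan(theta / 2.0))

# A 7.2 degree convergence with a 63 mm baseline puts fixation ~500 mm away.
print(round(focal_distance_mm(7.2), 1))
```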
In the illustrative case of an operating microscope, it is possible to have a screen containing the real images in addition to a plurality of additional components, for example virtual images or computer folders, positioned both in a horizontal X-Y plane and in the Z axis, that is at different depths and possibly overlapping each other.
By pointing the eyes on a particular additional component, for example a folder, the visual command interface will allow the recognition of the focal plane in which the surgeon is looking, so that a cursor can be placed on the folder belonging to that plane and possibly make it more evident by illuminating and/or diminishing the illumination of the folders above and below it.
The same cursor can then be moved, through eye movements on that plane, in order to subsequently exercise click functions with the recognition and interpretation of blinks.
In subsequent actions, the surgeon can move the cursor to other planes in order to access additional components positioned at different depths.
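A minimal sketch of how such focal-plane selection could work in software, assuming each additional component is tracked as a record with a depth coordinate and a brightness value; the field names, tolerance and dimming factor are illustrative assumptions.

```python
def select_component(fixation_z: float, components: list[dict], tol_mm: float = 20.0):
    """Pick the additional component whose depth plane is nearest to the
    estimated fixation depth, highlight it and dim the others; return None
    (leaving everything dimmed) when the gaze rests on the real field."""
    target = min(components, key=lambda c: abs(c["z"] - fixation_z), default=None)
    if target is None or abs(target["z"] - fixation_z) > tol_mm:
        return None
    for c in components:
        c["brightness"] = 1.0 if c is target else 0.3
    return target
```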
Advantageously, if the surgeon's gaze is directed towards the operating or visual field of the real world, and thus to images that do not belong to the additional
components, these will remain unaltered, remaining inactive and dimly lit, so that they can be traversed by the surgeon's own view.
In other words, the visual command interface allows the surgeon to manage, move and operate a cursor and interact with the additional components, solely through the movement of his eyes, similar to what he could do with a mouse on a computer.
Advantageously, in order to allow the surgeon to completely clear or recall the additional components, an on/off additional component is inserted at a chosen location, for example at the periphery of the field of vision, which, when selected, switches all additional components on or off by displaying or removing them.
This on/off additional component has the function of a switch and can have any shape, size and position according to requirements, preferences and possible customisation.
As an example, in the case of an operating microscope, it will be small and positioned in the side fields of the field of view of the surgeon.
The on/off additional component can be present in any command interface.
In a simplified embodiment, the visual command interface is also able to follow the surgeon's gaze through a single eye by interpreting its movement in the X and Y axes of a two-dimensional plane, making the software understand its horizontal positioning above an additional component. In this case, however, the point observed in the Z-axis cannot be identified, and the software will not be able to distinguish any overlapping images and additional components.
In this embodiment, the cursor is automatically positioned in the plane in which the surgeon is looking, this plane being known.
Advantageously, the positioning of any image or contribution, and thus also of the cursor, within the optical device 100 follows the logic of parallax.
As can be seen in figure 1 or 2, this logic dictates that if two points are positioned in the centre of the two individual eyepieces, the fused image in the observer's brain will be seen in a specific plane which can be called, for simplification, plane 0.
If two other points are positioned closer together, they will appear in a plane closer to the observer and above plane 0. If on the contrary two points are positioned
further apart, the opposite will happen and they will be seen in a plane below plane 0 and further away from the observer.
The software, knowing the position, in pixels, millimetres or other units of measurement, of each additional contribution as well as their distance from each other, determines exactly where this image will be reproduced in the stereoscopic fusion in relation to those above and below.
The software will attribute to the cursor in the two displays for the left and right eye reciprocally that exact distance ratio, sliding it in that specific plane.
Advantageously, even more approximate placements above or below said plane are still considered effective for the present invention.
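As a hedged illustration of this parallax logic, the sketch below derives mirrored horizontal offsets for the left- and right-eye copies of the cursor; the linear disparity model and the gain constant are simplifying assumptions, not the patent's own calibration.

```python
def cursor_offsets_px(target_plane: float, gain_px_per_plane: float = 4.0):
    """Return horizontal pixel offsets for the cursor copy in the left and
    right displays so that stereoscopic fusion places it on `target_plane`
    (0 = reference plane, positive = closer to the observer). Moving the
    two copies toward each other lifts the fused cursor above plane 0."""
    disparity = target_plane * gain_px_per_plane  # toy linear model
    # Left copy shifts right, right copy shifts left, for positive disparity.
    return (+disparity / 2.0, -disparity / 2.0)
```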
In a further embodiment, the optical device 100 comprises a toggle command interface capable of detecting, via sensors, electrical impulses from muscles at the leg and/or ankle level.
In this way, the recognition of the electrical potential evoked by each individual muscle is translated into a computer action for the management of real and virtual images and additional components.
Specifically, the nerve impulse of the surgeon determines an electrical potential that sensors applied at specific points along the course of the nerves are able to record in order to assess the electrical variations and attribute their origin and intensity.
Advantageously, dedicated software attributes specific actions to the recording of the nerve impulse, such as cursor management, zooming, focusing, etc.
Advantageously, the toggle command interface comprises a wearable electronic device that extends sensors over the dorsal surface of the surgeon's foot.
Still advantageously, the toggle command interface may act autonomously or interconnected with the visual command interface and/or one or more of the other command interfaces set out below.
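A minimal sketch of such an impulse-to-action translation, assuming rectified EMG traces arrive per named channel; the channel names, threshold and action table are illustrative assumptions rather than the patent's mapping.

```python
import statistics

# Hypothetical mapping of EMG channels (sensor sites on the leg/ankle)
# to computer actions; all names and values are illustrative.
ACTIONS = {"tibialis_anterior": "cursor_up", "peroneus": "cursor_right",
           "gastrocnemius": "click", "soleus": "zoom_in"}

def detect_actions(samples: dict[str, list[float]], threshold_uv: float = 80.0):
    """Fire the action of every channel whose rectified mean amplitude
    exceeds the threshold, attributing origin (channel) and intensity."""
    fired = []
    for channel, trace in samples.items():
        amplitude = statistics.fmean(abs(v) for v in trace)
        if amplitude > threshold_uv and channel in ACTIONS:
            fired.append(ACTIONS[channel])
    return fired
```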
Referring to figures 3 and 4, a further embodiment of the present invention involves the use of a manual command interface.
Such a manual command interface may be used in the event that the surgeon has the possibility of momentarily interrupting the operation phase in order to be able to consult the virtual images and additional contributions within the eyepieces.
Advantageously, the manual command interface comprises a manually operating device, for example in the form of a handle, which is solidly coupled to the body or arms of the optical device 100 and which allows control of the cursor via trackballs, trackpads, touchpads 30 or joysticks 40 installed thereon.
The hand-operated device allows the control, management and interaction of the real, virtual and additional images and the movement of the cursor analogously to what is possible with a mouse in a computer.
Advantageously, the hand-operated device is equipped with one or more left and right click buttons corresponding to the buttons of a mouse.
With reference to figures 5A, 5B and 5C, a further embodiment of the present invention is illustrated, wherein the optical device 100 comprises a touch-free manual command interface having a sensor using, for example, Lidar technology or cameras and computer vision software capable of recognising and distinguishing individual hand finger movements.
Such a sensor, applied for example to the body of the optical device 100 and connected to the computer, allows the surgeon to control the computer while keeping his eyes inside the eyepieces and only moving the fingers of the hand in the vicinity of the sensor.
Through the movement of the hand it will then be possible to move the cursor and interact with the real and virtual images and additional components generated by the computer and visible to the surgeon.
The touch-free hand command interface also allows the cursor to be moved by simple finger or hand movements.
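One possible mapping from tracked fingertip positions to cursor movement and clicks is sketched below; the normalised landmark format and the pinch-as-click convention are assumptions, and the hand-tracking pipeline itself (Lidar- or camera-based) is deliberately left outside the sketch.

```python
def gesture_to_cursor(index_prev, index_now, scale: float = 800.0):
    """Map the index-fingertip displacement between two frames to a cursor
    displacement; landmarks are hypothetical (x, y) tuples normalised to
    [0, 1], as a hand-tracking pipeline might produce."""
    dx = (index_now[0] - index_prev[0]) * scale
    dy = (index_now[1] - index_prev[1]) * scale
    return dx, dy

def is_pinch(thumb_tip, index_tip, threshold: float = 0.04) -> bool:
    """Treat a thumb-to-index pinch as a mouse click (illustrative choice)."""
    return ((thumb_tip[0] - index_tip[0]) ** 2 +
            (thumb_tip[1] - index_tip[1]) ** 2) ** 0.5 < threshold
```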
In a further embodiment, the optical device 100 comprises a foot command interface comprising one or more foot pedals 60 (figure 6) with different movement directions, comprising:
- tilting movement ("toe-to-heel");
- antero-posterior translation;
- right and left rotation in a plane, about the vertical axis near the centre of the foot support;
- 3D rotation about a centre close to the centre of support of the foot.
The movements of the pedals 60 can be seen in figures 7A-7F.
In more detail, the foot command interface comprises four micro switches for toe, heel, right and left respectively. These micro switches can be programmed with
keyboard functions, mouse button functions or other computer-related functions for real, virtual, additional component and cursor movement.
Advantageously, the 3D rotation movement is linked to a lock/unlock mechanism, for example electromagnetic, which allows it to be activated or deactivated.
In particular, through the electromagnetic release, the pedal can operate in 3D mode as a slider or for other functions that require it. Specifically, the 3D movement performed after the release of the pedal allows the pedal to be used as a joystick or trackball; this is done with the use of specific digital or analogue sensors, such as those of a joystick.
Still advantageously, the foot command interface can also include acoustic or vibrational components with the function of feedback and warning for the surgeon.
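A hedged sketch of how the four micro switches and the lock/unlock state could be dispatched in software; the event names and default bindings are illustrative and would in practice be user-programmable, as described above.

```python
from enum import Enum, auto

class PedalEvent(Enum):
    TOE = auto(); HEEL = auto(); RIGHT = auto(); LEFT = auto()

# Illustrative default bindings for the four micro switches.
BINDINGS = {PedalEvent.TOE: "left_click", PedalEvent.HEEL: "right_click",
            PedalEvent.RIGHT: "next_component", PedalEvent.LEFT: "prev_component"}

def handle_pedal(event: PedalEvent, locked_3d: bool) -> str:
    """Dispatch a micro-switch press; when the electromagnetic lock is
    released the pedal acts as a joystick/trackball, so presses are routed
    to that mode instead."""
    if not locked_3d:
        return "3d_joystick_mode"
    return BINDINGS.get(event, "noop")
```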
In a further embodiment, the optical device 100 comprises an acoustic command interface comprising a microphone and one or more loudspeakers for example integrated in the optical device 100.
More in detail, the microphone listens for voice commands used by the surgeon to control and manage the virtual images and additional components, as well as all possible applications related to the use of the microscope such as, for example, use of the camera and control of the illumination, focus, zoom and working distance, traditionally controlled manually or by foot.
Furthermore, the use of the microphone can be employed for a facilitated relationship with the computer in general and with an artificial intelligence used as a command and dictation interface.
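As an illustration only, a recognised utterance could be routed to microscope functions through a phrase table like the one below; the vocabulary and the hypothetical `scope` control object are assumptions, not part of the patent.

```python
# Illustrative phrase-to-function table for the acoustic interface.
VOICE_COMMANDS = {
    "zoom in": lambda scope: scope.set_zoom(scope.zoom * 1.2),
    "zoom out": lambda scope: scope.set_zoom(scope.zoom / 1.2),
    "more light": lambda scope: scope.set_illumination(scope.illumination + 0.1),
    "focus": lambda scope: scope.autofocus(),
}

def dispatch(transcript: str, scope) -> bool:
    """Run the first registered command found in a recognised utterance;
    `scope` stands in for a microscope-control object."""
    phrase = transcript.lower().strip()
    for command, action in VOICE_COMMANDS.items():
        if command in phrase:
            action(scope)
            return True
    return False
```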
Advantageously, the microphone is activated or deactivated via a switch or button located on the optical device 100.
Advantageously, one or more speakers are inserted into the optical device 100 for listening to various noises and sounds.
Such noises or sounds may be noises due to surgery, music, sounds from outside the operating room, sounds for communication with other surgeons or operators, etc.
In addition, such sounds or noises may also be related to computers or communication with artificial intelligence.
Advantageously, such loudspeakers can also be activated via a switch or button.
Loudspeakers and other acoustic and vibrational devices may also be used to send feedback messages to the surgeon as a warning or regarding commands executed.
Advantageously, the optical device 100 may comprise cameras connected to the computer or network to allow for the analysis and evaluation of real, virtual images and additional contributions from an artificial intelligence that can offer its input to the surgeon in real time via the acoustic command interface.
Such artificial intelligence can be inserted into one or both optical paths of the optical device 100.
Furthermore, the artificial intelligence is capable of managing the images, additional components and functionality of the optical device 100 itself.
In particular, the artificial intelligence is able to answer questions asked vocally by the surgeon in order to interact with it, and is connected not only to the microphone but also to the cameras inserted in the optical paths of the optical device 100.
In this way, the artificial intelligence not only sees the images of the surgical field but also knows, relates and, if necessary, manages those coming from the augmented reality computer.
Advantageously, the artificial intelligence can also be connected to the other previously described command interfaces used individually or in combination.
Advantageously, the optical device 100 may comprise and integrate one or more of the previously described interfaces.
Again advantageously, each interface, in addition to interacting with real images, virtual images and additional components, can interact with and manage traditional functions of an optical device such as focus, zoom, light intensity, camera, etc.
Equally advantageously, the optical device 100 may possess special functions to manage digital radiological and/or anatomical images that may be superimposed or combined with the real images taken by the optical device 100 itself.
In particular, the handling of digital images can be done manually through the use of software within the application bar of the microscope. In this case, a specific
icon activates the flipping and inverting of the image at one or two separate moments.
Furthermore, semi-automatic systems can, once the position of the patient and the operator with the microscope is known, automatically adapt the images for the specific situation.
Advantageously, artificial intelligence will be able to interpret the digital images and understand the needs of the surgeon by asking the appropriate questions if necessary.
However, the viewing of digital X-ray images, as well as other anatomical images from digital scans of the patient, may be inconsistent with the actual images of the operating field, leading to confusion and risk of misjudgement on the part of the surgeon. Therefore, digital images may need to be flipped upside down and mirrored in order to be correctly usable and visible to the surgeon.
As an example, consider the situation where the surgeon is positioned at twelve o'clock in relation to the patient's head and has to operate on the teeth or roots of the upper incisors. The view of the operating field in this case presents incisal images of the teeth upwards and the root apices downwards. In order to make the normal digital X-ray image of the aforementioned teeth coherent, it will be necessary to invert and mirror it, thus making it superimposable on the real images of the surgical field detected by the optical device 100 and visible to the surgeon.
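A minimal sketch of that adaptation step, assuming the radiograph is available as a NumPy array; the flip-and-mirror combination shown corresponds to the twelve o'clock example and is illustrative only.

```python
import numpy as np

def adapt_radiograph(image: np.ndarray, surgeon_at_twelve: bool) -> np.ndarray:
    """Invert (top-to-bottom) and mirror (left-to-right) a digital X-ray so
    it becomes superimposable on the live view; other operator positions
    would use other flip combinations."""
    if surgeon_at_twelve:
        return np.fliplr(np.flipud(image))
    return image
```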
This operation of adapting the images and additional components to the situation in the operating field can be performed by the surgeon through the command interfaces, but also autonomously by an artificial intelligence on the basis of its analysis of the real images from the operating field, or by one or more automatisms based on the positions of the surgeon and the patient.
Advantageously, the optical device with augmented reality management system is compatible with any operating system and device such as macOS, Windows, Android, Linux, iOS, etc.
In further embodiments, the optical device with augmented reality management system is instead capable of working independently of any operating system.
From the description given, the characteristics of the optical device integrated with an augmented reality management system, which is the subject matter of the invention, are clear, as are its advantages.
Finally, it is clear that numerous other variations may be made to the device in question without departing from the principles of novelty inherent in the inventive idea, just as it is clear that, in the practical implementation of the invention, the materials, shapes and sizes of the illustrated details may be varied as required and replaced with technically equivalent ones.
Where the features and techniques mentioned in any of the claims are followed by reference marks, those reference marks have been included for the sole purpose of increasing the intelligibility of the claims and, accordingly, those reference marks have no limiting effect on the interpretation of each element identified by way of example by those reference marks.
Claims
1. An optical device (100) with an augmented reality management system comprising an optical instrument for producing and viewing real images, virtual images and additional components generated by a computer and a plurality of command interfaces for the management of and interaction with said real images, said virtual images and said additional components, characterized in that said command interfaces comprise a visual command interface and an acoustic command interface, wherein said visual command interface is configured to verify, by means of software, the movement of the pupils and the eyes of a user by determining the angle of convergence of said pupils for managing and interacting with said real images, said virtual images and said additional components, and wherein said acoustic command interface comprises at least one microphone and one or more loudspeakers, integrated in said optical device (100) and configured to communicate with an artificial intelligence for managing and interacting with said real images, said virtual images and said additional components.
2. The optical device (100) according to claim 1 characterized in that said real images, said virtual images and said additional components are produced and visible in a two-dimensional visual field or in a three-dimensional visual field.
3. The optical device (100) according to claim 1 or 2 characterized in that said command interfaces are configured to control a cursor for interacting with said real images, said virtual images and said additional components.
4. The optical device (100) according to claim 3 characterized in that said command interfaces further include a toggle command interface that comprises a wearable electronic device comprising a series of sensors placed on the dorsal surface of a foot of the user, so as to detect the electrical impulses of the muscles at the level of the user's leg and/or ankle for managing and interacting with said real images, said virtual images and said additional components, and for moving said cursor.
5. The optical device (100) according to claim 3 characterized in that said command interfaces further include a manual command interface comprising a manual operating device, for example in the form of a handle, rigidly coupled to the body or arms of the optical device (100) and permitting control of said cursor by means of a trackball, trackpad, touchpad (30) or joystick (40) installed on said optical device (100).
6. The optical device (100) according to claim 3 characterized in that said command interfaces further include a touch-free manual command interface that comprises a sensor for recognizing and distinguishing the movements of the fingers of the user for managing and interacting with said real images, said virtual images and said additional components, and for moving said cursor.
7. The optical device (100) according to claim 3 characterized in that said command interfaces further include a foot command interface that comprises one or more pedals (60) with different movement directions, said foot command interface acting on microswitches for managing and interacting with said real images, said virtual images and said additional components, and for moving said cursor.
8. The optical device (100) according to claim 7 characterized in that said foot command interface comprises a lock/unlock mechanism for activating or deactivating a 3D movement by means of said one or more pedals (60).
9. The optical device (100) according to claim 1 characterized in that said optical device (100) has one or more functions for managing and positioning digital images superimposed on or combined with said real images.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| IT102023000017427 | 2023-08-22 | | |
| IT202300017427 | 2023-08-22 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025041183A1 (en) | 2025-02-27 |
Family
ID=88504756
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IT2024/050170 | Optical device with an augmented reality management system | 2023-08-22 | 2024-08-19 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025041183A1 (en) |
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20050206583A1 (en) * | 1996-10-02 | 2005-09-22 | Lemelson Jerome H | Selectively controllable heads-up display system |
| US20220417492A1 (en) * | 2014-03-19 | 2022-12-29 | Intuitive Surgical Operations, Inc. | Medical devices, systems, and methods using eye gaze tracking for stereo viewer |
| WO2021226134A1 (en) * | 2020-05-04 | 2021-11-11 | Raytrx, Llc | Surgery visualization theatre |
Non-Patent Citations (3)
| Title |
|---|
| BAUTISTA, LUIS et al.: "Usability test with medical personnel of a hand-gesture control techniques for surgical environment", International Journal on Interactive Design and Manufacturing (IJIDeM), vol. 14, no. 3, 12 August 2020, pages 1031-1040, ISSN 1955-2513, DOI: 10.1007/S12008-020-00690-9, XP037227333 |
| SUZUKI, RYO et al.: "Augmented Reality and Robotics: A Survey and Taxonomy for AR-enhanced Human-Robot Interaction and Robotic Interfaces", arXiv, Cornell University Library, 7 March 2022, DOI: 10.1145/3491102.3517719, XP091177831 |
| SILVA, JENNIFER N. AVARI et al.: "Design Considerations for Interacting and Navigating with 2 Dimensional and 3 Dimensional Medical Images in Virtual, Augmented and Mixed Reality Medical Applications", 3 July 2021, pages 117-133, XP047601703 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24782651; Country of ref document: EP; Kind code of ref document: A1 |