US20200386982A1 - Magnification glasses with multiple cameras - Google Patents
- Publication number
- US20200386982A1 (application US 16/625,780)
- Authority
- US
- United States
- Prior art keywords
- camera
- glasses
- processor
- view
- magnification
- Prior art date
- Legal status
- Granted
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B25/00—Eyepieces; Magnifying glasses
- G02B25/002—Magnifying glasses
- G02B25/004—Magnifying glasses having binocular arrangement
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/24—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for the mouth, i.e. stomatoscopes, e.g. with tongue depressors; Instruments for opening or keeping open the mouth
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/46—Arrangements for interfacing with the operator or the patient
- A61B6/461—Displaying means of special interest
- A61B6/462—Displaying means of special interest characterised by constructional features of the display
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/46—Arrangements for interfacing with the operator or the patient
- A61B6/461—Displaying means of special interest
- A61B6/466—Displaying means of special interest adapted to display 3D data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/50—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications
- A61B6/51—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment specially adapted for specific body parts; specially adapted for specific clinical applications for dentistry
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
Description
- The present invention relates generally to magnification glasses and more specifically to magnification glasses that comprise multiple cameras each with the same working distance but providing different magnification.
- For many medical and industrial applications it is useful for the practitioner to have a magnified view of the work surface or patient. For example, dentists often need to magnify the area being worked on in the patient's mouth. Other practitioners needing close range magnified views include dermatologists, surgeons, gemologists viewing gems, and agronomists inspecting leaves and other plant parts. In the past this was most often achieved by wearing a headset or pair of glasses with optical loupes attached. In use, the loupes are generally swiveled into position in front of the practitioner's eyes when magnification is required and then swiveled away when not required. Achieving different levels of magnification usually requires swapping the loupe lenses. Optical loupes are relatively heavy due to their glass lenses and stick out once in position, occupying 3-10 cm of space in front of the practitioner's eyes. Further, they are often fixed inside the glasses, permanently obstructing the view and requiring a change of glasses for different magnifications.
- An alternative to the optical loupe is a digital loupe comprising a digital camera that feeds an image to a small screen. As with the optical loupe, the camera/screen combination can be mounted onto a headband or mounted as part of the glasses in front of the practitioner's eyes when magnification is required. The image from the digital camera can then be digitally zoomed for greater magnification. The disadvantage of digital zoom is that the quality of the image suffers as the zoom is increased, due to the reduced number of pixels in use, and therefore the range of magnification is limited. The use of optical magnification combined with a digital camera is possible, but it requires auto-focus digital cameras, which are more expensive and heavier than their fixed-focus equivalents.
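- To make the pixel-count penalty of digital zoom concrete, the following back-of-the-envelope relation can be used (an illustrative calculation, not a figure stated in the patent; the 13 MP value is the sensor resolution cited in the detailed description below):

```latex
% Effective pixel count left on screen after a centre-crop digital zoom of factor z
\[
  N_{\mathrm{eff}} \;=\; \frac{N_{\mathrm{sensor}}}{z^{2}},
  \qquad \text{e.g.}\qquad
  \frac{13\ \mathrm{MP}}{5^{2}} \;\approx\; 0.5\ \mathrm{MP}.
\]
```

Spanning a 2x-10x range from a single fixed 2x camera would therefore require roughly 5x digital zoom and leave only about half a megapixel in use, which is the image-quality limitation this paragraph describes.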
- The background art therefore does not teach or suggest a system for close range magnification glasses using cost-effective digital cameras that can provide an extended range of magnification. It would also be desirable to be able to capture and transmit the view as seen through the magnification device.
- The presently claimed invention provides wearable glasses featuring variable magnification of close range subjects by using a camera array comprising a plurality of cost-effective fixed focus, fixed distance, fixed magnification digital cameras, each providing a different magnification while maintaining the same working distance. The image captured by the camera array is displayed to the user using viewing assemblies that are one of: screens placed in front of the glasses, image projection onto the glasses lenses, or a headset comprising viewing screens. The image displayed may be any of: the magnified view captured by the camera array, virtual reality, augmented reality, or any of these overlaid with relevant textual or image data. Optionally, infrared images or images lit with ultraviolet light are provided.
- The use of multiple fixed focus cameras, each providing output images of different magnification for a fixed working distance, lowers the cost of implementation, results in a magnification device with a more compact and lighter form factor, and increases the magnification range for digital-camera-based magnification glasses. The close range fixed working distance is preferably between 10 and 40 cm.
- Each of the fixed focus cameras in the array has a lens providing different maximum magnification allowing for fast switching between different levels of magnification. In an exemplary and preferable embodiment a first camera lens provides up to 6× magnification and a second camera lens provides up to 10× magnification. Alternatively a first camera lens provides up to 5× magnification and a second camera lens provides up to 10× magnification. Alternatively any combination of cameras is provided for the magnification and working distance required by the particular application. Optionally more than two cameras may be provided per array.
- The images captured by each camera in the array are provided to a processing unit that outputs a selected image from one of the cameras depending on the magnification required. Digital magnification of the output image is preferably provided from the camera with the lowest magnification lens up to the camera with the highest magnification lens, with a seamless switch of the output image between the image captured by the first camera and the image captured by the second camera.
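- As an illustration of the selection-and-crossover scheme described above, the following sketch (hypothetical Python, not taken from the patent; the camera names and the 6x/10x values are assumptions borrowed from the examples given earlier) picks the highest optical magnification that does not exceed the requested level and covers the remainder with digital zoom:

```python
from dataclasses import dataclass

@dataclass
class FixedCamera:
    name: str
    optical_mag: float  # fixed optical magnification of this camera's lens

# Assumed example array: a 6x lens and a 10x lens with the same working distance.
CAMERA_ARRAY = [FixedCamera("cam_a", 6.0), FixedCamera("cam_b", 10.0)]

def select_camera(requested_mag: float) -> tuple[FixedCamera, float]:
    """Return (camera, digital_zoom) for a requested magnification level.

    The camera with the largest optical magnification not exceeding the
    request is chosen; any remainder is covered by digital zoom, so the
    displayed magnification changes continuously across the crossover point.
    """
    chosen = min(CAMERA_ARRAY, key=lambda c: c.optical_mag)
    for cam in sorted(CAMERA_ARRAY, key=lambda c: c.optical_mag):
        if cam.optical_mag <= requested_mag:
            chosen = cam
    digital_zoom = max(1.0, requested_mag / chosen.optical_mag)
    return chosen, digital_zoom

# 8x is served by the 6x camera with ~1.33x digital zoom; at 10x the output
# switches (seamlessly, from the wearer's point of view) to the 10x camera.
print(select_camera(8.0))
print(select_camera(10.0))
```

With this policy the digital zoom needed between the two lenses never exceeds their ratio (10/6, about 1.7x in this example), far less than would be needed to span the same range with a single fixed lens.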
- Preferably the captured image is displayed on one or both of the left and right viewing assemblies. Alternatively the camera array may be duplicated, with each array feeding a respective left or right viewing assembly so as to provide magnified stereoscopic vision. Control of the magnification is via any suitable control device, including but not limited to a joystick, a foot pedal, speech control, or a switch mounted on or attached to the glasses. The image captured by the cameras can preferably be saved, for example to non-volatile storage, or transmitted for display on a local or remote display. The image displayed can preferably be overlaid with relevant textual or image data. The displays can preferably be divided into a bifocal arrangement for display of different views or overlay data on different parts of each display.
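- A minimal sketch of the bifocal overlay arrangement mentioned above (hypothetical code using OpenCV and NumPy; the frame source, overlay text, and split proportion are assumptions, not details specified by the patent):

```python
import numpy as np
import cv2

def bifocal_frame(magnified_view: np.ndarray, overlay_text: str,
                  data_fraction: float = 0.25) -> np.ndarray:
    """Compose a display frame: magnified view on top, data strip below."""
    h, w = magnified_view.shape[:2]
    data_h = int(h * data_fraction)
    # Shrink the magnified view to leave room for the data strip underneath.
    view = cv2.resize(magnified_view, (w, h - data_h))
    data_strip = np.zeros((data_h, w, 3), dtype=np.uint8)
    cv2.putText(data_strip, overlay_text, (10, data_h // 2),
                cv2.FONT_HERSHEY_SIMPLEX, 0.7, (255, 255, 255), 2)
    return np.vstack([view, data_strip])

# Example with a synthetic grey frame standing in for a captured image.
frame = np.full((720, 1280, 3), 80, dtype=np.uint8)
composite = bifocal_frame(frame, "Patient: J. Doe | 6.0x | cam_a")
```

The composed frame is an ordinary image array, so the same step is also where a saved or transmitted copy could be written to non-volatile storage or streamed to a remote display.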
- According to at least some embodiments of the present invention, there are provided magnification glasses comprising: a first camera array for capturing a magnified image in the field of view of the glasses, comprising at least a first camera and a second camera, wherein each of the first camera and the second camera provides a different level of magnification, and wherein the first camera and the second camera have the same working distance. Preferably the working distance is fixed between 150 mm and 400 mm, or in a subrange between 150 mm and 400 mm. Preferably each of the first and second cameras provides a different magnification of between 2× and 10×.
- Preferably the glasses further comprise a first viewer assembly and a processor, wherein the processor receives a first magnified image captured by the first camera and a second magnified image captured by the second camera and transmits either one of the first image or the second image for display on the first viewer assembly, wherein the processor is a computing device. Preferably the processor digitally zooms either one of the first magnified image or the second magnified image before transmission to the first viewer assembly.
- Preferably the first viewer assembly comprises at least one of: a display screen positioned in front of the glasses frame; a display screen mounted in the glasses frame; or a projector for projecting on a lens in the frame. Optionally one or more of the cameras comprises an infrared camera. Preferably the processor provides for display on the viewer assembly at least one of: the magnified view captured by the camera assembly; a virtual reality view; an augmented reality view; a data view; or an infrared view.
- Preferably the glasses further comprise a second camera array and a second viewer assembly positioned on the same side of the glasses, wherein the first camera array and a first viewer assembly are positioned on the opposite side of the glasses wherein the first and second camera arrays are spaced horizontally apart so as to capture stereoscopic vision, wherein the processor transmits the image from the first camera array to the first viewer assembly and wherein the processor transmits the image from the second camera array to the second viewer assembly.
- Optionally the camera arrays are mounted on the glasses. Alternatively the camera arrays are mounted on a headband. Preferably each camera comprises a lens and an image sensor. Preferably the processor is housed inside the frames. Optionally the processor is housed in an external enclosure. Preferably the glasses further comprise a controller selected from the group consisting of: a joystick, a foot pedal, speech control, and a switch. Preferably the glasses further comprise an illumination source. Optionally the illumination source comprises an ultraviolet light.
- According to further embodiments of the present invention, there are provided electro-optical magnifying glasses comprising: a wearable display; and a first camera array for capturing a magnified image in the field of view of the glasses, comprising at least a first camera and a second camera, wherein each of the first camera and the second camera provides a different level of magnification, and wherein the first camera and the second camera have the same working distance. Preferably the working distance is fixed between 150 mm and 400 mm. Preferably each of the first and second cameras provides a different magnification of between 2× and 10×.
- Preferably the glasses further comprise a processor, wherein the processor receives a first magnified image captured by the first camera and a second magnified image captured by the second camera and transmits either one of the first image or the second image for display on the display, wherein the processor is a computing device. Preferably the processor digitally zooms either one of the first magnified image or the second magnified image before transmission to the display. Optionally one or more of the cameras comprises an infrared camera.
- Preferably the processor provides, for display on the display, at least one of: the magnified view captured by the camera assembly; a virtual reality view; an augmented reality view; a data view; or an infrared view.
- According to further embodiments of the present invention, there are provided magnification glasses comprising: wearable glasses frames with lenses; a first camera array comprising at least two fixed focus cameras capturing a magnified view of the field of view of the glasses; and a viewer assembly comprising at least one of: a display screen in front of at least one of the lenses for displaying at least the magnified view from one of the cameras; or a projector for projecting at least the magnified view from one of the cameras on at least one of the lenses. Preferably the glasses further comprise a second camera array wherein each of the first and second arrays are spaced horizontally apart so as to capture stereoscopic vision. Optionally the camera arrays are mounted on the glasses. Optionally the camera arrays are mounted on a headband.
- Preferably each camera comprises a lens and an image sensor and preferably the lens provides a magnification of between 2× and 10×. Preferably each camera comprises a graphics card. Preferably each camera has a working distance of 150 mm to 400 mm. Preferably the glasses further comprise a processor for receiving the image captured by the cameras and transmitting the image to the display screen or the projector, wherein the processor is a computing device. Optionally the processor is housed inside the frames. Optionally the processor is housed in an external enclosure.
- Preferably the viewer assembly can be swiveled away from the frames. Preferably the viewer assembly displays at least one of: a magnified view captured by the cameras; a virtual reality view; an augmented reality view; a data view, or a combination of these. Preferably the glasses further comprise a controller selected from the group consisting of: a joystick, a foot pedal, speech control, and a switch.
- According to further embodiments of the present invention, there are provided magnification glasses comprising: wearable glasses frames; at least two fixed focus cameras capturing a magnified view of the field of view of the glasses; and a display screen in front of the glasses for displaying the magnified view from one of the cameras.
- As used herein the term “image” is used to describe the digital capture from the cameras, but it should be appreciated that in practice a stream of images or video is captured, and the term image as used herein covers all of these capture types. The term “magnification glasses” may also refer to “loupe glasses”.
- The terms “camera array” or “camera assembly” or “camera set” as used herein refer to a set of cameras each with different magnification or other capabilities.
- The term “close range” as used herein refers to a distance of 10-40 cm. The term “working distance” as used herein is the distance wherein the image captured by the cameras is in focus.
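- As general optics background (a standard thin-lens statement, not taken from the patent), a fixed focus camera set up for a working distance d_w has its sensor at the conjugate distance given by

```latex
% Thin-lens focus condition for a camera fixed-focused at working distance d_w
\[
  \frac{1}{f} \;=\; \frac{1}{d_w} + \frac{1}{d_i}
  \quad\Longrightarrow\quad
  d_i \;=\; \frac{f\, d_w}{d_w - f},
\]
```

and objects within the depth of field around d_w remain acceptably sharp, which is why the lenses cited later can quote a working-distance band such as 150-400 mm rather than a single distance.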
- Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The materials, methods, and examples provided herein are illustrative only and not intended to be limiting.
- Implementation of the method and system of the present invention involves performing or completing certain selected tasks or steps manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of preferred embodiments of the method and system of the present invention, several selected steps could be implemented by hardware or by software on any operating system of any firmware or a combination thereof. For example, as hardware, selected steps of the invention could be implemented as a chip or a circuit. As software, selected steps of the invention could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system. In any case, selected steps of the method and system of the invention could be described as being performed by a data processor, such as a computing platform for executing a plurality of instructions.
- Although the present invention is described with regard to a “computing device”, a “computer”, or “mobile device”, it should be noted that optionally any device featuring a data processor and the ability to execute one or more instructions may be described as a computer, including but not limited to any type of personal computer (PC), single board computer (SBC), field-programmable gate array (FPGA), a server, a distributed server, a virtual server, a cloud computing platform, a cellular telephone, an IP telephone, a smartphone, or a PDA (personal digital assistant). Any two or more of such devices in communication with each other may optionally comprise a “network” or a “computer network”.
- The invention will now be described in connection with certain preferred embodiments with reference to the following illustrative figures so that it may be more fully understood. With specific reference now to the figures in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of the preferred embodiments of the present invention only and are presented in the cause of providing what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the invention. In this regard, no attempt is made to show structural details of the invention in more detail than is necessary for a fundamental understanding of the invention, the description taken with the drawings making apparent to those skilled in the art how the several forms of the invention may be embodied in practice.
- In the drawings:
- FIGS. 1A-1E are illustrations of magnification glasses according to at least some embodiments of the present invention;
- FIG. 2 is an illustration of magnification glasses according to at least some embodiments of the present invention;
- FIGS. 3A-3F are illustrations of the use of magnification glasses according to at least some embodiments of the present invention;
- FIGS. 4A and 4B show an illustration of a magnification headset according to at least some embodiments of the present invention; and
- FIG. 5 shows an illustrative graph of the level of magnification provided by magnification glasses according to at least some embodiments of the present invention.
- In all the figures similar reference numerals identify similar parts.
- The present invention will be more fully understood from the following detailed description of the preferred embodiments thereof, taken together with the drawings. Reference is now made to FIGS. 1A-1E, which show magnification glasses according to at least some embodiments of the present invention. FIGS. 1A and 1D show a front view of the magnification glasses 100, FIGS. 1B and 1E show a rear view of the magnification glasses 100, and FIG. 1C shows a top view of the magnification glasses 100.
- As shown, glasses 100 comprise a glasses frame 102 as known in the art. Frame 102 is shaped as a frame of a standard pair of glasses as known in the art and is made from materials as known in the art, such as plastic or any other suitable material. The design of frame 102 as shown in the figures should not be considered limiting. Glasses 100 comprise lenses 104. Lenses 104 are optically clear and manufactured from glass or plastic or other composite material as known in the art. Lenses 104 are optionally adapted to the optical needs of the specific wearer, such as featuring prescription lenses. In some embodiments, lenses 104 are optionally coated with reflective materials to enable projection of an image thereon, as will be described further below. Optionally lenses 104 are not provided, such as when viewer 122 comprises a display screen 122S.
- Frames comprise arms 106, wherein arms 106 fit over the ears of the wearer, and nose bridge 108 for supporting glasses 100 on the nose of the wearer, as known in the art.
- A camera assembly 110 is mounted on top of frame 102. In the embodiment as shown, two camera assemblies 110A and 110B are provided, each providing information for the view provided to the left or right eye so as to enable stereoscopic vision, where the camera assembly 110A on the right provides for the image of the right eye and the camera assembly 110B on the left provides for the image of the left eye. Each camera assembly 110 comprises at least two cameras 112; however, the embodiment as shown should not be considered limiting and more than two cameras 112 may be provided.
- In a preferred embodiment, first camera 112A and second camera 112B each comprise a lens 160 and an image sensor 162. Optionally any suitable combination of lens 160 and image sensor 162 is used, and the specifications provided below should not be considered limiting. Cameras 112 are chosen with differing specifications resulting in a fixed working distance that suits the close range working environment. Camera specifications include but are not limited to apertures, sensor size, focal length, FOV (field of view), DOF (depth of field), and so forth. Camera 112 has a fixed working distance of between 100 and 400 mm. Different lens and camera combinations are preferably chosen to provide specific fixed close range working distances adapted to the specific application.
- An exemplary lens of first camera 112A is a 25 mm FL f/8 lens, such as the Blue Series M12 Video™ Imaging Lens, with the following specifications:
- Focal Length FL (mm): 25.0
- Aperture (f/#): f/2.5
- Working Distance (mm): 150-400
- Maximum Camera Sensor Format: ½″
- Distortion (%): 0.3 Diagonal, 0.14 Horizontal
- Field of View (½″ sensor): 35-60 mm
- Length (mm): 30.0
- Outer Diameter (mm): 14.0
- An exemplary lens of second camera 112B is a 10 mm FL f/8 lens, such as the Blue Series M12 Video™ Imaging Lens, with the following specifications:
- Focal Length FL (mm): 10.0
- Aperture (f/#): f/8
- Working Distance (mm): 150-400
- Maximum Camera Sensor Format: ⅓″
- Distortion (%): −1.5 Diagonal, −0.87 Horizontal
- Field of View (⅓″ sensor): 72-120 mm
- Length (mm): 17
- Outer Diameter (mm): 14.0
- In each camera 112 the lens 160 is fitted to an image sensor 162. An exemplary image sensor 162, such as provided in each of cameras 112A and 112B, is the CAM130_CUMI1820_MOD 13 MP camera, such as provided by e-con Systems™, with the following specifications:
- 1.25 μm pixel size with Aptina/ON Semiconductor A-PixHS™ BSI technology
- 1/2.3″ optical form-factor
- Dynamic Range: 65.8 dB
- SNRMAX: 36.3 dB
- Electronic Rolling Shutter
- Responsivity: 0.62V/lux-sec
- As above, image sensor 162 is chosen along with lens 160 to provide a specific magnification and working distance, and the exemplary image sensor described above should not be considered limiting.
- As above, additional cameras 112 may be added to camera assembly 110, wherein each additional camera 112 provides a different level of magnification while maintaining the same working distance as the other cameras 112 in camera assembly 110. Optionally, camera 112 comprises an infrared (IR) camera (not shown) capable of capturing IR images, wherein processor 130 converts the captured IR images into images in the visible spectrum for display on viewer assemblies 120.
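The disclosure does not specify how the IR-to-visible conversion is performed. Purely as an illustrative sketch, a processor could normalize a single-channel IR frame and apply a false-color map before handing it to the viewer; the OpenCV/numpy usage below is an assumption, not part of the specification.

```python
import cv2
import numpy as np

def ir_to_visible(ir_frame: np.ndarray) -> np.ndarray:
    """Map a single-channel IR frame to a false-color image for display.

    Illustrative only; the patent does not specify the conversion used by
    processor 130.
    """
    # Stretch the raw IR intensities to the full 8-bit range.
    stretched = cv2.normalize(ir_frame, None, 0, 255, cv2.NORM_MINMAX)
    gray = stretched.astype(np.uint8)
    # Apply a color map so intensity differences are easy to see.
    return cv2.applyColorMap(gray, cv2.COLORMAP_JET)
```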
- Camera assembly 110 optionally comprises an illumination source 116, which may be any of an LED, an LED array, a fiber optic array or any other illumination source. Optionally, illumination source 116 comprises an ultraviolet (UV) light.
- Camera assembly 110 is connected to a processor 130 and a power source 140. As shown (FIG. 1C), processor 130 and power source 140 are embedded inside frame 102. Optionally, processor 130 and power source 140 are provided in an external enclosure 260 (FIG. 2A). The electrical connections between camera assembly 110, processor 130 and power source 140 are made using cables embedded inside frame 102. Where an external enclosure 260 is provided, the cables 262 connecting the components of glasses 200 are visible. Although an external enclosure 260 is shown as part of the embodiment of FIG. 2A, it should be appreciated that alternative embodiments are possible, such as the glasses 100 with an external enclosure or the glasses 200 with processor 130 and power source 140 embedded within frame 102. The image sensor 162 of each camera 112 communicates with processor 130 using any suitable protocol, such as the non-limiting example of the MIPI Camera Serial Interface.
- Processor 130 is a computing device as defined herein. Processor 130 is preferably in wireless or wired communication with an external processor (not shown) or a data source such as a server (not shown). A non-limiting example of processor 130 is a Qualcomm® Snapdragon™ 410/410E processor. A non-limiting example of an external processor is an Intel® Core i7-5557U. Non-limiting examples of power source 140 include a battery or a wired mains connection with a voltage adaptor.
- Optionally, glasses 100 are connected to an external camera 132 by either a wired or wireless connection. The image captured by external camera 132 is provided to processor 130 for transmission to viewer assembly 120.
- Viewer assembly 120 is mounted on the side of frame 102. Viewer assembly 120 comprises viewer 122 and associated electronics (not shown). Viewer assembly 120 is shown here as duplicated, with one viewer assembly 120 per eye of the user. Optionally, only one viewer assembly 120 is provided. Optionally, viewer assembly 120 can be tilted away from lenses 104 so as not to obstruct the view of the practitioner when viewer assembly 120 is not needed. Viewer assembly 120 can optionally be tilted upwards or to the sides. The position of viewers 122 as shown is illustrative and should not be considered limiting.
- As shown in FIGS. 1A-1C, viewer 122 comprises a display screen 122S that is viewed by the user through lenses 104. A non-limiting example of such a screen is the Vufine+™.
- In an alternative embodiment as shown in FIGS. 1D-1E, viewer 122 comprises a projector 122P for projecting an image onto lenses 104. A non-limiting example of such a projector 122P is the Lumus™ OE33 with the following specifications:
- Resolution: 1280×720
- FOV: Diagonal 40°
- Configuration: Side-mounted
- Orientation: Landscape
- Transparency: True see-through
- Controller buttons 124 are integrated into viewer assembly 120 for control of the functioning of glasses 100. Alternatively, controller buttons are provided on a separate controller (not shown) that is connected wirelessly or by wire to processor 130. Non-limiting examples of controller buttons 124 include a joystick, a foot pedal, speech control, or a switch mounted on glasses 100 or attached to glasses 100, or similar. Control of glasses 100 by controller buttons 124 includes but is not limited to powering on and off, selecting the view shown on viewer 122, activating illumination source 116, activating the UV light (not shown), and so forth.
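The control functions listed above amount to a small dispatch table. The sketch below is purely illustrative; the button names and the methods on the hypothetical `glasses` object are assumptions, not taken from the disclosure.

```python
from enum import Enum, auto

class Button(Enum):
    POWER = auto()
    MAGNIFY_UP = auto()
    MAGNIFY_DOWN = auto()
    SELECT_VIEW = auto()      # cycle magnified / AR / data / IR / external views
    ILLUMINATION = auto()
    UV_LIGHT = auto()

def handle_button(button: Button, glasses) -> None:
    """Illustrative mapping of controller buttons 124 to glasses functions."""
    actions = {
        Button.POWER: glasses.toggle_power,
        Button.MAGNIFY_UP: glasses.increase_magnification,
        Button.MAGNIFY_DOWN: glasses.decrease_magnification,
        Button.SELECT_VIEW: glasses.cycle_view,
        Button.ILLUMINATION: glasses.toggle_illumination,
        Button.UV_LIGHT: glasses.toggle_uv,
    }
    actions[button]()  # all methods here are hypothetical
```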
- Reference is now made to FIG. 2, which is an illustration of magnification glasses according to at least some embodiments of the present invention. The components of glasses 200 as shown in FIG. 2 are the same as those of FIGS. 1A-1C; however, in the embodiment of FIG. 2, camera assembly 110 is mounted on a headband 210. Cables 262 interconnect camera assemblies 110, viewer assemblies 120, processor 130 and power source 140, where processor 130 and power source 140 are provided inside enclosure 260. Where possible, cables (not shown) are embedded inside frame 102.
- Reference is now made to FIGS. 3A-3F, which are illustrations of the use of magnification glasses according to at least some embodiments of the present invention. The usage is illustrated with the embodiments of FIGS. 1A-1C and 1D-1E as indicated below; however, glasses 200 of FIG. 2 are used in the same way as illustrated in FIGS. 3A-3F and as described herein, and the description should therefore be considered to include them. The views of FIGS. 3A-3F are rear views showing the user's view through glasses 100 or 200. The illustrated non-limiting application is use of the glasses by a dentist or oral hygienist. Any other usage is possible, and the use of the glasses of the present invention is not limited to dentistry.
- As shown in FIG. 3A, a user sees an unaided view of a patient through the lenses 104 of glasses 100. In the embodiment of FIGS. 1A-1C, viewer assemblies 120 are swiveled sideways so as not to obstruct the view of the user.
- In FIG. 3B, based on the embodiment of FIGS. 1D-1E, the user chooses to view a magnified view of the patient and selects a first level of magnification using buttons 124. This first level of magnification is provided by first camera 112A. The video from first camera 112A is communicated to processor 130, and processor 130 connects the video to viewer assemblies 120. When there are left and right first cameras 112A, their respective video streams are connected to the left and right viewer assemblies 120 respectively, and viewer assemblies 120 use the left and right projectors 122P to project the magnified image onto lenses 104 as shown.
- FIG. 3C depicts the view after the user has again selected to increase the magnification by pressing the appropriate button 124. In this case the image from first camera 112A is digitally zoomed, and then at a predefined crossover point the feed to viewer assembly 120 is switched to the second level of magnification provided by second camera 112B. The video from second camera 112B is communicated to processor 130, and processor 130 connects the video to viewer assemblies 120, where viewer assemblies 120 use projectors 122P to project the magnified image onto lenses 104 as shown. When there are left and right second cameras 112B, their respective video streams are connected to the left and right viewer assemblies 120 respectively. As above, there may optionally be more cameras 112, each with a set magnification level which is selected by the processor depending on the control of the user, such that as a user activates buttons 124 for increased magnification the view shown by viewer assembly 120 is magnified as shown in FIG. 3G. Further, as above, there may be cameras 112 with different capabilities, such as IR or external cameras, and the images captured by these may be selected using controller buttons 124 for display on viewer assemblies 120.
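As a rough sketch of the per-frame path just described (capture from the currently selected camera on each side, apply the digital zoom, hand the result to the corresponding viewer), processor 130 could run something like the following loop body. The OpenCV capture objects and the `project` callables are assumptions for illustration only.

```python
import cv2

def digital_zoom(frame, factor):
    """Center-crop by `factor` and scale back up, i.e. a plain digital zoom."""
    h, w = frame.shape[:2]
    ch, cw = max(1, int(h / factor)), max(1, int(w / factor))
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    return cv2.resize(frame[y0:y0 + ch, x0:x0 + cw], (w, h))

def update_viewers(left_cam, right_cam, zoom, project_left, project_right):
    """Send one digitally zoomed frame from each selected camera to its viewer."""
    for cap, project in ((left_cam, project_left), (right_cam, project_right)):
        ok, frame = cap.read()
        if ok:
            project(digital_zoom(frame, zoom))
```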
- Depending on the type of viewer 122 used, the method of displaying the magnified view will vary as follows:
- When viewer 122 is a display screen 122S, such as shown in FIG. 3D, the magnified or other view is shown on screen 122S;
- When viewer 122 is a projector 122P, such as shown in FIGS. 3B, 3C, 3E and 3F, the magnified or other view is projected onto the front of lenses 104.
- The type of video or image shown on the screen 122S or projected by projector 122P may be any one of the following:
- Magnification: where the view is the magnified video or image captured by camera assembly 110;
- Virtual reality (VR): where the projected VR view is constructed by processor 130 or alternatively by a connected external processor (not shown);
- Augmented reality (AR): where the view captured by camera assembly 110 is augmented with data or other indications, and where the AR view is constructed by processor 130 or alternatively by a connected external processor (not shown);
- Data view: where data related to the particular application is displayed to the user. In a non-limiting example for a dentistry application, the screen 122S could display an x-ray of the mouth of the patient, such as in FIG. 3E. In a further non-limiting example, a bifocal view as in FIG. 3F shows patient data on a bottom portion of the lens with the magnified or unmagnified view visible in the upper portion of the lens (a minimal compositing sketch follows this list). Alternatively, the split view may divide the lens into left and right portions, or multiple portions other than those shown. Preferably the same view is shown on each lens, but optionally the view is different. The data for this data view is provided by an external data source that is connected to glasses 100 or 200;
- Infrared view: where the view is the IR video or image captured by camera assembly 110 when camera assembly 110 comprises at least one IR camera;
- External camera view: where the view is the view captured by an external camera 132 connected to glasses 100 or 200.
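As a minimal illustration of the bifocal data view of FIG. 3F referred to above, the live (magnified or unmagnified) frame and a data panel could simply be stacked before display. The proportions and the numpy/OpenCV usage are assumptions, not part of the disclosure.

```python
import cv2
import numpy as np

def bifocal_view(live_frame: np.ndarray, data_panel: np.ndarray,
                 data_fraction: float = 0.3) -> np.ndarray:
    """Place a data panel (e.g. an x-ray) under the live view, FIG. 3F style.

    Both inputs are assumed to be 3-channel images; the split proportion is
    illustrative.
    """
    h, w = live_frame.shape[:2]
    data_h = int(h * data_fraction)
    top = cv2.resize(live_frame, (w, h - data_h))   # live view in the upper portion
    bottom = cv2.resize(data_panel, (w, data_h))    # patient data in the lower portion
    return np.vstack([top, bottom])
```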
- Reference is now made to FIGS. 4A and 4B, which show an illustration of a magnification headset according to at least some embodiments of the present invention. Magnification headset 400 is a headset as known in the art providing the functionality of glasses 100 in a headset arrangement. Items with the same drawing numbers as used above with reference to FIGS. 1A-1E have the same functionality. Magnification headset 400 may also be referred to herein as a "wearable display" or as "magnification glasses".
- Headset 400 comprises camera assemblies 110, each comprising two cameras 112A and 112B. The images from camera assemblies 110 are displayed on viewers 122H, where each of viewers 122H is a screen mounted in headset 400 and positioned in front of the eyes of the wearer of headset 400. Optionally, a single screen 122H is provided that is divided into two viewing portions (left and right). Optionally, camera assemblies 110 comprise more than two cameras each.
- Camera assembly 110 is connected to a processor 130 and a power source 140 that are embedded inside headset 400. The electrical connections between camera assembly 110 and the processor 130 and power source 140 are made using cables embedded inside headset 400. The image sensor 162 of each camera 112 communicates with the processor 130 using any suitable protocol, such as the non-limiting example of the MIPI Camera Serial Interface.
- Optionally, headset 400 is connected to an external camera 132 by either a wired or wireless connection. The image captured by external camera 132 is provided to the processor 130 for transmission to viewer assembly 120.
- The type of video or image shown on screens 122H may be any one of the following:
- Magnification: where the view is the magnified video or image captured by camera assembly 110;
- Virtual reality (VR): where the projected VR view is constructed by processor 130 or alternatively by a connected external processor (not shown);
- Augmented reality (AR): where the view captured by camera assembly 110 is augmented with data or other indications, and where the AR view is constructed by processor 130 or alternatively by a connected external processor (not shown);
- Data view: where data related to the particular application is displayed to the user, as described above with reference to FIGS. 3E and 3F;
- Infrared view: where the view is the IR video or image captured by camera assembly 110 when camera assembly 110 comprises at least one IR camera;
- External camera view: where the view is the view captured by an external camera 132 connected to headset 400.
- Reference is now made to FIG. 5, which shows an illustrative graph of the level of magnification provided by magnification glasses according to at least some embodiments of the present invention. The graph 500 shows magnification for glasses 100, 200 or 400 as described hereinabove. For graph 500, camera assembly 110 comprises two cameras 112A and 112B, wherein first camera 112A provides a magnification of 2× and second camera 112B provides a magnification of 6×. These magnification levels are illustrative and should not be considered limiting.
- As shown in graph 500, the magnification of the captured image shown on viewing assembly 120 varies between 2× and 10×. The working distance remains fixed at a specific working distance even as the magnification changes. At point 502, controller 130 transmits the output of first camera 112A to viewing assembly 120 such that the captured image is shown with a magnification of 2×. On slope 504 of graph 500 the image of camera 112A is digitally zoomed by controller 130 such that the captured image is shown with increasing magnification. The increase in magnification is preferably triggered by a user of the glasses activating controller buttons 124, such as a button (not shown) for increasing magnification.
- At point 506 of graph 500, the image of camera 112A is digitally zoomed by controller 130 such that the captured image is shown on the viewing assembly with 6× magnification. At this point the controller transmits the output of second camera 112B to viewing assembly 120 such that the captured image is shown with a magnification of 6×. The switch at point 506 between the capture of first camera 112A and the capture of second camera 112B is preferably seamless, and the user of the glasses is not aware of the switch. On slope 508 of graph 500 the image of camera 112B is digitally zoomed by controller 130 such that the captured image is shown with increasing magnification. As before, the increase in magnification is preferably triggered by a user of the glasses activating controller buttons 124, such as a button (not shown) for increasing magnification.
- It should be appreciated that the magnification of the captured image displayed on viewing assembly 120 is therefore in the range of 2×-10× and may be increased or decreased within this range by the user activating controller buttons 124. When the output magnification decreases below 6× at point 506, controller 130 switches from camera 112B to camera 112A.
- In the detailed description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be understood by those skilled in the art that these are specific embodiments and that the present invention may also be practiced in different ways that embody the characterizing features of the invention as described and claimed herein. Combinations of the above embodiments are also considered. As a non-limiting example, the headset of FIGS. 4A and 4B could use the projectors as described for the viewing assembly of FIGS. 1A-1E.
- It is to be understood that the invention is not limited in its application to the details set forth in the description contained herein or illustrated in the drawings. The invention is capable of other embodiments and of being practiced and carried out in various ways. Those skilled in the art will readily appreciate that various modifications and changes can be applied to the embodiments of the invention as hereinbefore described without departing from its scope, as defined in and by the appended claims.
Claims (21)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US16/625,780 US10877262B1 (en) | 2017-06-21 | 2018-06-21 | Magnification glasses with multiple cameras |
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201762522692P | 2017-06-21 | 2017-06-21 | |
| PCT/IL2018/050691 WO2018235088A1 (en) | 2017-06-21 | 2018-06-21 | MAGNIFICATION LENSES WITH MULTIPLE CAMERAS |
| US16/625,780 US10877262B1 (en) | 2017-06-21 | 2018-06-21 | Magnification glasses with multiple cameras |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20200386982A1 true US20200386982A1 (en) | 2020-12-10 |
| US10877262B1 US10877262B1 (en) | 2020-12-29 |
Family
ID=64736952
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/625,780 Expired - Fee Related US10877262B1 (en) | 2017-06-21 | 2018-06-21 | Magnification glasses with multiple cameras |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US10877262B1 (en) |
| EP (1) | EP3642662B1 (en) |
| WO (1) | WO2018235088A1 (en) |
Families Citing this family (20)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| GB2536650A (en) | 2015-03-24 | 2016-09-28 | Augmedics Ltd | Method and system for combining video-based and optic-based augmented reality in a near eye display |
| IL244255A (en) * | 2016-02-23 | 2017-04-30 | Vertical Optics Llc | Wearable vision redirecting devices |
| US9690119B2 (en) | 2015-05-15 | 2017-06-27 | Vertical Optics, LLC | Wearable vision redirecting devices |
| US12458411B2 (en) | 2017-12-07 | 2025-11-04 | Augmedics Ltd. | Spinous process clamp |
| US11980507B2 (en) | 2018-05-02 | 2024-05-14 | Augmedics Ltd. | Registration of a fiducial marker for an augmented reality system |
| US11766296B2 (en) | 2018-11-26 | 2023-09-26 | Augmedics Ltd. | Tracking system for image-guided surgery |
| US20220091442A1 (en) | 2019-01-24 | 2022-03-24 | Cao Group, Inc. | Electronic loupe |
| US12481174B2 (en) | 2019-01-24 | 2025-11-25 | Cao Group, Inc. | Electronic loupe |
| US11980506B2 (en) | 2019-07-29 | 2024-05-14 | Augmedics Ltd. | Fiducial marker |
| US12178666B2 (en) | 2019-07-29 | 2024-12-31 | Augmedics Ltd. | Fiducial marker |
| IT201900013164A1 (en) * | 2019-07-29 | 2021-01-29 | Eye Tech Lab S R L | AUGMENTED REALITY MAGNIFYING GLASSES |
| US11382712B2 (en) | 2019-12-22 | 2022-07-12 | Augmedics Ltd. | Mirroring in image guided surgery |
| US11166006B2 (en) | 2020-01-22 | 2021-11-02 | Photonic Medical Inc. | Open view, multi-modal, calibrated digital loupe with depth sensing |
| CN111552076B (en) * | 2020-05-13 | 2022-05-06 | 歌尔科技有限公司 | Image display method, AR glasses and storage medium |
| US11389252B2 (en) | 2020-06-15 | 2022-07-19 | Augmedics Ltd. | Rotating marker for image guided surgery |
| US12239385B2 (en) | 2020-09-09 | 2025-03-04 | Augmedics Ltd. | Universal tool adapter |
| US11896445B2 (en) | 2021-07-07 | 2024-02-13 | Augmedics Ltd. | Iliac pin and adapter |
| US12150821B2 (en) | 2021-07-29 | 2024-11-26 | Augmedics Ltd. | Rotating marker and adapter for image-guided surgery |
| EP4587881A1 (en) | 2022-09-13 | 2025-07-23 | Augmedics Ltd. | Augmented reality eyewear for image-guided medical intervention |
| IT202200020418A1 (en) * | 2022-10-04 | 2024-04-04 | Zoetec S R L | Optical magnifying device worn on an operator's head |
Family Cites Families (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| FR2408848A1 (en) | 1977-11-14 | 1979-06-08 | Freche Charles | LIGHTING GLASSES WITH VARIABLE MAGNIFICATION |
| CA2233047C (en) * | 1998-02-02 | 2000-09-26 | Steve Mann | Wearable camera system with viewfinder means |
| EP1884819A1 (en) * | 2006-08-02 | 2008-02-06 | Swiss Medical Technology GmbH | Eyewear with segmented look-through elements |
| KR20150086477A (en) * | 2012-11-19 | 2015-07-28 | 오란게덴탈 게엠베하 운트 코카게 | Magnification loupe with display system |
| EP3108444A4 (en) * | 2014-02-19 | 2017-09-13 | Evergaze, Inc. | Apparatus and method for improving, augmenting or enhancing vision |
| CA2949241A1 (en) * | 2014-05-20 | 2015-11-26 | University Of Washington Through Its Center For Commercialization | Systems and methods for mediated-reality surgical visualization |
| IL244255A (en) | 2016-02-23 | 2017-04-30 | Vertical Optics Llc | Wearable vision redirecting devices |
| US9690119B2 (en) * | 2015-05-15 | 2017-06-27 | Vertical Optics, LLC | Wearable vision redirecting devices |
| US10473942B2 (en) * | 2015-06-05 | 2019-11-12 | Marc Lemchen | Apparatus and method for image capture of medical or dental images using a head mounted camera and computer system |
2018
- 2018-06-21: WO application PCT/IL2018/050691 filed (WO2018235088A1); status: not active (Ceased)
- 2018-06-21: EP application EP18820877.1A filed (EP3642662B1); status: active
- 2018-06-21: US application US 16/625,780 filed (US10877262B1); status: not active (Expired - Fee Related)
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12007563B2 (en) * | 2020-10-27 | 2024-06-11 | Fujifilm Corporation | Display control device, display control method, and display control program |
| US12332440B2 (en) | 2020-10-27 | 2025-06-17 | Fujifilm Corporation | Display control device, display control method, and display control program |
| EP4388734A1 (en) | 2021-08-18 | 2024-06-26 | Augmedics Ltd. | Stereoscopic display and digital loupe for augmented-reality near-eye display |
| EP4388734A4 (en) * | 2021-08-18 | 2025-05-07 | Augmedics Ltd. | Stereoscopic display and digital loupe for augmented-reality near-eye display |
| US12417595B2 (en) | 2021-08-18 | 2025-09-16 | Augmedics Ltd. | Augmented-reality surgical system using depth sensing |
| US12412346B2 (en) | 2022-04-21 | 2025-09-09 | Augmedics Ltd. | Methods for medical image visualization |
| US20240205522A1 (en) * | 2022-12-19 | 2024-06-20 | ARSpectra S.à.r.l | Head mounted display device and system |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2018235088A1 (en) | 2018-12-27 |
| EP3642662A1 (en) | 2020-04-29 |
| EP3642662A4 (en) | 2020-06-10 |
| US10877262B1 (en) | 2020-12-29 |
| EP3642662B1 (en) | 2023-05-03 |
Similar Documents
| Publication | Title |
|---|---|
| US10877262B1 (en) | Magnification glasses with multiple cameras | |
| US5777715A (en) | Low vision rehabilitation system | |
| US9301682B2 (en) | Eye examination apparatus with digital image output | |
| CN102954836B (en) | Ambient light sensor, user applying device and display device | |
| JP7678713B2 (en) | Imaging device and method | |
| US20090059364A1 (en) | Systems and methods for electronic and virtual ocular devices | |
| CN101840068A (en) | Head-worn optoelectronic automatic focusing visual aid | |
| EP2719318A1 (en) | Auto zoom for video camera | |
| US9916771B2 (en) | Portable vision aid with motion pan | |
| US20250362529A1 (en) | Electronic loupe | |
| WO2016056699A1 (en) | Wearable display device | |
| CN106338819A (en) | Digital viewing full-view-field AR (augmented reality) multimedia telescope | |
| WO2019009008A1 (en) | Imaging apparatus with second imaging element used for correcting vignetting in images captured by first imaging element | |
| CN110088662A (en) | Imaging system and the method for generating background image and focusedimage | |
| US20130050565A1 (en) | Image focusing | |
| CN113454989A (en) | Head-mounted display device | |
| CN104280882A (en) | Display system and display method | |
| JP3205552B2 (en) | 3D image pickup device | |
| CN109040737A (en) | A kind of live streaming glasses based on 3D augmented reality | |
| CN107907987A (en) | 3D microscopes based on mixed reality | |
| JP2008123257A (en) | Remote operation support system and display control method | |
| JP2008124795A (en) | Remote work support system and displaying method of the same | |
| US12481174B2 (en) | Electronic loupe | |
| US20250260902A1 (en) | Electronic device, control method for electronic device, and non-transitory computer readable medium | |
| CN103763408A (en) | Mobile terminal with magnifier function |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: MICROENTITY |
| | FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO MICRO (ORIGINAL EVENT CODE: MICR); ENTITY STATUS OF PATENT OWNER: MICROENTITY |
| | FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO MICRO (ORIGINAL EVENT CODE: MICR); ENTITY STATUS OF PATENT OWNER: MICROENTITY |
| | STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| | FEPP | Fee payment procedure | Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: MICROENTITY |
| | LAPS | Lapse for failure to pay maintenance fees | Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: MICROENTITY |
| | STCH | Information on status: patent discontinuation | Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
| | FP | Lapsed due to failure to pay maintenance fee | Effective date: 20241229 |