
WO2025163631A1 - Multimodal smart eyeglasses for adaptive vision, predictive ocular and hemodynamic monitoring, and emergency response - Google Patents

Multimodal smart eyeglasses for adaptive vision, predictive ocular and hemodynamic monitoring, and emergency response

Info

Publication number
WO2025163631A1
Authority
WO
WIPO (PCT)
Prior art keywords
optical
lens
eye
visual
real
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/IB2025/053330
Other languages
French (fr)
Inventor
Faezehalsadat SEYEDKHAMOUSHI
Mohammad SASSANI ASL
Rozasadat SEYEDKHAMOOSHI
Hamid Reza ABDOLLAHI
Mohammadhossein SHARIFNIA
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to PCT/IB2025/053330
Publication of WO2025163631A1


Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/107 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining the shape or measuring the curvature of the cornea
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/0016 - Operational features thereof
    • A61B3/0025 - Operational features thereof characterised by electronic signal processing, e.g. eye models
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/14 - Arrangements specially adapted for eye photography
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B3/00 - Simple or compound lenses
    • G02B3/12 - Fluid-filled or evacuated lenses
    • G02B3/14 - Fluid-filled or evacuated lenses of variable focal length
    • G - PHYSICS
    • G02 - OPTICS
    • G02C - SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C11/00 - Non-optical adjuncts; Attachment thereof
    • G02C11/10 - Electronic devices other than hearing aids
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B2207/00 - Coding scheme for general features or characteristics of optical elements and systems of subclass G02B, but not including elements and systems which would be classified in G02B6/00 and subgroups
    • G02B2207/109 - Sols, gels, sol-gel materials

Definitions

  • This invention discloses a multimodal smart eyewear system comprising a pair of multilayer adaptive lenses with electrochromic gel, a surrounding elastic structure for dynamic shape modulation, integrated visible and ultraviolet light sensors, and a unified electronic system including a quantum microprocessor, micro-pump, micro-cameras, infrared transceivers, ultrasonic modules, and prism-based reflectors.
  • the eyewear is configured to dynamically adjust its focal length and light transmittance in real time, based on ambient illumination, user-specific pupillary responses, and object distance, without requiring mechanical translation.
  • Ocular surface imaging, fatigue detection, and predictive analysis of hemodynamic and balance-related abnormalities are enabled through infrared tracking, biometric learning, and high-frequency wave-based scanning, including ultrasonic ptychography.
  • Emergency detection is further supported by integrated gyroscopic sensors that initiate alert protocols upon detecting sudden acceleration or loss of consciousness, communicating wirelessly with smartphones, vehicles, or neural chips for immediate intervention.
  • the processor operates through parallel optical and electrical channels, allowing rapid control of visual parameters and environmental adaptation.
  • a brain-machine interface enables two-way neural communication, while the processor maintains override logic to ensure visual safety in contradiction scenarios.
  • This claimed invention can be searched via its International Patent Classification (IPC) codes and Cooperative Patent Classification (CPC) codes, A61B3/00, A61B3/103, A61B3/16, A61B5/00, A61B5/0075, A61B5/0205, A61B5/02427, A61B5/1116, A61B5/1117, A61B5/4836, A61B5/6821, A61F9/00, A61H39/00, G02B21/0032, G02B26/0808, G02B2027/014, G02B2027/0187, G02B27/00, G02B27/0075, G02B27/01, G02B27/0093, G08B21/0492, G02C7/101, G02C13/005, G06F3/012, G06F3/013, G06V10/44, G06V40/166, H02J7/00 and H04N23, in search engines and international online databases.
  • a diagnostic eyewear system for assessing eye movement functions in immersive environments.
  • the system includes a head-mounted display, an eye sensor (video camera), and a head orientation sensor to measure saccades, vergence, eyelid closure, and gaze tracking. It presents virtual, augmented, or 3D synthetic content to stimulate ocular response. Using Fourier transform analysis, it generates vertical and horizontal gain signals for clinical interpretation of eye behavior. This allows non-invasive evaluation of ocular performance in both medical and research applications.
  • an interactive head-mounted eyepiece system for overlaying digital content onto a user's natural field of view.
  • the system includes an integrated image source and processor for handling visual content, which is introduced into an optical assembly containing a curved polarizing film and a reflective image display.
  • a light source directs illumination toward the curved film, which reflects light to the image display.
  • the reflected image is then passed through an optically flat film and a partially reflective curved mirror, allowing both the displayed and real-world scenes to merge into a unified visual output for the user.
  • an interactive head-mounted eyepiece is introduced for combining digital and real-world visuals through an optical assembly.
  • the system includes an integrated image source and processor for delivering content to the user’s eye.
  • the optical assembly features an optically flat film positioned at an angle in front of the eye, which partially reflects image light from the display while simultaneously allowing ambient scene light to pass through. This results in a composite image formed by both reflected display content and the transmitted real-world view, enabling a seamless augmented reality experience.
  • the system integrates an image source and processor within an eyepiece, where an LED lighting system is positioned along the edge of a wedge-shaped light guide.
  • the angled geometry of the wedge redirects light to uniformly illuminate a reflective image display.
  • This display reflects the generated image through the light guide and optical assembly, allowing the user to view both digital content and the real environment simultaneously.
  • the configuration enhances brightness distribution and optical clarity for augmented reality applications.
  • a dynamic see-through eyewear system is presented with intelligent brightness modulation.
  • the invention comprises an interactive head-mounted display with a see-through optical assembly, an integrated image source, and a processor capable of modifying brightness based on environmental light.
  • a key feature is the auto-brightness control, which adjusts the brightness of specific areas of the display independently, using components like electrochromic materials or liquid crystal devices. This system ensures adaptive visibility in changing lighting conditions and enhances user comfort by aligning display brightness with natural eye adaptation.
  • a calibration system for eye tracking in wearable heads-up displays (WHUDs).
  • the invention utilizes a calibration point model composed of multiple gaze targets, which is dynamically adjusted based on the user’s real-time interaction with user interface (UI) elements.
  • These UI elements are strategically designed to support seamless, in-use calibration, ensuring ongoing gaze precision without disrupting user experience. This approach enables adaptive eye tracking for enhanced visual responsiveness and display interaction in WHUDs.
  • a faceguard system for evaluating eye muscle responses to head impacts.
  • the invention integrates an eye sensor, such as a video camera, and a head orientation sensor into a faceguard with an open visual aperture.
  • the system measures eyeball movement, pupil size, and eyelid behavior, alongside pitch and yaw of the head.
  • An electronic circuit processes data from both sensors to assess ocular performance in response to motion or impact, enabling real-time monitoring of neurological or motor disturbances. This has applications in sports safety and concussion detection.
  • the disclosed system includes an optical assembly for displaying virtual content overlaid on real-world environments, an integrated processor, a mounted camera for gesture recognition, and a wireless communication module that connects users via an online gaming platform.
  • the system can interpret player gestures and relay gesture-related data between multiple users wearing similar devices. It also supports body-mounted controllers, motion sensors, and smartphone integration to enrich gameplay.
  • the interactive display supports 3D visuals, promoting immersive multiplayer experiences across social and gaming networks.
  • a portable ocular reflex measuring device for evaluating eye movement responses to head motion in real-world settings.
  • the system includes an eye orientation sensor (e.g., image detector, magnetic sensor) and a head orientation sensor (e.g., gyroscope, magnetometer, accelerometer) to track pitch and yaw between 0.01 Hz and 15 Hz.
  • a central processing unit compares data from both sensors, applies Fourier transforms, and calculates vertical/horizontal gain and phase to assess vestibulo-ocular reflex and related visual stability metrics.
  • the invention is optimized for dynamic occupational environments and enhances real-time visual performance monitoring.
  • This innovation provides a non-invasive, adaptive, and patient-specific method for managing sensory and neuro-vestibular disorders.
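The Fourier-based gain and phase computation described in the bullets above can be sketched as follows. This is a minimal illustration rather than the patent's implementation; the signal names, sampling convention, and Hann window are assumptions:

```python
import numpy as np

def vor_gain_phase(head_deg, eye_deg, fs_hz, stim_hz):
    """Estimate vestibulo-ocular reflex gain and phase at the stimulus frequency.

    head_deg, eye_deg: equal-length arrays of head and eye angular position (degrees).
    fs_hz: sampling rate; stim_hz: head-motion frequency of interest (0.01-15 Hz).
    """
    n = len(head_deg)
    window = np.hanning(n)                      # reduce spectral leakage
    freqs = np.fft.rfftfreq(n, d=1.0 / fs_hz)
    head_spec = np.fft.rfft(head_deg * window)
    eye_spec = np.fft.rfft(eye_deg * window)
    k = np.argmin(np.abs(freqs - stim_hz))      # FFT bin nearest the stimulus
    gain = np.abs(eye_spec[k]) / np.abs(head_spec[k])
    phase_deg = np.degrees(np.angle(eye_spec[k]) - np.angle(head_spec[k]))
    return gain, phase_deg   # an ideal compensatory VOR gives gain near 1, phase near 180 deg
```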
  • a dynamic vision correction system uses an advanced liquid crystal-based optical element responsive to eye movement.
  • the invention integrates a liquid crystal layer, a plurality of unit electrodes (each comprising a first and second electrode), and resistive layers strategically placed between these electrodes. These resistive layers have an intermediate electrical resistivity—greater than that of the electrodes but lower than an electrical insulator—to allow precise control of light refraction.
  • a control unit forms a potential gradient across the liquid crystal layer by applying a voltage to the unit electrodes, enabling tunable optical power.
  • An eye detection unit tracks the wearer’s gaze direction in real-time. When the gaze moves downward or inward relative to a set reference, the optical power is automatically increased toward the positive side, effectively adjusting lens strength dynamically.
  • This invention enables personalized, gaze-responsive lens adaptation for enhanced visual correction in various viewing conditions.
  • a transparent combiner redirects the emitted light beams toward the user’s eye.
  • An infrared detector captures reflected IR light from the user's eye, enabling accurate gaze tracking.
  • a holographic optical element (HOE) may be used to apply different optical powers to visible and infrared beams, enhancing image clarity while preserving tracking accuracy. This design enables seamless integration of display and tracking within augmented and mixed reality wearables, enhancing both user experience and interactivity.
  • a computer then processes both data streams and analyzes the correlation between the PPG and iPPG signals.
  • the system can detect subtle physiological responses—such as allergic reactions, migraines, strokes, stress responses, emotional changes, and blood pressure variations—with improved reliability.
  • This approach enhances real-time health monitoring by cross-validating optical biosignals, and is particularly valuable in wearable health tech and telemedicine applications.
  • the system supports real-time wireless transmission of physiological signals—via radio waves, infrared, electromagnetic, or acoustic communication—to a remote processing station or to local output devices (e.g., visual/audio alerts).
  • Parameters monitored include brain activity, hydration, metabolic status, chemical compound levels in blood, and hydrodynamic conditions, enabling proactive clinical and therapeutic responses.
  • the invention comprises an anterior module (e.g., a lens frame) designed to hold one or more lenses within the user’s field of view, and a posterior module (e.g., a faceplate) engineered to conform to the contour of the wearer’s face.
  • a suspension assembly may connect the two modules, enabling articulation between them. This mechanism ensures even force distribution across the user's face, enhancing comfort during prolonged wear.
  • the design may also incorporate lens interchange systems, such as roll-off or tear-off mechanisms, to support quick and clean vision adjustments in demanding environments like motorsports or industrial use.
  • the selected frame undergoes 3D model adjustments to better conform to the user’s unique facial contours.
  • a rendered preview is displayed over the scanned facial model, providing a visual fitting simulation.
  • the system generates adjustment instructions, including annotated visuals, for physically modifying a real frame to match the digitally customized model.
  • the platform also enables anonymous aggregation of fitting data across users, enhancing frame design and inventory optimization strategies for future production.
  • an optical transfer function (OTF) is derived.
  • the OTF is applied in a non-blind deconvolution process to correct aberrations in the incoherent fluorescence image, generating a high-fidelity, aberration-corrected monochromatic output. This dual-mode approach enhances both resolution and accuracy, especially in biological imaging applications.
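The patent text does not specify the deconvolution algorithm; a Wiener filter is one standard choice for non-blind, OTF-based correction, sketched here with an assumed noise-to-signal constant:

```python
import numpy as np

def wiener_deconvolve(image, otf, nsr=1e-2):
    """Non-blind frequency-domain deconvolution using a known OTF.

    image: 2D real array (aberrated fluorescence image).
    otf: complex optical transfer function, same shape as the image's FFT.
    nsr: assumed noise-to-signal ratio acting as regularization.
    """
    image_f = np.fft.fft2(image)
    restored_f = image_f * np.conj(otf) / (np.abs(otf) ** 2 + nsr)
    return np.real(np.fft.ifft2(restored_f))
```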
  • a device and method are introduced for enhancing vision in individuals with low vision conditions, such as age-related macular degeneration (AMD), by dynamically modifying the presentation of visual data based on eye-tracking and visual distortion mapping.
  • the invention comprises a wearable display system with at least one processor, an eye-tracking assembly, and embedded data indicating areas of compromised perception in the user’s visual field. By identifying less functional areas of the retina, the system redirects portions of the visual scene—which would normally fall into the damaged zones—to healthier retinal regions, ensuring more effective perception. Additionally, the system alters light frequency, intensity distribution, and color perception to enhance clarity and reduce visual distortion.
  • This invention offers a personalized image redirection technique, enabling real-time correction and vision optimization for individuals with degenerative visual impairments.
  • the method then applies deconvolution algorithms on the full-pupil and coded-aperture images to remove both sample-induced and system-induced aberrations, producing a sharply resolved and aberration-corrected image.
  • This approach enhances image clarity and precision for biomedical and material science imaging applications.
  • a second folded write image is obtained, which may represent the removed material layer, the updated sample structure, or even be used to infer internal electrical potential and dopant concentration within the sample. This method enhances imaging precision, structural control, and analytical capabilities, especially in high-resolution materials science and semiconductor applications.
  • the lens design ensures ghost-free imaging at all distances and features a sophisticated power profile, characterized by steep localized transitions on the order of 2.5 diopters and multiple local minima and maxima across radial ranges. These features enhance optical clarity while supporting dynamic accommodation and visual comfort.
  • This fusion of sensory and social input enables the system to curate and publish events in real-time to dedicated feeds, generate highlight reels, and offer personalized recommendations (e.g., friends, purchases, activities).
  • the platform also supports tag-based filtering and retrospective analysis, enhancing user engagement and data-driven storytelling.
  • a self-adjusting vision enhancement system that enables real-time, passive correction of visual acuity using dynamic lens adjustments based on eye-tracking and distance estimation.
  • the invention comprises an eyeglass frame equipped with an imaging subsystem (such as pupil-detecting cameras), an infrared illumination subsystem, a set of adjustable-focus lenses, and an embedded controller.
  • the system operates by capturing images of the user's pupils and calculating viewing angles and estimated viewing distances using image histograms and pupil feature analysis. Based on these calculations, the controller dynamically adjusts the focal power of each lens to correct vision according to the user's current line of sight and distance from objects.
  • This adaptive autofocus functionality allows continuous clarity for varying focal distances, particularly benefiting users with presbyopia or other refraction-related conditions.
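One plausible reading of the pupil-based distance estimation above is vergence triangulation. The sketch below is illustrative only; the function name, the default interpupillary distance, and the 1/d near-add rule are assumptions:

```python
import math

def near_add_from_vergence(vergence_deg, ipd_m=0.063):
    """Estimate viewing distance from the eyes' convergence angle, then
    return the extra lens power (diopters) needed to focus at that distance."""
    half_angle = max(math.radians(vergence_deg) / 2.0, 1e-6)  # guard against zero vergence
    distance_m = (ipd_m / 2.0) / math.tan(half_angle)         # near-triangle geometry
    return 1.0 / distance_m                                   # accommodative demand in diopters
```

A controller could clip this value to the lenses' tunable range and feed it to the adjustable-focus elements on each frame.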
  • the invention is applicable in 3D displays, virtual and augmented reality, telepresence, and other immersive systems. It provides adaptive visual correction and optimized viewing experiences, making it suitable for users needing enhanced depth perception, clarity, and focus precision in interactive environments.
  • the system features a target display, allowing it to conduct visual acuity assessments by displaying optotypes of varying sizes and recording the user's responses.
  • the invention enables non-invasive, high-resolution retinal imaging and vision testing, making it highly suitable for ophthalmic diagnostic centers and remote patient monitoring, while offering precision and user interactivity.
  • a control device is introduced to enhance interactive image acquisition and target tracking in systems mounted on moving platforms.
  • the invention features a touchscreen display that shows real-time images captured by an imaging device supported by a movable structure.
  • when the user performs a touch gesture, such as a long press, double tap, or deep touch, the system displays a zoom control menu, allowing the user to select a desired zoom level.
  • the device simultaneously executes automatic zoom adjustment and orientation control of the imaging device relative to the selected target.
  • the system further includes on-screen zoom controls, such as zoom-in, zoom-out, and user-configurable default zoom presets.
  • based on user input, the control device transmits positional and zoom data to the imaging device, its supporting mechanism, or the moving body itself, enabling precise real-time adjustments.
  • the invention supports automatic calculation of coordinate data and movement ratios for the zoom level and device orientation, ensuring that the target remains centered or optimally located in the display.
  • the system enables near-simultaneous or synchronized control over both zoom and posture of the imaging device, significantly improving automated tracking, framing, and focus in dynamic imaging environments.
  • the disclosed eyewear includes a forward-mounted supporting structure that houses several key components: an optical detector capable of measuring various types of radiation such as ultraviolet (UV), infrared (IR), or visible light, and an electronic circuit that processes the data captured by the detector to generate radiation-related insights for the user. In addition, the system incorporates a motion detector to determine whether the eyewear is actively being worn, enabling context-aware functionality.
  • a wireless communication module embedded within the same forward structure, allows the eyewear to transmit or receive data wirelessly.
  • a controller is operatively connected to both the wireless module and the processing circuit, managing the overall operation of the system. All of these electronic components are mounted on an internal circuit substrate, enabling a compact and efficient integration within the eyewear frame.
  • This invention supports health monitoring, environmental sensing, and smart notifications, advancing the functionality of wearable devices in daily and professional applications.
  • a wearable system for detecting posture-related cardiovascular conditions such as orthostatic hypotension (OH) and postural-orthostatic tachycardia syndrome (POTS).
  • the system comprises a head-mounted device equipped with photoplethysmographic (PPG) sensors to measure blood flow and pulse signals on the user’s head, alongside a head-mounted camera designed to capture images that indicate changes in the user's posture.
  • a processing unit or computer analyzes the PPG signal to estimate systolic and diastolic blood pressure and correlates this information with detected postural transitions—such as moving from a supine to a sitting position, or from sitting to standing. If the blood pressure drops below a defined systolic or diastolic threshold within a specific timeframe following the change in posture, the system automatically flags the condition as orthostatic hypotension.
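The patent leaves the exact blood-pressure thresholds unspecified; the sketch below uses the common clinical convention for orthostatic hypotension (a drop of at least 20 mmHg systolic or 10 mmHg diastolic within about three minutes of standing), and all names and default values are assumptions:

```python
def flag_orthostatic_hypotension(sys_before, sys_after, dia_before, dia_after,
                                 seconds_since_posture_change, window_s=180.0,
                                 sys_drop_mmhg=20.0, dia_drop_mmhg=10.0):
    """Flag OH if the PPG-estimated blood pressure falls past a threshold
    within the defined window after a detected postural transition."""
    if seconds_since_posture_change > window_s:
        return False  # measurement falls outside the post-transition timeframe
    return ((sys_before - sys_after) >= sys_drop_mmhg or
            (dia_before - dia_after) >= dia_drop_mmhg)
```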
  • a smart eyewear system that dynamically adjusts visual content based on how the glasses are worn and user-specific physiological feedback.
  • the wearable glasses comprise a display for showing images, a sensor to detect user-specific eye data and wear state information, and a processor that adapts image properties accordingly.
  • the system determines the movement direction and distance of the glasses relative to the user’s eyes using sensor data, and detects changes in pupil size. Based on this real-time information, the processor adjusts both the image size (based on alignment and distance) and brightness (based on pupil dilation). This allows the display to render visuals with optimal clarity and comfort.
  • the invention enables context-aware image rendering by adapting to wearer posture and eye response, enhancing the viewing experience in augmented or wearable display technologies.
  • Each resistive layer is positioned to overlap partially with both the first and second electrodes, creating a refined gradient across the lens surface.
  • the resistance values are engineered to be higher than that of the electrodes but lower than insulating materials, enabling precise voltage control without signal degradation.
  • the design can include two optical elements per eye—one for variable focal length adjustment and the other for directional deflection—providing a multifocal or dynamically deflective visual enhancement. This innovation offers personalized visual support in real-time, adapting automatically to gaze shifts and enhancing optical comfort and functionality across various activities.
  • This information is used to correct for sensor shifts, thereby improving the accuracy and consistency of facial landmark detection.
  • the key advantage lies in the asynchronous operation—the system detects facial landmarks at a rate higher than the image capture rate, relying on real-time optical reflections enhanced by camera-derived corrections. This results in a robust facial tracking system suitable for applications in AR/VR devices, emotion recognition, and gesture-based interfaces, where both precision and temporal resolution are critical.
  • the system’s embedded computer analyzes these images to calculate hemoglobin concentration values across at least three facial regions—an indicator associated with circulatory and thermal regulation changes that often accompany fever or intoxication.
  • the system can accurately detect signs of fever or abnormal physiological states, enhancing both precision and reliability compared to single-sensor approaches.
  • This invention is especially valuable for non-invasive, continuous health monitoring, with applications in telemedicine, public health screening, workplace safety, and wearable diagnostics.
  • a wearable system that allows users to interactively customize visual display settings for enhanced user experience.
  • the invention comprises an eyeglasses-type device attachable to a display, which presents a projected visual image.
  • the system includes an optical component set integrated with lenses and two user-operable adjustment members.
  • the first member enables the user to modify the projection angle of the visual image, allowing the display position to be tailored based on the shape or size of the user’s head. This ensures ergonomic alignment and optimal viewing comfort.
  • the second adjustment member controls visual image parameters such as color tone and brightness, offering adaptive customization based on lighting conditions or user preference. This dual-control mechanism enhances both the functionality and personalizability of head-mounted displays in wearable applications.
  • an innovative augmented reality (AR) eyeglasses system that combines gesture detection and eye motion tracking for interactive data input.
  • the device includes right and left eye frames aligned with the user's eyes, as well as nose pads equipped with eye motion detection electrodes. These electrodes function as sightline detection sensors to track eye movements, including direction and winks.
  • transmitter/receiver electrodes placed on parts of the eye frames act as gesture detectors to sense the user's hand or finger movements.
  • the system features a display capable of projecting a three-dimensional AR image that overlaps the real-world view, with separate right and left images adjusted by convergence angle to create depth.
  • the angle of convergence is limited to 20 degrees or less for visual comfort.
  • This wearable device uses two types of inputs: input A (hand/finger gesture detected by the gesture detector) and input B (eye motion detected by the eye motion detector), allowing users to interact with AR content intuitively.
  • the processor controls the AR interface by recognizing a user’s gaze and gesture, dynamically modifying the AR image accordingly and enabling actions such as cursor movement triggered by a detected wink. This fusion of eye and gesture control results in a highly responsive and immersive user experience in wearable AR systems.
  • a system for identifying the onset and monitoring the progression of respiratory tract infections (RTIs) such as COVID-19 by analyzing coughing patterns.
  • the invention employs smart glasses embedded with an acoustic sensor and a movement sensor, both mounted at fixed positions relative to the user's head to ensure consistent data collection.
  • the system operates by capturing real-time acoustic and motion data associated with coughing episodes. These new data samples are then compared to previously recorded baseline measurements taken when the user had a known extent of RTI.
  • the system specifically analyzes changes in cough-related sounds and head movements during these episodes.
  • a computer analyzes both sets of data — the current coughing-related measurements and the baseline reference — to detect deviations. These deviations indicate changes in the condition of the user's respiratory system, providing an early warning of infection, and also allowing for ongoing tracking of disease progression or improvement. This approach enables continuous, non-invasive monitoring using wearable technology, supporting early diagnosis, remote health surveillance, and potentially reducing the need for more invasive or clinical evaluations in the early stages of respiratory infections.
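As a minimal illustration of the baseline-comparison step just described, one could reduce each coughing episode to a feature vector and score its normalized distance from the baseline. The feature choices and the thresholding idea are assumptions, not the patent's stated method:

```python
import numpy as np

def cough_deviation_score(current_features, baseline_features):
    """Relative distance between current and baseline cough feature vectors
    (e.g., spectral centroid, cough rate, head-jerk amplitude - all assumed)."""
    cur = np.asarray(current_features, dtype=float)
    base = np.asarray(baseline_features, dtype=float)
    return float(np.linalg.norm((cur - base) / (np.abs(base) + 1e-9)))

# A score above a per-user calibrated threshold would suggest the respiratory
# condition has changed relative to the known-extent baseline.
```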
  • the system employs machine learning algorithms to process and learn from users’ eye movement behavior over time. These algorithms help model the user’s subjective eye movement intention, allowing the system to predict and adapt to each user's unique interaction style. As a result, it reduces unintended selections and enhances the responsiveness and user experience of eye-controlled interfaces.
  • This invention is particularly valuable for assistive technologies, augmented reality (AR), and head-mounted displays, where hands-free and precise control is critical.
  • the system significantly advances the field of eye-tracking-based human-machine interaction.
  • an interactive head-mounted eyepiece for displaying digital content over real-world views.
  • the system includes an integrated processor and image source that directs light through a curved polarizing film onto a reflective display.
  • a key feature is the optical assembly containing a partially reflective, partially transmitting element that reflects image light while allowing ambient scene light to pass through. This creates a combined image of both digital content and the user’s environment, enabling seamless augmented reality experiences in a compact, wearable form—suitable for navigation, real-time guidance, or mixed-reality applications.
  • a wearable earpiece-based system is introduced for real-time health and motion monitoring of a subject.
  • the system integrates motion sensors and physiological sensors within an earpiece, such as a headset or hearing aid, to collect head motion, footstep motion, and pulse rate data.
  • an embedded processor analyzes this data to detect events such as falls or immobility, and filters out motion artifacts from the physiological readings.
  • the processed data is then transmitted to a remote device, which can provide corrective instructions directly to the user or notify a third party for assistance.
  • This multi-sensor system enables accurate health monitoring and real-time response, supporting applications in elder care, emergency response, and continuous wellness tracking, all through a discreet, wearable form factor.
  • the glasses comprise a main frame body with lenses, a first and second leg, two sensors, and a circuit.
  • the first and second sensors are integrated into the respective legs of the glasses, and the circuit is connected to both.
  • These sensors detect usage-related data—such as whether the glasses are currently worn—enabling the circuit to automatically adjust the operational state. If the sensors detect that the user is not wearing the glasses, the system enters a low-power consumption mode, significantly reducing energy usage and extending the device's standby time. This intelligent control mechanism enhances both usability and battery longevity, making the eyewear more efficient for everyday use.
  • a polarization analyzer calculates Stokes parameters (s2 and s3) corresponding to sine and cosine functions of the phase shift (φ) caused by rotation.
  • a processing unit interprets these parameters to determine the precise rotation rate. This design provides a compact, interference-free gyroscopic solution, enhancing stability and accuracy in motion detection applications.
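A minimal sketch of that recovery step, assuming s2 ∝ cos(φ) and s3 ∝ sin(φ) and the standard Sagnac scale factor for an N-turn fiber coil (both assumptions beyond the bullet's wording):

```python
import math

def rotation_rate_from_stokes(s2, s3, coil_turns, coil_area_m2,
                              wavelength_m=1.55e-6, c_mps=2.998e8):
    """Recover the rotation-induced phase from the Stokes pair, then convert
    to a rotation rate using the Sagnac scale factor K = 8*pi*N*A / (lambda*c)."""
    phi = math.atan2(s3, s2)                                  # rotation-induced phase shift
    scale_factor = 8.0 * math.pi * coil_turns * coil_area_m2 / (wavelength_m * c_mps)
    return phi / scale_factor                                 # rad/s
```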
  • the phase modulator is embedded in a substrate cavity and precisely aligned with the multilayer waveguides, both horizontally and vertically, through sidewall exposure. This design significantly increases the coil length and sensitivity of the gyroscope within a miniaturized footprint, improving performance for inertial navigation and motion sensing applications.
  • This invention addresses multiple technical challenges related to ocular health, adaptive vision, and emergency detection.
  • Existing eyewear cannot detect sudden hemodynamic events, fatigue, or balance loss, nor can it respond dynamically to focal distance shifts or varying light intensities.
  • the present invention introduces a sophisticated pair of smart eyeglasses equipped with multilayer adaptive lenses, electrochromic gel, and advanced sensors.
  • the system employs visible and ultraviolet sensors to independently measure environmental light, dynamically adjusting lens opacity for glare protection and UV filtration.
  • a micro-pump modulates electrochromic gel volume to alter the lens’ internal curvature, simulating natural focal adaptation based on object distance and pupil behavior.
  • Micro-cameras and infrared modules collect ocular imagery, while embedded prisms allow triangulated vision and wave diffraction for real-time analysis of pupil diameter, eye motion, and fatigue.
  • ultrasonic imaging and optical ptychography provide deep-layer eye scanning to detect retinal or neurological complications, such as ocular stroke.
  • a gyroscope-based fall detection algorithm monitors acceleration and triggers safety verification or emergency alerts via smartphone or vehicle ECU integration.
  • Wireless communication modules enable proactive interventions, like slowing a vehicle or alerting emergency services.
  • Machine learning algorithms personalize visual responses by analyzing user-specific pupillary dynamics, light sensitivity, and visual habits.
  • the processor is equipped with optical ports, eliminating extra emitters, and supports low-latency control of the entire sensory array. Overall, this invention combines optics, sensor fusion, AI learning, and biomedical diagnostics into a wearable solution, enabling non-invasive monitoring and dynamic adaptation for vision optimization and safety.
  • the existing technical problem that this invention tries to solve is the early detection of sudden motor, balance, and hemodynamic disorders in the person wearing the glasses, which can be flagged by analyzing the pupil diameter and processing unusual glare. These glasses also measure several vital parameters to diagnose fatigue and the level of alertness, based on the particle diffraction velocity parameter and on scanning the superficial, semi-deep, and deep layers of the eye. To handle ultraviolet radiation and visible light separately, a dedicated detector for each is used to control the amount of abnormal intensity entering the eye. Another problem these glasses address is the fixed focal length of a conventional lens: this invention allows the focal length to be changed while maintaining the hardness of the lenses' contact surfaces.
  • the claimed multimodal smart eyeglasses (1) for early detection of sudden motor, balance, and hemodynamic disorders, with the ability to adapt to light and focal angle, are structurally similar to conventional glasses in terms of a hard frame (2) and arms (13) designed to rest on the ears.
  • substantial modifications have been made to the lens (3) and frame (2), including the integration of a light level detection sensor (11) and an ultraviolet radiation level detection sensor (10).
  • the arms (13) also house the batteries (9), a charger input port (12), an electrochromic gel volume control micropump (7), and a movable hinge (8).
  • the lens within the hard frame is surrounded by a cord (6).
  • the design of the glasses in this invention incorporates multi-modal sensory and adaptive systems into a compact wearable form factor.
  • the light level detection sensor (11) and ultraviolet radiation level detection sensor (10) provide real-time photometric analysis, enabling the system to differentiate between visible-spectrum intensity and high-energy UV radiation.
  • This dual-sensor arrangement facilitates intelligent regulation of incoming light via modulation of the electrochromic gel (21), a substance whose optical transmittance adjusts in response to electrical stimuli.
  • the electrochromic gel volume control micropump (7) functions as a microfluidic actuator, altering the curvature or refractive index of the lens by displacing or retracting volumes of gel between two rigid lens layers.
  • the battery (9) and charger input port (12), embedded within the arms (13), provide a continuous power supply while distributing electronic components ergonomically to maintain user comfort.
  • the movable hinge (8) serves both structural and functional purposes by embedding microtubing and conductive wiring to enable electrochemical modulation and high-speed data transfer, while allowing sufficient flexibility for head movement.
  • the surrounding cord (6) is composed of viscoelastic composite materials with frequency-dependent stiffness properties.
  • under rapid, high-frequency stresses, molecular alignment induces temporary rigidity, maintaining optical structural integrity during dynamic motion.
  • under low-frequency biomechanical stresses (below roughly 1 Hz), the material becomes pliable, allowing controlled deformation of the lens (3) to adjust focal depth accordingly.
  • the fixed hinge (5) acts as a reinforced structural anchor designed to evenly distribute stress generated by peripheral optical modules and motion sensors. It interfaces with the movable hinge (8) through a hollow shaft (4), which serves as a multi-channel conduit for gel injection lines, optical fibers, and power wiring required for active optical components.
  • the piezo oscillator (16), secured by screws (15), functions as an ultrasonic actuator, enabling dynamic wave-based measurements through the inner hard lens layer (23) to support real-time biomechanical and hemodynamic scanning.
  • This specially designed cord (6) changes its elasticity early in response to motor, balance, and hemodynamic disturbances.
  • under rapid stresses, it behaves as a rigid material, stabilizing the hard outer layer of the lens (22).
  • under slow, sustained stresses, it acts like elastic rubber, enabling the lens (3) to change in thickness and thereby adapt its focal properties.
  • the fixed hinge (5), which is connected to the frame (2), is deliberately dimensioned to support and distribute the weight of integrated components as effectively as possible.
  • the movable hinge (8) is connected to the fixed hinge (5) via a hollow shaft (4).
  • these hinges are designed to facilitate the flow of electrochromic gel (21) and house electrical wiring for powering the optical and sensory components.
  • the structure is secured by a bottom screw (14) connected to the shaft (4), ensuring mechanical integrity and containment of the fluidic and electrical pathways.
  • the piezo oscillator (16) is mounted in contact with the hard-inner lens layer (23), and fixed in place using screws (15).
  • the piezo oscillator (16) functions by generating mechanical vibrations in the inner lens layer (23), enabling real-time measurement and monitoring of parameters related to balance, movement, and hemodynamic responses. These measurements are derived by evaluating the velocity and frequency-phase changes of the returning waves, which are directly influenced by the biomechanical state of the eye.
  • three prisms (17) are embedded to support the wave emission and reflection system. Their placement in the lower region is intentional, to avoid interference from the upper eyelid (32) and upper eyelashes (31), which may obstruct sensor line-of-sight from above.
  • the three prisms (17) re-emit waves at a frequency of 1 GHz and serve as a lightweight and compact alternative to deploying multiple cameras or infrared emitters. This approach reduces the overall weight and electronic density of the glasses, minimizing wiring complexity and the risk of component failure while maintaining high-resolution spatial detection.
  • the piezo oscillator (16) emits controlled ultrasonic pulses in the MHz range, specifically tuned to elicit responses from biological tissues such as the cornea, aqueous humor, and iris. By analyzing return velocities and phase shifts of these pulses, the system calculates key biomechanical metrics including intraocular pressure, micro-oscillatory movements, and vascular pulsatility. These indicators are critical for early detection of ocular hemodynamic anomalies such as ischemic micro-events or early-stage retinal or optic nerve strokes.
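The bullet above converts return-pulse phase and frequency shifts into tissue velocities; the classic pulsed-Doppler relation is one way to perform that conversion. The soft-tissue sound speed and the axial-beam simplification are my assumptions, not the patent's:

```python
def doppler_velocity(freq_shift_hz, carrier_hz, sound_speed_mps=1540.0):
    """Axial reflector velocity from the measured Doppler shift:
    v = f_d * c / (2 * f0), assuming the beam is aligned with the motion
    and c ~ 1540 m/s (a typical soft-tissue value)."""
    return freq_shift_hz * sound_speed_mps / (2.0 * carrier_hz)
```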
  • This optical configuration supports accurate triangulation of both optical and acoustic return signals.
  • the use of optical multiplexing within each prism module allows three-directional wave emission and detection using a single component, enhancing spatial resolution without requiring additional emitter-receiver pairs. This design significantly reduces power consumption and wiring requirements while preserving full field-of-view tracking.
  • the micro-camera (18) is developed by modifying existing wide-angle imaging hardware. Adaptations to the image sensor—whether CCD or CMOS—are applied to suit biomedical imaging needs. These modifications are similar to those seen in 180-degree automotive cameras, optimized here for close-range, high-resolution capture. Additionally, an infrared transmitter and receiver (19) are installed in close proximity to the micro-camera, allowing high-fidelity imaging and reflective wave analysis. Reflections are captured from three angles using the prisms (17), facilitating wave diffraction-based biometric assessments. For these processes to function reliably, unobstructed wave return paths (34) are ensured through careful spatial engineering of the internal optical geometry.
  • the three-point micro-camera (18), embedded within the movable hinge (8), provides stereoscopic imaging and real-time ocular monitoring through angular imaging paths (33).
  • the stereo-pair configuration allows simultaneous capture of both front-facing environmental views and inward-facing ocular surfaces (e.g., sclera, pupil, iris).
  • This method eliminates the need for additional mirrors or prisms.
  • changing the shape of the lens can be used to modulate the focal length.
  • the first method is to change the thickness using lateral pressure perpendicular to the optical axis of the lens. By increasing this pressure, the thickness increases at the center, while remaining relatively constant at the periphery.
  • This method is suitable for transparent, flexible lenses.
  • the second method changes the lens diameter, which in turn alters its refractive behavior.
  • by adjusting the pressure—and consequently the volume—of the electrochromic gel (21), the inter-laminar distance between the lens layers (22 and 20) is modulated, enabling image clarity at different focal depths inside the eye.
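The patent does not state the underlying optics math; the standard thin-lens (lensmaker's) relation, sketched below, shows how a gel-driven change in internal curvature maps to focal power. The parameter names and refractive index are assumptions:

```python
def thin_lens_power(n_gel, r_front_m, r_back_m=float("inf")):
    """Optical power (diopters) of the gel layer in the thin-lens approximation:
    P = (n - 1) * (1/R1 - 1/R2). A flat inner plate (20) corresponds to
    r_back_m = infinity, i.e. 1/R2 = 0."""
    return (n_gel - 1.0) * (1.0 / r_front_m - 1.0 / r_back_m)

# Pumping gel inward steepens the front curvature (smaller r_front_m),
# increasing power for near focus; withdrawing gel does the opposite.
```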
  • the system works in tandem with the infrared transceiver (19) to emit and detect low-intensity near-infrared (NIR) signals.
  • These signals are critical for tracking pupil dynamics, mapping ocular surface topography, and analyzing microvascular features using wave diffraction analysis—especially via Fresnel and Fourier ptychography techniques.
  • the embedded prism optics split and redirect light beams at fixed angles, simulating multi-perspective detection paths.
  • Return paths (34) are strategically designed to minimize signal loss during propagation and to enhance resolution, particularly for micromovements and fatigue diagnostics.
  • the modulation of the lens’ focal length is based on variable pressure applied to deform the internal gel layers without compromising the rigidity of the external surfaces.
  • This lateral pressure approach enables localized zonal deformation of the internal refractive index while preserving alignment along the optical axis.
  • the electrochromic gel (21) is actuated via micro-injected voltages from the control circuit, dynamically adjusting the thickness between the outer (22) and inner (20) hard lens plates. This adaptive optical response mimics the human eye’s natural accommodation mechanism, allowing for real-time focal adjustments across variable viewing distances.
  • the hard-inner layer of the lens may be engineered either as a flat surface (20) or a concave form (23), and in both cases, the object (30) is projected into the eye (32) with sufficient clarity at both long (31) and short (29) distances.
  • a key problem addressed by these glasses is the regulation of light entering the eye.
  • the photocell (11) measures ambient light levels (24), and accordingly determines the opacity level of the electrochromic gel (21).
  • This intelligent light adaptation system enables users with refractive conditions such as astigmatism, presbyopia, or hyperopia to experience improved visual acuity without changing glasses.
  • the photocell (11) continuously monitors environmental light intensity and dynamically adjusts the gel's opacity. This adjustment is informed by real-time analysis of pupillary diameter captured by the micro-camera (18). Together, these form a closed-loop bio-optical feedback system, offering personalized light filtering and enhanced comfort based on each user’s individual visual sensitivity.
  • the incoming ambient light directed toward the eyes via the optical interface (25) is dynamically regulated to preserve optimal retinal comfort and function.
  • the eyepiece (28), which functions as the anatomical aperture control (i.e., the pupil), inherently adjusts to light stimuli; however, its reactivity can be sluggish or insufficient in certain scenarios.
  • the real-time pupil diameter continuously tracked by the embedded micro-camera (18)—is transmitted to the processor, where it undergoes digital analysis. This biometric feedback, combined with ambient luminance measurements captured by the photocell (11), enables the glasses to generate a personalized adaptive lighting profile for each user.
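A minimal sketch of the closed-loop feedback just described, blending ambient luminance with the camera-measured pupil diameter into an opacity command. The gains, reference values, and log-luminance term are illustrative assumptions:

```python
import math

def target_gel_opacity(ambient_lux, pupil_mm, comfort_lux=500.0,
                       pupil_ref_mm=4.0, k_light=0.5, k_pupil=0.3):
    """Return an electrochromic-gel opacity command in [0, 1]: darken when the
    scene is brighter than the comfort level, and darken further when the
    pupil is wider than expected (a sign of individual light sensitivity)."""
    light_term = k_light * max(0.0, math.log10(max(ambient_lux, 1.0) / comfort_lux))
    pupil_term = k_pupil * max(0.0, (pupil_mm - pupil_ref_mm) / pupil_ref_mm)
    return min(1.0, light_term + pupil_term)
```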
  • the ultraviolet sensor (10) isolated from the visible spectrum detection systems, functions as a critical protective component. It autonomously detects UV radiation in real time—even when the user’s pupils are dilated due to darkness—and sends immediate corrective signals to the processor. Upon detection of excessive UV exposure, the system initiates two coordinated safety responses: (1) it triggers the electrochromic gel to increase opacity, effectively attenuating UV wave transmission, and (2) it simulates pupillary constriction by influencing the inner lens curvature, thereby narrowing the path of light entry. These actions protect the retina and prevent photochemical or thermal damage to the optic nerve (26). This mechanism is particularly effective in hazardous exposure scenarios, such as welding, where traditional photoprotective reactions are inadequate due to simultaneous pupil dilation and high UV flux.
  • the system learns the user's natural focusing behavior across a range of object distances. This is achieved using non-invasive, multi-frequency ultrasound scanning and time-resolved infrared reflectometry. These tools measure minute shifts in focal plane length and surface curvature of the biological lens (27). For individuals with accommodation deficiencies—e.g., in age-related presbyopia—the intelligent glasses adjust their own lens configuration to compensate for the insufficient thickening or flattening of the user's internal lens, restoring clarity at varying distances.
  • the focal adaptation system employs a machine-learning model trained on biometric data—including lens thickness changes, saccadic patterns, and blink frequency—to create a personalized response matrix.
  • This data-driven adaptation eliminates the need for progressive or bifocal lenses by enabling continuous and automatic modulation of focal power.
  • the system calibrates for each user’s tolerance to luminance shifts and contrast changes. This becomes especially important in occupations that involve rapid alternations between dark and bright environments, such as welding, long-haul driving, or screen-intensive tasks.
  • intelligent gel modulation ensures visual comfort, fatigue prevention, and long-term retinal protection.
  • the focal-length adaptation is executed by modulating the electrochromic gel’s distribution between rigid lens surfaces through microfluidic actuation.
  • the lens curvature is altered without deforming the external surfaces, preserving physical protection while enabling internal optical reframing.
  • the processor integrated into the system (36), housed in module (35) must be capable of extremely low-latency processing of biological and environmental inputs. It interfaces directly with infrared-based fiber optic lines through port (39) and connectors (44), eliminating the need for external emitters, photodiodes, or analog converters. This design ensures fast bidirectional transmission of infrared and optical signals with minimal energy loss or signal noise.
  • the embedded processor (36) is a quantum microcontroller designed for edge-computing applications with high-throughput signal analysis. It handles simultaneous optical, ultrasonic, and gyroscopic data flows. Port (39) enables direct communication over multiple optical fibers, allowing high-speed processing of multispectral wave data with precision and real-time responsiveness.
  • the processor supports multisensory fusion—integrating visual cues, photometric inputs, acoustic mapping, and motion analysis—to ensure accurate contextual awareness, enabling both diagnostic and preventive functionalities in dynamic environments.
  • the system also features wireless connectivity via port (38) and the wireless module (37), enabling communication with external smart platforms such as vehicles (49), smartphones, or brain-integrated computing devices (51).
  • this connectivity allows the glasses to transmit biometric and geolocation data to external systems. This integration is particularly important for safety-critical applications, such as driver monitoring. If signs of fatigue or distress are detected, the glasses trigger protocols within the vehicle ECU (53), which may include automatic speed reduction, flasher activation, or steering adjustment to guide the vehicle to a safe halt.
  • if a user operating a vehicle exhibits prolonged blink duration, slowed saccadic responses, or low variability in head movement, the glasses detect this pattern as cognitive or physical fatigue. Waves (50) are emitted to a connected dongle (52), which relays the signal to the vehicle’s onboard control unit (53).
  • the vehicle may autonomously initiate safety procedures, such as reducing throttle input, activating warning systems, or pulling to the shoulder.
  • the glasses communicate through the paired smartphone to contact emergency services and transmit GPS location, physiological parameters, and a brief health log.
  • the detection algorithm for dangerous conditions is driven by multi-parametric inputs: gyroscope signals from within the processor (36), infrared tracking from the emitter (19), and image analytics from the micro-camera (18).
  • This system fuses movement data from the embedded gyroscope (67)—including angular velocity, acceleration vectors, and orientation shifts—with biometric and optical inputs.
  • the fusion algorithm uses temporal modeling to distinguish between normal activity and potential emergencies (e.g., syncope, stroke, fall). It functions as a continuous health monitor that evaluates the kinetics and postural stability of the user.
  • when acceleration exceeds safety thresholds (e.g., from a fall or collision), the system activates a consciousness check protocol. This involves a looped verification prompt to the user through auditory or visual signals. If no response is detected within a defined timeframe, the glasses automatically initiate an emergency alert via the connected smartphone, transmitting distress signals, live biometric data, and location. This closed-loop system continues monitoring and awaits user acknowledgment to resume normal operations or remains in alert mode until external intervention.
  • the proposed algorithm governs emergency condition recognition by continuously evaluating acceleration data from the internal gyroscope.
  • let A represent the real-time acceleration value reported by the gyroscope system (67), and let D denote the predefined maximum acceleration threshold considered safe.
  • the algorithm operates on a temporal loop that executes once per second, assessing whether A exceeds D. If A surpasses D, indicating a potential emergency event such as a fall, abrupt impact, or unexpected head motion, the system sends a query to the user's smartphone or linked device to confirm consciousness and well-being. A response window is initiated, allowing the user a short duration to confirm they are alert and unharmed. If the user responds, the monitoring loop continues normally.
  • failure to respond within the allotted time triggers an automatic emergency protocol—sending an alert message with geolocation data to emergency contacts or medical services.
  • This failsafe algorithm ensures autonomous detection of life-threatening situations without external input and is particularly effective for the elderly or individuals at high risk of syncope or neurological impairment.
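The A-versus-D loop described above maps naturally onto a once-per-second monitor. The sketch below is illustrative; the callback names, acceleration units, and default values are assumptions:

```python
import time

def emergency_monitor(read_accel, prompt_user, send_alert,
                      threshold_d=2.5, response_window_s=15.0):
    """Per the algorithm above: every second, compare acceleration A against
    threshold D; on exceedance, query the paired device, and escalate to an
    emergency alert if the user does not confirm within the response window."""
    while True:
        a = read_accel()                                  # A: value from gyroscope system (67)
        if a > threshold_d:                               # A > D: possible fall or impact
            if not prompt_user(timeout_s=response_window_s):
                send_alert()                              # geolocation + biometrics to contacts
        time.sleep(1.0)                                   # temporal loop executes once per second
```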
  • integration with a brain-connected electronic interface enhances this system by introducing bidirectional cognitive and perceptual feedback.
  • the connection enables the claimed multimodal smart eyeglasses (1) to send processed visual and environmental data to the brain chip (51) in real time via directed waves (50).
  • the brain chip returns neural feedback signals (54) to the glasses, potentially adjusting image processing parameters, contrast, or brightness based on cognitive interpretation and visual demand.
  • This dynamic interaction allows the system to align vision augmentation with neurocognitive processing, optimizing user experience in real time.
  • the intelligent processing algorithm embedded in the processor (36) is designed to override brain feedback in the presence of hazardous environmental factors. For example, when ultraviolet (UV) exposure is abnormally high, but the brain chip signals demand for increased light due to perceived darkness, the glasses' logic prioritizes ocular safety. In such scenarios, processor (36) suppresses the UV exposure and modulates the electrochromic gel opacity accordingly, thereby protecting the retina and optic nerve from phototoxic damage despite neural override attempts.
  • This hierarchical control structure ensures fail-safe operation by deferring to physiological protection protocols whenever sensory and cognitive commands conflict; a minimal sketch of the override rule follows.
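As a sketch only, the override rule could look like the following; the UV limit and signal names are assumed, since only the priority ordering is specified above.

```python
from dataclasses import dataclass

# Hypothetical threshold; the description specifies the priority rule,
# not the numeric limit.
UV_SAFE_LIMIT = 6.0  # UV index above which ocular protection takes over

@dataclass
class Inputs:
    uv_index: float                  # from the ultraviolet sensor (10)
    neural_brightness_demand: float  # request from the brain chip (51), 0..1

def resolve_gel_opacity(s: Inputs) -> float:
    """Return electrochromic gel opacity in [0, 1] (1 = fully darkened).

    Physiological protection outranks cognitive demand: when UV exposure
    is hazardous, the neural request for more light is overridden.
    """
    if s.uv_index > UV_SAFE_LIMIT:
        return 1.0                        # fail-safe: darken regardless
    demand = min(max(s.neural_brightness_demand, 0.0), 1.0)
    return 1.0 - demand                   # otherwise honor the demand

# High UV with a strong neural demand for light still yields full opacity.
print(resolve_gel_opacity(Inputs(uv_index=8.2, neural_brightness_demand=0.9)))
```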
  • the processor (36) supports multiple I/O ports to manage all subsystems of the glasses.
  • Port (41) connects to the micro-camera (18) through connector (48), providing real-time ocular surface and environmental image input.
  • Port (42) interfaces with the photocell (11) and ultraviolet sensor (10) via connector (47), facilitating continuous ambient light and UV level measurement.
  • Port (43) is linked to the internal rechargeable battery (9), supplying regulated power to all active elements of the system.
  • a dedicated operational amplifier (45) is housed externally and connected via port (40) to amplify analog signals—particularly for driving the ultrasonic piezo oscillator (vibrator)—whose output is made accessible via connector (46).
  • the glasses' ultrasonic and optical wave systems leverage frequency-specific behavior for diagnostic imaging. At frequencies exceeding 1 GHz—where wave characteristics begin to resemble those of visible light—wavelength tuning is hardware-limited. That is, a light-emitting diode (LED) or wave source designed for a specific wavelength (e.g., 650 nm) cannot generate other wavelengths (e.g., 600 nm) without replacing the emitter.
  • LED light-emitting diode
  • ultrasonic wave modulation allows variable frequency operation (e.g., 5 MHz, 6 MHz, or 7 MHz) using the same output device by simply adjusting the oscillation rate from the processor (36). This flexible modulation facilitates multi-depth scanning without hardware redundancy.
  • Ultrasonic imaging is performed by emitting directed waves (55) toward the ocular structures. These waves reflect back (56) and are interpreted by return-phase analysis.
  • Short-wavelength ultrasound (59) reflects from superficial structures such as the cornea and sclera.
  • Mid-range frequencies (58) penetrate deeper into mid-layer regions, such as the ciliary body or choroid.
  • Low-frequency, long-wavelength ultrasound (57) achieves maximum penetration, mapping the posterior retina and optic nerve with minimal attenuation. The varied use of frequency-specific penetration enables stratified imaging and facilitates layered ocular reconstruction for diagnostic use; a frequency-selection sketch follows.
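A minimal sketch of the frequency selection under these assumptions (the 5-7 MHz values echo the example frequencies above; the layer-to-frequency mapping is illustrative):

```python
# Hypothetical mapping from target ocular layer to drive frequency.
LAYER_TO_FREQ_MHZ = {
    "cornea_sclera": 7.0,         # short wavelength (59): superficial scan
    "ciliary_body_choroid": 6.0,  # mid-range (58): mid-layer regions
    "retina_optic_nerve": 5.0,    # long wavelength (57): deepest penetration
}

def drive_frequency_mhz(target_layer: str) -> float:
    """Pick the oscillation rate the processor (36) applies to the piezo.

    The same transducer is reused for every depth; only the oscillation
    rate changes, which is the hardware economy described above.
    """
    return LAYER_TO_FREQ_MHZ[target_layer]

for layer, f in LAYER_TO_FREQ_MHZ.items():
    print(f"{layer}: drive piezo at {f} MHz")
```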
  • a quantum microcontroller performs high-speed computation and real-time analysis while maintaining ultra-low power consumption and heat generation.
  • This microcontroller is optimized for simultaneous processing of visual data from the micro-camera (18) and signal data from the ultrasonic and optical systems.
  • Transistors (60) at the output of port (42), paired with impedance-matching resistors (61), ensure precise signal modulation and minimize distortion.
  • Current-limiting resistors (65) at port (40) protect processor (36) from electrical surges.
  • the voltage stabilizer (62) regulates supply voltages, while port (39) connects to optical emitters and receivers—specifically LEDs (63) and phototransistors (64)—ensuring high-fidelity light transmission and detection over optical fibers. A worked resistor-sizing example follows.
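As a worked example of current limiting on such an emitter line, with assumed values (a 3.3 V rail, a 1.8 V LED forward drop, and a 10 mA target current, none of which are specified here):

```python
# Ohm's law sizing for a current-limiting resistor in series with an LED (63).
# All three values below are assumptions for illustration only.
V_SUPPLY = 3.3       # supply rail, volts
V_FORWARD = 1.8      # LED forward voltage drop, volts
I_TARGET = 0.010     # desired drive current, amperes

r_ohms = (V_SUPPLY - V_FORWARD) / I_TARGET
print(f"R = {r_ohms:.0f} ohm")  # 150 ohm limits the LED drive current
```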
  • the intelligent glasses utilize advanced multi-frequency ultrasound imaging in conjunction with optical ptychography to perform comprehensive non-invasive eye scanning.
  • This hybrid system builds a volumetric 3D model of the eye, layer by layer.
  • High-frequency ultrasound maps anterior regions like the cornea and aqueous humor, while lower frequencies penetrate the retina, choroid, and optic nerve head.
  • These layers are then fused with visual data from the micro-camera (18) and infrared reflectometry to produce a unified, highly resolved image profile.
  • Such multimodal scanning enables early detection of disorders such as ocular stroke, retinal ischemia, optic neuritis, and fluid-related abnormalities.
  • the ptychographic scanning methodology uses ultra-short IR pulses generated by the transceiver (19), which reflect at varying depths depending on tissue density and absorption characteristics. These reflected signals are analyzed for return time and phase shift, from which the depth, structure, and optical properties of each layer are inferred. This data is then geometrically modeled, and images captured by the micro-camera (18) supplement the model by providing surface fidelity and optical continuity, especially for flatter surfaces that reflect IR uniformly (see the phase-to-depth sketch below).
  • three prisms embedded in the lower frame are used to refract and redirect both incoming and outgoing wave signals, ensuring comprehensive optical coverage despite anatomical obstructions such as the upper eyelid or brow ridge.
  • the triangulation data from these prisms supports precise depth perception and coordinate localization of ocular abnormalities.
  • ultrasonic waves are employed to cross-check and refine these distance measurements between each tissue layer, enhancing the accuracy of 3D reconstruction. This rigorous scanning methodology provides a diagnostic-grade platform for continuous ocular monitoring.
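For the phase-shift analysis, a minimal sketch of the depth inference, assuming an IR wavelength of 1.5 um and a tissue refractive index of about 1.36 (both illustrative values, not taken from this description):

```python
import math

N_TISSUE = 1.36        # assumed refractive index of ocular tissue
WAVELENGTH_M = 1.5e-6  # assumed IR wavelength

def depth_from_phase(delta_phi_rad: float) -> float:
    """Infer layer depth from the measured return-phase shift.

    A round trip through depth d adds a phase of (4*pi*n/lambda)*d, so
    d = delta_phi * lambda / (4*pi*n). This holds within one wavelength
    of unambiguous range; real systems unwrap phase across pulses.
    """
    return delta_phi_rad * WAVELENGTH_M / (4 * math.pi * N_TISSUE)

# e.g., a pi/2 return-phase shift corresponds to roughly 0.14 um of depth
print(f"{depth_from_phase(math.pi / 2) * 1e6:.3f} um")
```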
  • By changing focal length on demand, these glasses eliminate the need for multiple pairs of glasses; and by measuring ultraviolet intensity independently of visible light, they minimize the damage caused by ultraviolet radiation in low-light environments.
  • the ability to detect sudden motor, balance, and hemodynamic disorders such as stroke and ocular stroke is another advantage.
  • collecting data for the diagnosis and treatment of optic nerve problems and other optical eye problems, and for measuring reaction speed to environmental stimuli at different times, is a further advantage.
  • the ability to learn from data when forming a response to a stimulus is another advantage: by repeatedly increasing and decreasing the light while measuring light intensity and pupil diameter, the degree of color change of the electrochromic gel in other environments is personalized for the wearer. The same applies to the degree of focal length change.
  • the ability to communicate with cars, mobile phones, and neural chips is a further advantage of this invention, which helps to drastically reduce the risks of fatigue, stroke, or loss of consciousness and keeps equipment performance within safe limits.
  • the top representation shows the top view
  • the middle schematic shows the front view
  • the bottom representation demonstrates the back view of the claimed multimodal glasses from the inner section.
  • Illustrating two different schematic representations: one relates to a lens with a curved back surface, shown on the right, and the other to a lens with a flat back surface, shown on the left.
  • This figure depicts a general perspective view of the implemented multimodal smart eyeglasses system from the lateral direction, in alignment with three coordinate axes, and is represented in an isometric shape with hidden lines removed.
  • the figure illustrates the complete and assembled view of the glasses, detailing the spatial configuration and arrangement of key integrated modules.
  • the scale of the drawn figure is 1 to 1.25 real scale, and all dimensions are represented in centimeters.
  • Referred components include the intelligent eyeglass system (1), the hard frame (2), the adaptive multilayer lens (3), the hollow shaft (4), the fixed hinge (5), the elastic string surrounding the lens (6), the micropump for electrochromic gel modulation (7), the movable hinge (8), the embedded battery (9), the ultraviolet radiation sensor (10), the visible light detection photocell (11), the charging port (12), and the structural arms (13).
  • This figure demonstrates three different schematic representations of the claimed multimodal eyewear from the lateral, frontal, and upper directions, aligned with the three coordinate axes, and shown in a 2D shape with hidden lines removed. These orthographic views collectively present a comprehensive spatial orientation of the structural and functional elements of the eyewear from external perspectives.
  • the scale of the drawn figures is 1 to 1.25 real scale, and all dimensions are given in centimeters.
  • Referred components include the screw assembly securing the internal modules (14), the ultrasonic actuator frame (15), the piezo oscillator (16), the integrated optical prisms (17), the three-point micro-camera (18), and the infrared transceiver unit (19).
  • This figure presents two different schematic views of the claimed multimodal smart eyewear, shown from the frontal and side directions, aligned with the X and Z axes, rendered in a 2D shape with hidden lines removed. These views illustrate the precise location and cross-sectional alignment of the schematic section conducted through the inner region of the eyewear frame.
  • the visual representations are intended to support the structural and functional interpretation of the interior arrangement of the lens assembly.
  • the scale of the drawn figures is 1 to 1 real scale, and the dimensions shown are in centimeters.
  • This figure presents two different schematic views of the claimed multimodal smart eyewear, shown from the frontal and lateral directions, aligned with the X and Z axes, rendered in a 2D shape with hidden lines removed.
  • This figure specifically illustrates a schematic cut passing directly through the lens, capturing the internal structural alignment and functional layering of the adaptive lens mechanism and electrochromic gel system. This contrasts with the preceding figure, which depicts a section through the inner area of the frame.
  • the scale of the drawn figures is 1 to 1 real scale, and the dimensions shown are in centimeters.
  • Referred components include the outer hard lens surface (20), the inner hard lens surface (21), and the electrochromic gel (22).
  • This figure represents two various sectional cuts of the claimed multimodal smart eyewear, shown from the frontal and lateral directions, aligned with the X and Z coordinate axes, in a 2D shape with hidden lines removed.
  • the schematic cuts reveal internal structural details within the lens area, particularly emphasizing the arrangement of the outer and inner hard lens surfaces and the electrochromic gel chamber situated between them.
  • the scale of the drawn figures is 1.25 to 1 real scale, and the dimensions shown are in centimeters.
  • This figure presents two various schematic representations of the lens design configuration of the claimed multimodal smart glasses, observed from the lateral direction, aligned with the X and Z axes of the coordinate system, and displayed in a 2D shape with hidden lines visible to reveal internal contours of the lens structure.
  • the left schematic illustrates the lens with a flat back surface, while the right schematic shows a lens with a curved (concave) back surface—both contributing to adjustable focal performance.
  • the scale of the drawn figures is 4 to 1 real scale, and the dimensions are provided in centimeters.
  • Referred component includes the concave inner lens layer (23).
  • This figure demonstrates two different schematics focusing on the internal lens architecture of the claimed multimodal smart eyewear viewed from the lateral direction, aligned with the X and Z axes of the coordinate system, and depicted in a 2D shape with hidden lines visible to showcase internal structural detail.
  • the inner part of the frame and the ultrasonic vibrator have been removed to enhance the visibility of the lens configuration and interlayer spacing.
  • This representation assists in understanding the placement and interaction of the electrochromic gel between lens layers.
  • the scale of the drawn figures is 4 to 1 real scale, and the dimensions are provided in centimeters.
  • This figure displays two different schematic representations of the claimed multimodal smart glasses, specifically illustrating the mechanism of light modulation through the lens structure. Both views are presented from the lateral direction, aligned with the X and Z axes of the coordinate system, in a 2D shape, with the upper schematic shown in hidden lines visible format and the lower schematic in hidden lines removed format to distinguish the functional states of the electrochromic gel.
  • the upper schematic demonstrates the operational mode where the electrochromic gel (22) is actively modulating light, thus reducing the intensity of incoming rays (24), while the lower schematic shows the passive mode, where light fully passes through the lens to the surface of the eye (25).
  • the scale of the figures is 1.5 to 1 real scale, and dimensions are in centimeters.
  • Referred components include the measured incoming light (24), the corrected transmitted light (25), the optic nerve (26), the natural eye lens (27), and the pupillary region or eyepiece (28).
  • This figure provides two different schematic representations of the claimed multimodal smart eyewear, illustrating the adaptive focal length modulation mechanism based on object distance. Both illustrations are presented from the lateral direction, aligned with the X and Z coordinate axes, in a 2D shape, with the upper schematic drawn in hidden lines visible format and the lower schematic in hidden lines removed format.
  • the upper image depicts a scenario in which the observed object (30) is located at a greater distance, prompting the lens to increase in thickness, thereby modifying the focal length so that a clear image is formed in the eye (32).
  • the lower image shows the system in a near-object scenario, where the lens thickness decreases, facilitating near vision without visual distortion.
  • This figure presents a detailed representation of the anatomical eye region and the influence of lens and eyeglass placement on the field of view and signal transmission. It is depicted from the lateral direction, aligned with the X and Z axes of the coordinate system, in a 2D shape, incorporating both hidden lines visible and hidden lines removed formats for enhanced structural clarity.
  • This figure illustrates how the upper eyelid and eyelashes can obstruct the upper portion of the frame, thereby justifying the strategic positioning of visual prisms and sensors in the lower section of the eyeglass frame.
  • the schematic offers insight into biological interaction with the optical components, which ensures unimpeded signal redirection, image collection, and light pathway management.
  • the drawing is rendered at a 3 to 1 real scale, and the dimensions are shown in centimeters.
  • This figure illustrates the optical path and image acquisition system of the claimed multimodal smart eyeglasses, with a focus on both the prism-based image redirection and the coherent light transmission and return mechanisms. It is depicted from the frontal direction, aligned with the X and Z axes of the coordinate system, in a 2D shape with hidden lines removed to clearly demonstrate internal optical routing.
  • the left portion of the figure shows the image reception paths from the installed prisms (33), while the right portion presents the coherent light emission and return paths (34) used for wave-based diagnostic imaging and spatial mapping.
  • This setup enables multipoint detection and accurate triangulation without requiring bulky sensor arrays.
  • the figure is drawn at a 3 to 1 real scale, and all dimensions are shown in centimeters. Referred components include image transmission paths (33) and return wave paths (34).
  • This figure depicts a detailed schematic of the optical/electronic processor circuit integrated within the claimed multimodal smart eyeglasses, viewed from the frontal direction, aligned with the X and Z axes of the coordinate system. It illustrates the internal architecture of the main processing module, encompassing both optical and electronic subsystems, which coordinate data from sensors, cameras, and external communication modules.
  • the schematic includes optical fiber ports, wireless communication interfaces, signal amplification elements, and power distribution components essential for the real-time operation of adaptive vision and health monitoring.
  • the figure is illustrated in a 2D shape with hidden lines removed for maximum clarity of the circuit layout. The drawing is rendered at a 3 to 1 real scale, and all dimensions are provided in centimeters.
  • Referred components include the main processor module (35), quantum microcontroller (36), wireless communication module (37), wireless interface port (38), optical fiber interface port (39), main signal output port (40), micro-camera connector port (41), photocell and UV sensor connector port (42), battery and power port (43), optical fiber connectors (44), external operational amplifier (45), ultrasonic vibrator output connector (46), photocell and UV sensor connector (47), and micro-camera connector (48).
  • This figure demonstrates two schematics of various applications for the claimed multimodal smart eyewear, from lateral and upper perspectives, aligned with the three axes of the coordinate system, represented in a 3D shape, and shown in a combination of hidden lines visible and hidden lines removed formats.
  • the scale of the drawn figures is 1 to 3 real scale, and the dimensions shown are in centimeters.
  • Referred components include the connected vehicle (49), the transmitted wave to the vehicle (50), the brain chip or connected neuro-interface (51), the vehicle communication dongle (52), the vehicle’s ECU or processor (53), and the returned waves from the brain chip (54).
  • This figure illustrates two different schematic representations of the claimed multimodal smart eyewear’s ultrasonic imaging functionality, both from a lateral point of view, aligned with the x and z axes of the coordinate system, and depicted in a 2D shape with hidden lines removed.
  • the upper schematic, drawn at a 5 to 1 real scale, demonstrates the transmission of ultrasonic waves (55) and the corresponding ultrasonic return waves (56) after interaction with ocular tissue.
  • the lower schematic, drawn at a 7 to 1 real scale, represents the penetration depth of various ultrasonic wavelengths within the eye structure, showing long wavelength ultrasound for deep tissue penetration (57), medium wavelength ultrasound for sub-plane scanning (58), and short wavelength ultrasound for surface scanning (59).
  • Referred components include transmitted ultrasonic waves (55), ultrasonic return waves (56), long wavelength ultrasound penetrating deep (57), medium wavelength ultrasound for sub-plane scanning (58), and short wavelength ultrasound for surface scanning (59).
  • This figure illustrates a schematic representation of the gyroscope-based emergency analysis and response algorithm embedded within the claimed multimodal smart eyewear.
  • the diagram outlines the decision-making logic triggered by anomalous gyroscopic acceleration data, including continuous monitoring, user response verification, and escalation to emergency protocols if no feedback is received.
  • the reported parameters are acceleration data monitoring (A), maximum allowable acceleration threshold (D), response timeout logic, and escalation pathway to mobile communication.
  • This figure presents a detailed schematic representation of the quantum processor circuit integrated into the claimed multimodal smart eyewear.
  • the illustration is rendered from a lateral point of view, in alignment with the x and z axes of the coordinate system, and shown in a 2D shape with hidden lines removed.
  • the schematic includes all critical electrical pathways and protective mechanisms enabling the processor’s hybrid optical-electronic functionality.
  • Referred components include: current gain-increasing transistors (60), impedance-matching resistors (61), voltage stabilizer (62), LED emitters (63), phototransistors (64), current-limiting resistors (65), quantum microcontroller processor (66), and the gyroscope sensor system (67).
  • one of the critical applications includes real-time monitoring of the pupil diameter and microvascular changes in the sclera and retina. For instance, during a syncopal event or pre-stroke condition, the quantum processor can analyze irregular pupillary behavior or hemodynamic inconsistencies in ocular blood vessels using reflected infrared and ultrasonic signals. As an example, during fatigue-induced ocular drift or hypotensive events, the system alerts based on deviation from baseline waveform frequencies captured through ultrasonic scanning; a minimal deviation-check sketch follows.
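One plausible form of such a deviation check is a running z-score against a personal baseline; the window length, warm-up count, and threshold below are assumptions, not values from this description:

```python
from collections import deque
from statistics import mean, stdev

WINDOW = 300     # recent samples modeling the personal baseline (assumed)
WARM_UP = 30     # minimum history before anomalies are judged (assumed)
Z_ALERT = 3.0    # deviation, in standard deviations, that raises an alert

baseline: deque[float] = deque(maxlen=WINDOW)

def check_pupil_sample(diameter_mm: float) -> bool:
    """Return True when a sample deviates anomalously from the baseline."""
    if len(baseline) >= WARM_UP:
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(diameter_mm - mu) / sigma > Z_ALERT:
            return True          # anomaly: do not fold it into the baseline
    baseline.append(diameter_mm)
    return False
```

Excluding anomalous samples from the rolling window keeps a transient event (e.g., a syncopal episode) from contaminating the learned baseline.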
  • controlling focal length is achieved by deforming the lens structure using the electrochromic gel, activated via internal micropumps.
  • This mechanism reacts to ambient lighting conditions, as well as ocular feedback (e.g., eyelid squinting under bright conditions). For instance, in environments with alternating light patterns, such as welding facilities or surgical operating rooms, the system responds within milliseconds to prevent retinal strain or overexposure, dynamically modifying lens curvature and opacity. A worked thin-lens example is given below.
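The curvature-to-focus relationship can be illustrated with the standard thin-lens (lensmaker's) equation; the refractive index and radii below are illustrative, not values from this description:

```python
def focal_length_m(n: float, r1_m: float, r2_m: float) -> float:
    """Thin-lens focal length from surface curvatures: 1/f = (n-1)(1/R1 - 1/R2).

    Pumping gel in or out changes R1 and R2, which is how curvature
    modulation translates into focal power.
    """
    return 1.0 / ((n - 1.0) * (1.0 / r1_m - 1.0 / r2_m))

# Illustrative: n = 1.45 gel; flattening the back surface from R2 = -0.5 m
# to R2 = -1.0 m lengthens the focal distance.
print(f"{focal_length_m(1.45, 0.2, -0.5):.3f} m")  # ~0.317 m
print(f"{focal_length_m(1.45, 0.2, -1.0):.3f} m")  # ~0.370 m
```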
  • the infrared penetration depth will be between 1.2 and 3 micrometers. Besides that, the radiation time is between 0.25 and 10 seconds; crucially, radiation longer than 3 seconds is not needed, which allows power to be increased without damaging the eyes. Also, in the manufacture of these glasses, a very short-distance ptychography algorithm should be considered: at large distances, such as to planets, distance and shape are determined by measuring the return speed, but in eye scanning, the shape and depth of an object are measured by sending short pulses and analyzing the change in the return phase. In the ultrasonic scanning method, the highest ultrasound frequency will be required, and piezoelectric sensors in the 5 to 7 MHz range—as in an echography device—will perform the task.
  • the embedded three-point micro-cameras and embedded infrared emitters work together with the prism system to reconstruct high-resolution 3D models of the internal eye structure.
  • early-stage retinal detachment can be detected through minor displacements visualized using interferometric fringe pattern variations, analyzed with Fourier transform models and ptychographic phase reconstruction.
  • This technique mimics satellite-based topography used in telescopic imaging but adapted to millimeter and micrometer scales in biomedical contexts.
  • the Audio & Ultrasonic Techniques integrated in the glasses offer multipurpose functionality. Firstly, they provide internal ocular layer scanning using varying-frequency ultrasonic signals. Secondly, external environmental ultrasonic mapping (via echolocation) helps detect surrounding objects and alert visually impaired individuals. For example, while walking through a crowded or unfamiliar environment, the glasses emit low-amplitude ultrasonic pulses, and their return is interpreted to construct a proximity map, alerting the user with gentle haptic or auditory cues (see the echo-ranging sketch below). Similarly, in case of a suspected fall (e.g., a sudden gyroscopic spike), the onboard piezoelectric sensors verify impact angle and force through bone-conduction echo variance.
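The echolocation distance estimate itself is simple time-of-flight arithmetic; this sketch assumes sound in air at roughly 343 m/s:

```python
SPEED_OF_SOUND_AIR_M_S = 343.0  # at about 20 degrees C

def echo_distance_m(round_trip_s: float) -> float:
    """Distance to an obstacle from an ultrasonic echo's round-trip time."""
    return SPEED_OF_SOUND_AIR_M_S * round_trip_s / 2.0

# e.g., a 12 ms echo implies an obstacle roughly 2 m away
print(f"{echo_distance_m(0.012):.2f} m")  # 2.06 m
```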
  • a quantum-class microcontroller ensures computational processing with low latency and minimal power draw.
  • Advanced cooling and energy optimization allow the device to operate under heavy computational load, such as real-time pattern recognition for emergency diagnostics or fatigue evaluation. For instance, this includes activating fallback protocols when overcurrent is detected, such as redirecting power loads via alternate circuit paths embedded in the frame arms, and engaging current limiting resistors around sensitive transistors.
  • the smart glasses utilize a dedicated emergency algorithmic protocol embedded within the processor’s firmware.
  • In case of potential stroke or blackout, based on data from the gyroscope and micro-camera, the device immediately sends alerts to a registered smartphone or connected car interface (via Bluetooth or optical communication).
  • a practical example is when the driver exhibits ocular fatigue patterns and delayed blink response; the glasses can command the car’s ECU to reduce speed, issue auditory alerts, or automatically signal emergency lights, facilitating a safe stop.
  • the smart glasses feature adaptive personalization through continuous real-time machine learning algorithms. For example, the system learns and models the pupil’s adaptive behavior over days or weeks across different lighting conditions and emotional states. Using predictive modeling and feedback-based adjustments, it becomes possible to pre-adjust lens thickness and tint based on historical data and live contextual cues, such as time of day, location (e.g., indoors vs. outdoors), or the user’s routine. AI modules also track eye accommodation patterns, differentiating between voluntary squinting and pathological myopia progression, enabling early-stage correction or optometric referrals. A toy online-learning sketch of this personalization follows.
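As a toy rendering of the personalization idea, the following online model nudges a predicted tint level toward the preference implied by the user's pupil response; the feature choice, update rule, and learning rate are all assumptions:

```python
import math

class TintModel:
    """Toy online model: predict preferred tint (0..1) from ambient lux."""

    def __init__(self, lr: float = 1e-3):
        self.w, self.b, self.lr = 0.0, 0.5, lr

    def predict(self, lux: float) -> float:
        t = self.w * math.log1p(lux) + self.b
        return min(max(t, 0.0), 1.0)

    def update(self, lux: float, preferred_tint: float) -> None:
        """One gradient step toward the tint the pupil response implies."""
        x = math.log1p(lux)
        err = (self.w * x + self.b) - preferred_tint
        self.w -= self.lr * err * x
        self.b -= self.lr * err

model = TintModel()
for _ in range(2000):                      # repeated bright-light exposures
    model.update(lux=50_000.0, preferred_tint=0.9)
print(round(model.predict(50_000.0), 2))   # drifts toward the learned 0.9
```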
  • the applications of this invention fall into several cases. People with difficulty in image accommodation, such as presbyopia, can overcome their problem by using these glasses. People at risk of loss of consciousness, such as drivers, can also use these glasses and connect them to the car so that fatigue and drowsiness are reported to the car's electronic processors, and corrections or restrictions are applied to the driving style. These glasses are also useful for people who are exposed to high acceleration and possible loss of consciousness, such as pilots, or who need to change viewing distance quickly. Another application is for people at risk of exposure to high ultraviolet radiation, such as hunters or those in excessively bright environments. People with a history of certain eye diseases can also wear these glasses for a short time to record data, so that the problem can be better diagnosed by analyzing the recorded data.
  • these glasses are used for military personnel or guards or nature hikers who need near and far vision without using a camera and in environments with different light intensities.
  • another embodiment of the claimed invention's applications in the field of monitoring and diagnosis is the possibility of detecting various sudden motor, balance, and hemodynamic disorders such as stroke, ocular stroke, seizure, and fainting.

Abstract

This invention relates to a pair of intelligent, multimodal smart eyeglasses designed for adaptive vision, biometric ocular monitoring, and predictive emergency response. The system integrates multilayer lenses filled with electrochromic gel and surrounded by elastic structures to enable dynamic focal length modulation in response to user visual behavior. An array of sensors—including visible light, ultraviolet, micro-cameras, infrared emitters, gyroscopes, and ultrasonic transceivers—collects multispectral data to evaluate environmental and physiological parameters. Real-time data processing is performed by an embedded quantum microcontroller, which manages light transmittance, detects eye fatigue, maps ocular surfaces, and identifies hemodynamic abnormalities such as fainting, seizure, or ocular stroke. Advanced imaging techniques, such as ultrasonic ptychography and wave diffraction analysis, enable non-invasive 3D mapping of ocular layers and internal eye motion. The system also supports wireless communication with external platforms including smartphones, vehicles, and brain-machine interfaces, enabling proactive safety interventions. These features work in tandem to support early diagnosis of critical conditions, protect vision, and personalize user experiences through machine learning and adaptive feedback mechanisms.

Description

Multimodal Smart Eyeglasses for Adaptive Vision, Predictive Ocular and Hemodynamic Monitoring, and Emergency Response
This invention discloses a multimodal smart eyewear system comprising a pair of multilayer adaptive lenses with electrochromic gel, a surrounding elastic structure for dynamic shape modulation, integrated visible and ultraviolet light sensors, and a unified electronic system including a quantum microprocessor, micro-pump, micro-cameras, infrared transceivers, ultrasonic modules, and prism-based reflectors.
The eyewear is configured to dynamically adjust its focal length and light transmittance in real time, based on ambient illumination, user-specific pupillary responses, and object distance, without requiring mechanical translation. Ocular surface imaging, fatigue detection, and predictive analysis of hemodynamic and balance-related abnormalities are enabled through infrared tracking, biometric learning, and high-frequency wave-based scanning, including ultrasonic ptychography. Emergency detection is further supported by integrated gyroscopic sensors that initiate alert protocols upon detecting sudden acceleration or loss of consciousness, communicating wirelessly with smartphones, vehicles, or neural chips for immediate intervention.
The processor operates through parallel optical and electrical channels, allowing rapid control of visual parameters and environmental adaptation. A brain-machine interface enables two-way neural communication, while the processor maintains override logic to ensure visual safety in contradiction scenarios. Altogether, the invention integrates adaptive optics, real-time health diagnostics, emergency response, and intelligent interfacing into a wearable platform for continuous vision enhancement and medical protection.
This claimed invention can be searched through International Patent Classification (IPC) codes and Cooperative Patent Classification (CPC) codes A61B3/00, A61B3/103, A61B3/16, A61B5/00, A61B5/0075, A61B5/0205, A61B5/02427, A61B5/1116, A61B5/1117, A61B5/4836, A61B5/6821, A61F9/00, A61H39/00, G02B21/0032, G02B26/0808, G02B2027/014, G02B2027/0187, G02B27/00, G02B27/0075, G02B27/01, G02B27/0093, G08B21/0492, G02C7/101, G02C13/005, G06F3/012, G06F3/013, G06V10/44, G06V40/166, H02J7/00 and H04N23 in search engines and international online databases.
By searching keywords such as "Intelligent Eyeglasses", "Responsive Glasses", "Detecting Eyeglasses", "Health AND Glasses", "Hemodynamic AND Glasses", "Monitoring Eyeglasses", "Eye AND Gyroscope", and "Intelligent Monitoring Glasses" in international patent databases such as Google Patents, Patent Scope, and Lens, similar patent documents and declarations were obtained as follows.
In patent No. US10231614B2, under the title of "Systems and Methods for Using Virtual Reality, Augmented Reality, and/or a Synthetic 3-Dimensional Information for the Measurement of Human Ocular Performance," which was registered on 2017-09-22, a diagnostic eyewear system is introduced for assessing eye movement functions in immersive environments. The system includes a head-mounted display, an eye sensor (video camera), and a head orientation sensor to measure saccades, vergence, eyelid closure, and gaze tracking. It presents virtual, augmented, or 3D synthetic content to stimulate ocular response. Using Fourier transform analysis, it generates vertical and horizontal gain signals for clinical interpretation of eye behavior. This allows non-invasive evaluation of ocular performance in both medical and research applications.
In patent No. US11294462B2, under the title of "Multimodal Eye Tracking," which was registered on 2021-04-23, a dual-sensor method is introduced for precise, real-time tracking of eye movement using multiple sampling frequencies. The disclosed system receives position data from two distinct sensors: a low-frequency sensor capturing absolute eye position and a high-frequency sensor detecting incremental (delta) movement. By combining these data streams, the system calculates a highly accurate third eye position at any moment. This fusion method enhances spatial and temporal resolution beyond what a single sensor could achieve. The system then outputs the computed position as a signal for use in applications such as gaze tracking, augmented reality, and neurological assessment.
In patent No. US9223134B2, under the title of "Optical Imperfections in a Light Transmissive Illumination System for See-Through Near-Eye Display Glasses," which was registered on 2012-03-25, an interactive head-mounted display system is disclosed that integrates an image source, processor, and LED lighting system within a light-transmissive optical assembly. The invention uses controlled optical imperfections within the illumination system to scatter LED light uniformly across a reflective image display. This uniformly irradiated display produces visual content that is reflected back through the optical system, enabling the user to simultaneously view both the digital content and the surrounding environment. This design enhances display brightness and image clarity in compact, see-through wearable devices.
In patent No. US12141351B2, under the title of "Eye Gesture Tracking," which was registered on 2023-07-21, a system is disclosed for determining eye gaze direction and focus using depth-resolved optical signals. The invention integrates a photodetector and modulated optical source into a display device to illuminate the user’s eye and detect reflected signals. By analyzing phase differences between the reflected optical signals and a reference signal, the system generates a depth map of the eye, enabling accurate recognition of eye gestures and gaze direction. A machine-learning module interprets this data to dynamically control visual content on the display. The system’s optical emitter operates at radio-wave or microwave frequencies, allowing real-time, high-resolution, non-invasive gaze tracking embedded within display hardware.
In patent No. EP3271776B1, under the title of "Methods and Systems for Diagnosing and Treating Health Ailments," which was registered on 2016-03-16, a user-wearable diagnostic health system is disclosed that integrates health analysis functions into an augmented reality (AR) eyewear device. The system includes a frame-mounted AR display composed of a stack of waveguides, each projecting images with varying divergence levels while allowing real-world transparency. A light detector integrated into the frame captures light reflected from the user's eye, and a processor analyzes these signals to detect ocular abnormalities, including intraocular pressure, fatigue, blinking patterns, eye vergence, and depth of focus. The device may include a fiber-scanning display, multi-wavelength light sources, and adaptive imaging based on user-specific eye features, enabling real-time eye health diagnostics.
In patent No. CN113709410B, under the title of "Method, System and Equipment for Enhancing Human Eye Vision Ability Based on MR Glasses," which was registered on 2020-05-21, a vision-enhancing system is introduced utilizing mixed reality (MR) glasses that support telescopic imaging based on user need. The system captures distant scenes using a long-range camera, enhances or optimizes the focused region through AI processing, and displays the magnified image to the user. To reduce weight and improve form factor, the front-facing optical structure incorporates waveguide, light guide fiber, or periscope designs. Eye and head tracking systems determine the focal area and adjust magnification through gaze or gesture inputs. The system previews the selected region in real-time, allowing for enhanced distant vision beyond natural capabilities.
In patent No. US9129295B2, under the title of "See-Through Near-Eye Display Glasses with a Fast Response Photochromic Film System for Quick Transition from Dark to Clear," which was registered on 2012-03-26, a smart eyewear system is disclosed that integrates a fast-response photochromic layer and a heater layer into a see-through optical display. The system uses the heater to accelerate the transition of the photochromic film from dark to clear states, allowing rapid light adaptation. An integrated processor controls this function based on data from various sensors—including a camera, accelerometer, microphone, ambient light sensor, and eye gaze detection system. These inputs enable the system to respond to gestures, head movements, ambient light changes, or gaze direction, offering intelligent modulation of lens transparency for both user comfort and situational awareness.
In patent No. US8488246B2, under the title of "See-through Near-Eye Display Glasses Including a Curved Polarizing Film in the Image Source, a Partially Reflective, Partially Transmitting Optical Element and an Optically Flat Film," which was registered on 2012-03-26, an interactive head-mounted eyepiece system is disclosed for overlaying digital content onto a user's natural field of view. The system includes an integrated image source and processor for handling visual content, which is introduced into an optical assembly containing a curved polarizing film and a reflective image display. A light source directs illumination toward the curved film, which reflects light to the image display. The reflected image is then passed through an optically flat film and a partially reflective curved mirror, allowing both the displayed and real-world scenes to merge into a unified visual output for the user.
In patent No. US9182596B2, under the title of "See-Through Near-Eye Display Glasses with the Optical Assembly Including Absorptive Polarizers or Anti-Reflective Coatings to Reduce Stray Light," which was registered on 2012-03-26, a head-mounted display system is disclosed that enhances visual clarity by minimizing stray and ambient light interference. The system features a partially transparent curved mirror and a polarizing beam splitter that together guide virtual content toward the user’s eye while still allowing a view of the surrounding environment. To reduce optical noise, the assembly includes absorptive polarizers and anti-reflective coatings applied to various components, such as polarizers, retarding films, and outer surfaces of mirrors. This design improves contrast, minimizes reflection artifacts, and supports a clear augmented reality experience under varied lighting conditions.
In patent No. US8482859B2, under the title of "See-through Near-Eye Display Glasses Wherein Image Light is Transmitted to and Reflected from an Optically Flat Film," which was registered on 2012-03-26, an interactive head-mounted eyepiece is introduced for combining digital and real-world visuals through an optical assembly. The system includes an integrated image source and processor for delivering content to the user’s eye. The optical assembly features an optically flat film positioned at an angle in front of the eye, which partially reflects image light from the display while simultaneously allowing ambient scene light to pass through. This results in a composite image formed by both reflected display content and the transmitted real-world view, enabling a seamless augmented reality experience.
In patent No. US8477425B2, under the title of "See-through Near-Eye Display Glasses Including a Partially Reflective, Partially Transmitting Optical Element," which was registered on 2012-03-25, an advanced head-mounted eyepiece is introduced for delivering mixed-reality visual content. The system features an integrated image source that emits light toward a single curved polarizing film, which reflects part of the light to a reflective image display. The optical assembly includes a partially reflective, partially transmitting optical element—such as a beam splitter or curved mirror—that merges the reflected image light with ambient scene light. The result is a combined visual output that overlays digital content onto the user’s real-world view. The design enables efficient light routing, polarization management, and ambient brightness adaptation, offering enhanced performance for augmented reality applications.
In patent No. US9229227B2, under the title of "See-through Near-Eye Display Glasses with a Light Transmissive Wedge-Shaped Illumination System," which was registered on 2012-03-25, a head-mounted display system is introduced that utilizes a wedge-shaped optical illumination design.
The system integrates an image source and processor within an eyepiece, where an LED lighting system is positioned along the edge of a wedge-shaped light guide. The angled geometry of the wedge redirects light to uniformly illuminate a reflective image display. This display reflects the generated image through the light guide and optical assembly, allowing the user to view both digital content and the real environment simultaneously. The configuration enhances brightness distribution and optical clarity for augmented reality applications.
In patent No. US9097891B2, under the title of "See-through Near-Eye Display Glasses Including an Auto-Brightness Control for the Display Brightness Based on the Brightness in the Environment," registered on 2012-03-26, a dynamic see-through eyewear system is presented with intelligent brightness modulation. The invention comprises an interactive head-mounted display with a see-through optical assembly, an integrated image source, and a processor capable of modifying brightness based on environmental light. A key feature is the auto-brightness control, which adjusts the brightness of specific areas of the display independently, using components like electrochromic materials or liquid crystal devices. This system ensures adaptive visibility in changing lighting conditions and enhances user comfort by aligning display brightness with natural eye adaptation.
In patent No. US10579141B2, under the title of "Dynamic Calibration Methods for Eye Tracking Systems of Wearable Heads-Up Displays," which was registered on 2018-07-16, a system is disclosed for continuously refining the accuracy of eye tracking in wearable heads-up displays (WHUDs). The invention utilizes a calibration point model composed of multiple gaze targets, which is dynamically adjusted based on the user’s real-time interaction with user interface (UI) elements. These UI elements are strategically designed to support seamless, in-use calibration, ensuring ongoing gaze precision without disrupting user experience. This approach enables adaptive eye tracking for enhanced visual responsiveness and display interaction in WHUDs.
In patent No. US11389059B2, under the title of "Ocular-Performance-Based Head Impact Measurement Using a Faceguard," which was registered on 2020-02-28, a faceguard system is disclosed for evaluating eye muscle responses to head impacts. The invention integrates an eye sensor, such as a video camera, and a head orientation sensor into a faceguard with an open visual aperture. The system measures eyeball movement, pupil size, and eyelid behavior, alongside pitch and yaw of the head. An electronic circuit processes data from both sensors to assess ocular performance in response to motion or impact, enabling real-time monitoring of neurological or motor disturbances. This has applications in sports safety and concussion detection.
In patent No. US9788714B2, under the title of "Systems and Methods Using Virtual Reality or Augmented Reality Environments for the Measurement and/or Improvement of Human Vestibulo-Ocular Performance," which was registered on 2016-05-23, a wearable system is introduced for assessing and enhancing vestibulo-ocular functions. The system integrates a video-based eye orientation sensor, a head orientation sensor, a display, and an electronic circuit connecting these components. Operating within 0.01 to 15 Hz, it applies Fourier transform analysis to evaluate vestibulo-ocular reflex, retinal stability, and dynamic visual acuity. Designed for portability and non-clinical settings, this self-contained, head-mounted device allows users to monitor and improve eye-head coordination through immersive VR/AR environments.
In patent No. CN111897435B, under the title of "Man-Machine Identification Method, Identification System, MR Intelligent Glasses and Application," which was registered on 2020-08-06, a mixed reality (MR)-based system is introduced for distinguishing between human users and automated systems. The method utilizes MR glasses to present holographic questions for the user to interact with via eye movement, head tracking, gesture input, or 6DoF controllers. The user's interactive behavior is monitored in real time. If the task is completed correctly, the user is verified as human. Alternatively, the system may compare collected behavior data with a pretrained human-machine recognition model to compute a human probability score, enabling advanced man-machine differentiation.
In patent No. US8814691B2, under the title of "System and Method for Social Networking Gaming with an Augmented Reality," which was registered on 2011-03-16, a head-mounted interactive system is introduced to enable augmented reality-based multiplayer gaming.
The disclosed system includes an optical assembly for displaying virtual content overlaid on real-world environments, an integrated processor, a mounted camera for gesture recognition, and a wireless communication module that connects users via an online gaming platform. The system can interpret player gestures and relay gesture-related data between multiple users wearing similar devices. It also supports body-mounted controllers, motion sensors, and smartphone integration to enrich gameplay. The interactive display supports 3D visuals, promoting immersive multiplayer experiences across social and gaming networks.
In patent No. US9370302B2, under the title of "System and Method for the Measurement of Vestibulo-Ocular Reflex to Improve Human Performance in an Occupational Environment," which was registered on 2014-07-08, a portable ocular reflex measuring device is disclosed for evaluating eye movement responses to head motion in real-world settings. The system includes an eye orientation sensor (e.g., image detector, magnetic sensor) and a head orientation sensor (e.g., gyroscope, magnetometer, accelerometer) to track pitch and yaw between 0.01 Hz and 15 Hz. A central processing unit compares data from both sensors, applies Fourier transforms, and calculates vertical/horizontal gain and phase to assess vestibulo-ocular reflex and related visual stability metrics. The invention is optimized for dynamic occupational environments and enhances real-time visual performance monitoring.
In patent No. US9164588B1, under the title of "Wearable Computing Device with Gesture Recognition," which was registered on 2013-02-05, a gesture-recognition system is introduced for wearable computing devices (WCDs) that utilize motion sensors to detect head movements. The system employs accelerometer-based level-indication data to determine when the user's head is in a neutral, level position. Upon establishing this baseline, the device monitors for a "look-up" gesture, defined by an upward head tilt. Once detected, the system issues a gesture-recognition trigger, enabling hands-free control and context-aware responses in wearable applications. This motion-based input expands the user interface of smart glasses and head-mounted devices by integrating intuitive head gestures.
In patent No. US12128003B2, under the title of "Methods and Systems for Diagnosis and Treatment of a Defined Condition, and Methods for Operating Such Systems," which was registered on 2022-12-13, a novel medical approach is introduced that uses visual correction zones to treat specific neurological or physiological conditions. The disclosed system comprises a wearable device—typically eyewear—equipped with at least one lens and one or more correcting elements. These correcting elements are strategically placed within defined correction zones in the wearer’s field of view, based on angular coordinates measured from the optical center of the lens. The zones are chosen according to the diagnosed condition, such as dizziness or spatial disorientation, and are customized for the left or right eye using a polar coordinate framework. By overlaying optical modifications (e.g., filters, textures, or patterns) in these zones, the system is designed to therapeutically modulate sensory perception, offering symptom relief through visual correction. This innovation provides a non-invasive, adaptive, and patient-specific method for managing sensory and neuro-vestibular disorders.
In patent No. JP7436059B2, under the title of "Eyeglasses and Optical Elements," which was registered on 2022-02-03, a dynamic vision correction system is disclosed that uses an advanced liquid crystal-based optical element responsive to eye movement. The invention integrates a liquid crystal layer, a plurality of unit electrodes (each comprising a first and second electrode), and resistive layers strategically placed between these electrodes. These resistive layers have an intermediate electrical resistivity—greater than that of the electrodes but lower than an electrical insulator—to allow precise control of light refraction. A control unit forms a potential gradient across the liquid crystal layer by applying a voltage to the unit electrodes, enabling tunable optical power. An eye detection unit tracks the wearer’s gaze direction in real-time. When the gaze moves downward or inward relative to a set reference, the optical power is automatically increased toward the positive side, effectively adjusting lens strength dynamically. This invention enables personalized, gaze-responsive lens adaptation for enhanced visual correction in various viewing conditions.
In patent No. US10534182B2, under the title of "Optical Splitter for Integrated Eye Tracking and Scanning Laser Projection in Wearable Heads-Up Displays," which was registered on 2018-09-10, an advanced wearable heads-up display (WHUD) system is disclosed that integrates eye-tracking functionality with scanning laser projection (SLP). This innovation enables simultaneous visual content projection and real-time eye tracking within a compact optical framework. The system features both visible light sources (for RGB projection) and an infrared light source, with a scan mirror directing both toward the user’s field of view. An optical splitter is employed to transmit the infrared light unaffected while splitting visible light into angle-separated beams. These light streams are then guided through a transparent combiner, which redirects them toward the user’s eye. An infrared detector captures reflected IR light from the user's eye, enabling accurate gaze tracking. Additionally, a holographic optical element (HOE) may be used to apply different optical powers to visible and infrared beams, enhancing image clarity while preserving tracking accuracy. This design enables seamless integration of display and tracking within augmented and mixed reality wearables, enhancing both user experience and interactivity.
In patent No. US10799122B2, under the title of "Utilizing Correlations Between PPG Signals and iPPG Signals to Improve Detection of Physiological Responses," which was registered on 2019-11-20, a multimodal system is introduced for accurately detecting a variety of physiological responses using complementary photoplethysmography techniques. The invention utilizes a head-mounted contact PPG sensor that captures traditional photoplethysmogram (PPG) signals from exposed skin regions on the user's head. Simultaneously, a camera positioned at a distance (more than 10 mm from the head) captures visual data to extract imaging photoplethysmogram (iPPG) signals from another region of the head. A computer then processes both data streams and analyzes the correlation between the PPG and iPPG signals. By combining these synchronized modalities, the system can detect subtle physiological responses—such as allergic reactions, migraines, strokes, stress responses, emotional changes, and blood pressure variations—with improved reliability. This approach enhances real-time health monitoring by cross-validating optical biosignals, and is particularly valuable in wearable health tech and telemedicine applications.
In patent No. US11045092B2, under the title of "Apparatus and Method for Measuring Biologic Parameters," which was registered on 2018-07-19, a system is introduced that utilizes specialized support structures for positioning sensors over physiologic tunnels to continuously measure various physical, chemical, and biological parameters of the human body. The invention employs support elements such as patches, clips, eyeglasses, and head-mounted gear to stabilize both passive and active sensors at precise anatomical zones—particularly where the body’s physiological signals can be optimally accessed. These tunnels include locations where key circulatory, neurological, and metabolic data can be captured with minimal interference. The system supports real-time wireless transmission of physiological signals—via radio waves, infrared, electromagnetic, or acoustic communication—to a remote processing station or to local output devices (e.g., visual/audio alerts). Parameters monitored include brain activity, hydration, metabolic status, chemical compound levels in blood, and hydrodynamic conditions, enabling proactive clinical and therapeutic responses.
In patent No. US20210186329A1, under the title of "Mesh Network Personal Emergency Response Appliance," which was registered on 2021-02-17, a system is disclosed for remotely monitoring a person's cardiac function using a wearable cardiac monitor that is wirelessly connected to an internet-enabled computer. The method includes equipping a user with the wearable device, which continuously gathers physiological data over a defined monitoring period—such as during work, sleep, or exercise. This data is then forwarded to a cloud-based healthcare monitoring center, where specialized software analyzes the signals to generate statistical health reports, which are shared with healthcare professionals. The invention also allows for emergency intervention, as the cardiac monitor can send immediate alerts if critical conditions are detected, either automatically or when the user manually presses a distress button. Alerts may be routed to healthcare providers, emergency responders (911), relatives, or the user’s physician, enhancing real-time responsiveness and personalized medical care.
In patent No. US11428960B2, under the title of "Method and System to Create Custom, User-Specific Eyewear," which was registered on 2020-05-22, a system is presented for designing and manufacturing personalized eyewear by integrating anatomical data capture, modeling, and parametric customization. The method involves using a computer system paired with an image capture device to obtain detailed anatomical images of an individual’s face. From these images, a 3D anatomical model is constructed. A parametric model of eyewear, containing configurable and manually-adjustable portions (e.g., nose bridge, temple length), is then superimposed over the anatomical model on a display. Users or automated systems adjust the eyewear model within predefined physical constraints, allowing real-time visualization of custom fit and appearance. The finalized design is then sent to a manufacturer for fabrication. This technology allows the production of fully custom-fit eyewear, improving comfort, aesthetic alignment, and performance without relying on standard, off-the-shelf components.
In patent No. AU2018203480B2, under the title of "Eyewear with Outriggers," which was registered on 2018-05-17, an advanced modular eyewear system is introduced, particularly suited for goggles, featuring an innovative dual-module design for improved comfort and functionality. The invention comprises an anterior module (e.g., a lens frame) designed to hold one or more lenses within the user’s field of view, and a posterior module (e.g., a faceplate) engineered to conform to the contour of the wearer’s face. These two modules are selectively interchangeable, allowing users to adapt the eyewear's physical characteristics—such as fit, function, or aesthetics—based on application or preference. A suspension assembly may connect the two modules, enabling articulation between them. This mechanism ensures even force distribution across the user's face, enhancing comfort during prolonged wear. The design may also incorporate lens interchange systems, such as roll-off or tear-off mechanisms, to support quick and clean vision adjustments in demanding environments like motorsports or industrial use.
In patent No. CN114730101B, under the title of "System and Method for Adjusting Inventory Eyeglass Frames Using 3D Scanning of Facial Features," which was registered on 2020-09-24, a comprehensive method is disclosed for customizing standard eyeglass frames to fit individual users by utilizing 3D facial scanning and digital modeling technologies. The system begins by acquiring a 3D scan or image of a user’s face, followed by extraction of anatomical measurements. These facial dimensions are then compared with 3D CAD files of existing eyeglass frames in inventory to calculate a set of personalized fitting parameters. Based on this analysis, the system identifies a subset of eyeglass frames that satisfy both aesthetic and functional fit criteria, including optical constraints and adjustability limits. The selected frame undergoes 3D model adjustments to better conform to the user’s unique facial contours. A rendered preview is displayed over the scanned facial model, providing a visual fitting simulation. Finally, the system generates adjustment instructions, including annotated visuals, for physically modifying a real frame to match the digitally customized model. The platform also enables anonymous aggregation of fitting data across users, enhancing frame design and inventory optimization strategies for future production.
In patent No. US10684458B2, under the title of "Correcting for Aberrations in Incoherent Imaging Systems Using Fourier Ptychographic Techniques," which was registered on 2016-03-11, a hybrid imaging method is disclosed that enhances the resolution and accuracy of fluorescence images by correcting optical aberrations using coherent imaging data. The system integrates Fourier ptychography with an embedded pupil function recovery process to simultaneously reconstruct a high-resolution image and estimate the pupil function of the imaging optics. Initially, the system acquires a sequence of coherent images of a specimen under spatially coherent plane wave illumination from varying angles. These images are used to estimate pupil function and build a super-resolved coherent image. Subsequently, the specimen is illuminated with fluorescence excitation light, and an incoherent fluorescence image is captured. Using the previously estimated pupil function, an optical transfer function (OTF) is derived. The OTF is applied in a non-blind deconvolution process to correct aberrations in the incoherent fluorescence image, generating a high-fidelity, aberration-corrected monochromatic output. This dual-mode approach enhances both resolution and accuracy, especially in biological imaging applications.
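The final deconvolution step can be illustrated with a minimal Wiener-style sketch, assuming the optical transfer function has already been derived from the recovered pupil function and matches the image dimensions; the regularization constant k is a placeholder, not a value from the patent:

    import numpy as np

    def wiener_deconvolve(image, otf, k=1e-3):
        # Non-blind deconvolution in the Fourier domain: the OTF estimated
        # from the pupil function is divided out, with a small constant k
        # regularizing against noise amplification.
        img_f = np.fft.fft2(image)
        restored_f = img_f * np.conj(otf) / (np.abs(otf) ** 2 + k)
        return np.real(np.fft.ifft2(restored_f))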
In patent No. US10275024B1, under the title of "Light Management for Image and Data Control," which was registered on 2017-10-16, a device and method are introduced for enhancing vision in individuals with low vision conditions, such as age-related macular degeneration (AMD), by dynamically modifying the presentation of visual data based on eye-tracking and visual distortion mapping. The invention comprises a wearable display system with at least one processor, an eye-tracking assembly, and embedded data indicating areas of compromised perception in the user’s visual field. By identifying less functional areas of the retina, the system redirects portions of the visual scene—which would normally fall into the damaged zones—to healthier retinal regions, ensuring more effective perception. Additionally, the system alters light frequency, intensity distribution, and color perception to enhance clarity and reduce visual distortion. This invention offers a personalized image redirection technique, enabling real-time correction and vision optimization for individuals with degenerative visual impairments.
In patent No. US11457807B2, under the title of "System and Method for Enabling Customers to Obtain Refraction Specifications and Purchase Eyeglasses or Contact Lenses," which was registered on 2007-02-16, a comprehensive remote vision care system is introduced to allow customers to receive optical prescriptions and purchase corrective lenses efficiently, without the need for traditional in-person visits. The system includes self-operated or technician-assisted diagnostic stations equipped with digital imaging devices and optical instruments that conduct sight screenings and gather refraction data. These stations are networked to a remotely located eye care professional, who uses live audio-video communication to review the results, interact with the customer, and operate instruments remotely when needed. After diagnosing the visual condition, the remote eye doctor can authorize prescriptions, which are digitally transmitted to a lens manufacturing lab. This enables end-to-end service, from vision testing to lens fabrication and delivery, offering convenience, speed, and accessibility to vision care and eyewear purchases.
In patent No. US20190331902A1, under the title of "Laser-Based Fourier Ptychographic Imaging Systems and Methods," which was filed on 2019-01-18, a novel imaging system is introduced that enhances optical resolution using Fourier ptychographic reconstruction and laser-based illumination. The system employs an angle direction device to direct laser light onto a sample at a variety of illumination angles across sequential sampling times. The optical system, consisting of both collection and focusing elements, captures the scattered light from the sample and projects it onto a light detector. Each image acquired corresponds to a different illumination angle, allowing overlapping regions in the Fourier domain. By computationally combining these intensity images, the system reconstructs a high-resolution composite image of the specimen. This technique enables detailed visualization of microscopic samples beyond conventional resolution limits, offering significant advantages in biomedical imaging, semiconductor inspection, and precision metrology.
In patent No. US11092795B2, under the title of "Systems and Methods for Coded-Aperture-Based Correction of Aberration Obtained from Fourier Ptychography," which was filed on 2017-06-12, an advanced computational imaging system is introduced that corrects optical aberrations in incoherent imaging using coded-aperture and Fourier ptychographic techniques. The system includes an aperture modulator located at the Fourier plane of an imaging sensor, capable of modulating aperture patterns to capture a series of coded-aperture, limited-aperture, and full-pupil images. By processing the limited-aperture images, the system first recovers the pupil function, which characterizes the optical system’s aberrations. Using this recovered pupil function, the method then applies deconvolution algorithms on the full-pupil and coded-aperture images to remove both sample-induced and system-induced aberrations, producing a sharply resolved and aberration-corrected image. This approach enhances image clarity and precision for biomedical and material science imaging applications.
In patent No. CN103531427B, under the title of "Method for Preparing and Imaging a Lamella in a Particle-Optical Apparatus," which was filed on 2013-06-28, a specialized method is introduced for thinning and imaging samples, particularly lamellae, using a particle-optical apparatus equipped with both electron and ion beam columns, a camera system, and a manipulator. The disclosed technique involves acquiring a first folded write image of the sample based on electron imaging. This image is then used to guide a thinning process performed by the ion beam. After thinning, a second folded write image is obtained, which may represent the removed material layer, the updated sample structure, or even be used to infer internal electrical potential and dopant concentration within the sample. This method enhances imaging precision, structural control, and analytical capabilities, especially in high-resolution materials science and semiconductor applications.
In patent No. JP7618639B2, under the title of "Lenses, Devices, Methods and Systems for Refractive Error," which was registered on 2012-10-17, a method is disclosed for designing ophthalmic lenses that correct refractive errors while influencing visual quality and ocular development. The lens incorporates at least two optical surfaces and an aberration profile centered on the optical axis, with the prescription focal power precisely tailored for each user. The invention emphasizes control of higher-order spherical aberrations—specifically from 1st to 9th orders (e.g., C(4,0) to C(20,0))—to create a retinal image quality (RIQ) that decreases in the direction of eye growth, potentially providing a myopia control benefit. Furthermore, the lens design ensures ghost-free imaging at all distances and features a sophisticated power profile, characterized by steep localized transitions (≥2.5 diopters) and multiple local minima and maxima across radial ranges. These features enhance optical clarity while supporting dynamic accommodation and visual comfort.
In patent No. US9940508B2, under the title of "Event Detection, Confirmation and Publication System That Integrates Sensor Data and Social Media," which was registered on 2017-05-09, a system is introduced for detecting, confirming, and publishing real-world events by integrating sensor-generated data with user-generated content from social media platforms. The invention employs various sensors—such as motion, heart rate, temperature, sound, and elevation detectors—to capture raw environmental or physiological data. These data streams are processed and tagged with event identifiers (e.g., activity type, score, or performance level), which are then cross-referenced with social media posts (text, images, video, or audio) for event validation and contextual enrichment. This fusion of sensory and social input enables the system to curate and publish events in real-time to dedicated feeds, generate highlight reels, and offer personalized recommendations (e.g., friends, purchases, activities). The platform also supports tag-based filtering and retrospective analysis, enhancing user engagement and data-driven storytelling.
In patent No. US11107368B1, under the title of "System for Wireless Devices and Intelligent Glasses with Real-Time Connectivity," which was registered on 2013-01-26, a mobile communication system is introduced that facilitates real-time digital interaction across multiple platforms, including smartphones, tablets, wearable devices, and intelligent glasses. The invention enables the acquisition, processing, and display of digital content—particularly images and video—through a network of interconnected mobile and wearable devices. These devices communicate wirelessly and operate within a hierarchy or priority system that dictates the order of operations and data exchange between servers, smart appliances, and user devices. The system supports seamless data and telephony transmission, allowing for synchronized multimedia sharing and interaction. It provides enhanced capabilities for real-time connectivity, which can be applied in multimedia collaboration, augmented reality, or smart environment integration. This design enhances user experience by integrating digital content acquisition and delivery within an intelligent and dynamically responsive communication framework.
In patent No. US11982882B2, under the title of "Electronic Spectacles," which was registered on 2023-06-29, a system is disclosed that enables dynamic color-coding of objects in the field of view for multiple spectacle wearers using electronically controlled spectacles equipped with liquid crystal lenses and synchronized RGB light sources. Each user wears a pair of spectacles containing liquid crystal cells capable of switching between high and low transmission states, thereby modulating light visibility through the lens. Accompanying each pair of spectacles is an RGB light source, which can be precisely controlled in terms of luminance timing, color, and intensity. The system operates such that the RGB light source of each user emits a specific color when the lenses are in a high-transmission state, allowing for individualized visual tagging or highlighting. This configuration facilitates user-specific visual augmentation, enabling the color-based differentiation of objects or areas viewed by different users within the same shared environment—potentially useful in collaborative AR, training, or situational awareness applications.
In patent No. US9946098B2, under the title of "Eyewear with a Cellular GPS Module," which was registered on 2016-12-29, an advanced eyewear system is introduced that integrates GPS tracking and cellular communication functionalities directly into the temples of the glasses. The invention incorporates a cellular communication module programmed with a unique identification number, enabling real-time wireless communication with an external computing device for location tracking. Through an eyewear-detecting application, the computing device can retrieve GPS coordinates of the eyewear, facilitating tracking in case of loss or emergency. Moreover, the eyewear includes a push button embedded in the temple structure. This button is pre-programmed with emergency contact numbers and allows the wearer to initiate emergency calls instantly, providing a safety mechanism for users in distress. This invention enhances personal safety and connectivity while maintaining a compact and wearable form factor, ideal for real-time monitoring and emergency response applications.
In patent No. US10048513B2, under the title of "Continuous Autofocusing Eyewear," which was registered on 2016-11-02, a self-adjusting vision enhancement system is introduced that enables real-time, passive correction of visual acuity using dynamic lens adjustments based on eye-tracking and distance estimation. The invention comprises an eyeglass frame equipped with an imaging subsystem (such as pupil-detecting cameras), an infrared illumination subsystem, a set of adjustable-focus lenses, and an embedded controller. The system operates by capturing images of the user's pupils and calculating viewing angles and estimated viewing distances using image histograms and pupil feature analysis. Based on these calculations, the controller dynamically adjusts the focal power of each lens to correct vision according to the user's current line of sight and distance from objects. This adaptive autofocus functionality allows continuous clarity for varying focal distances, particularly benefiting users with presbyopia or other refraction-related conditions.
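The distance-to-power mapping at the heart of such autofocus eyewear can be sketched with a simple thin-lens vergence model; the clamping distance and maximum addition below are illustrative assumptions, not figures from the patent:

    def lens_power_for_gaze(distance_m, base_power_d=0.0, max_add_d=3.0):
        # Thin-lens vergence model: focusing on an object at distance d
        # requires roughly 1/d diopters of additional power, capped at a
        # typical reading-addition maximum.
        add_d = min(1.0 / max(distance_m, 0.25), max_add_d)
        return base_power_d + add_d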
In patent No. CN110376757B, under the title of "Self-Adjusting Glasses," which was registered on 2019-02-12, a modular eyewear system is introduced that allows users to customize the appearance and form factor of their glasses, including rim style, color, and display features, through user-interchangeable parts and integrated micro-LED displays. The invention enables the wearer to assemble and modify core components of the glasses—such as the frame, temples, bridge, and lenses—into full-rim, half-rim, or rimless styles, while selecting custom colors, patterns, and decorative elements. Uniquely, the system incorporates transparent micro-light-emitting diode (micro-LED) display housings, which are applied to the lenses, frames, or both, allowing for dynamic visual modifications. This technology permits the spectacle wearer to instantly change the aesthetic appearance—such as patterns or colors—on demand, making the glasses not only functional but also expressive and fashion-adaptive. It offers a blend of personalization, modularity, and visual display control.
In patent No. US11233934B2, under the title of "Automated Adjustment of Digital Camera Image Capture Parameters," which was registered on 2020-08-11, an adaptive image stabilization system is introduced for portable electronic devices equipped with digital cameras and haptic input mechanisms. The invention discloses a method in which user interaction with a single-action control button—such as the duration of a press—is used to automatically or semi-automatically adjust image capture parameters. For instance, a short press duration may trigger a high-rigidity image stabilization mode (useful for still photography), while a longer press duration activates a less intense stabilization mode (more suitable for video or continuous capture). This system allows for context-aware optimization of the image-capturing process, enhancing user experience by minimizing manual configuration and providing intelligent control over photo and video quality based on simple haptic engagement.
In patent No. US20220117784A1, under the title of "Sonic and Ultrasonic Methods and Apparatus for Treatment of Glaucoma," registered on 2021-12-27, therapeutic sound energy delivery is introduced as a novel method for treating glaucoma. The invention details a portable contact lens that incorporates sonic or ultrasonic transducers. These transducers are capable of emitting sound energy at multiple frequencies and are designed to deliver ultrasonic energy to specific regions of the eye, including the Schlemm's canal and the trabecular meshwork. The energy bursts are emitted at a power range of 1 microwatt to 5 watts and with a duration between 0.1 seconds and 5 seconds. The primary goal is to destroy debris obstructing the trabecular network and enhance the outflow of intraocular fluid, ultimately lowering intraocular pressure and providing a non-invasive therapeutic treatment for glaucoma.
In patent No. CN108051925B, under the title of "Eyeglasses Device with Focus-Adjustable Lens," which was registered on 2017-10-31, a novel eyewear apparatus is presented that integrates focus-adjustable lenses to enhance visual adaptability across various applications. The system comprises left and right lens assemblies, each featuring a focus-adjustable lens positioned to cover key regions of the wearer's field of view, including central, near peripheral, and middle peripheral zones. Surrounding each adjustable lens is a corresponding focus-fixed lens, which maintains visual clarity in outer regions. This hybrid structure enables dynamic focusing where needed, while preserving stable peripheral vision. Designed for both vision correction and advanced visual interfaces, the invention is applicable in 3D displays, virtual and augmented reality, telepresence, and other immersive systems. It provides adaptive visual correction and optimized viewing experiences, making it suitable for users needing enhanced depth perception, clarity, and focus precision in interactive environments.
In patent No. US10945597B2, under the title of "Optical Coherence Tomography-Based Ophthalmic Testing Methods, Devices and Systems," which was registered on 2018-11-08, a comprehensive system is disclosed for performing self-administered structural and functional eye examinations using optical coherence tomography (OCT) technology. The system includes an eyepiece designed to align with a user's eye, a light source for directing illumination through the eyepiece, and an interferometer that generates optical interference patterns from reflected retinal light. These interference patterns are captured by an optical detector, and the resulting data is processed via an integrated electronics and processing unit. Additionally, the system features a target display, allowing it to conduct visual acuity assessments by displaying optotypes of varying sizes and recording the user's responses. The invention enables non-invasive, high-resolution retinal imaging and vision testing, making it highly suitable for ophthalmic diagnostic centers and remote patient monitoring, while offering precision and user interactivity.
In patent No. JP6835392B2, under the title of "Systems and Methods for Controlling Images Acquired by Imaging Devices," which was registered on 2016-10-24, a control device is introduced to enhance interactive image acquisition and target tracking in systems mounted on moving platforms. The invention features a touchscreen display that shows real-time images captured by an imaging device supported by a movable structure. When a user performs a touch gesture—such as a long press, double tap, or deep touch—on a region of the display containing a visible target, the system displays a zoom control menu, allowing the user to select a desired zoom level. Upon releasing the gesture, the device simultaneously executes automatic zoom adjustment and orientation control of the imaging device relative to the selected target. The system further includes on-screen zoom controls, such as zoom-in, zoom-out, and user-configurable default zoom presets. Based on user input, the control device transmits positional and zoom data to the imaging device, its supporting mechanism, or the moving body itself, enabling precise real-time adjustments. The invention supports automatic calculation of coordinate data and movement ratios for the zoom level and device orientation, ensuring that the target remains centered or optimally located in the display. The system enables near-simultaneous or synchronized control over both zoom and posture of the imaging device, significantly improving automated tracking, framing, and focus in dynamic imaging environments.
In patent No. US11644361B2, under the title of "Eyewear with Detection System," which was registered on 2022-02-25, a smart eyewear system is introduced that integrates radiation and motion sensing capabilities into a compact wearable form. The disclosed eyewear includes a forward-mounted supporting structure that houses several key components: an optical detector capable of measuring various types of radiation such as ultraviolet (UV), infrared (IR), or visible light, and an electronic circuit that processes the data captured by the detector to generate radiation-related insights for the user. In addition, the system incorporates a motion detector to determine whether the eyewear is actively being worn, enabling context-aware functionality. A wireless communication module, embedded within the same forward structure, allows the eyewear to transmit or receive data wirelessly. A controller is operatively connected to both the wireless module and the processing circuit, managing the overall operation of the system. All of these electronic components are mounted on an internal circuit substrate, enabling a compact and efficient integration within the eyewear frame. This invention supports health monitoring, environmental sensing, and smart notifications, advancing the functionality of wearable devices in daily and professional applications.
In patent No. US10667697B2, under the title of "Identification of Posture-Related Syncope Using Head-Mounted Sensors," which was registered on 2019-06-26, a wearable system is introduced for detecting posture-related cardiovascular conditions such as orthostatic hypotension (OH) and postural-orthostatic tachycardia syndrome (POTS). The system comprises a head-mounted device equipped with photoplethysmographic (PPG) sensors to measure blood flow and pulse signals on the user’s head, alongside a head-mounted camera designed to capture images that indicate changes in the user's posture. A processing unit or computer analyzes the PPG signal to estimate systolic and diastolic blood pressure and correlates this information with detected postural transitions—such as moving from a supine to a sitting position, or from sitting to standing. If the blood pressure drops below a defined systolic or diastolic threshold within a specific timeframe following the change in posture, the system automatically flags the condition as orthostatic hypotension. This real-time, non-invasive approach enables early detection and monitoring of syncope-related conditions using compact, head-mounted sensors, providing valuable applications in clinical diagnostics, elderly care, and remote health monitoring.
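The threshold rule described above might be sketched as follows; the 20/10 mmHg drops and the 180-second window follow common clinical convention for orthostatic hypotension, since the patent itself only refers to defined thresholds and a specific timeframe:

    def is_orthostatic_hypotension(sys_before, dia_before, sys_after, dia_after,
                                   seconds_since_posture_change,
                                   window_s=180, sys_drop=20.0, dia_drop=10.0):
        # Flag OH when blood pressure falls by at least the threshold amounts
        # within the monitoring window after a posture transition.
        in_window = seconds_since_posture_change <= window_s
        return in_window and ((sys_before - sys_after) >= sys_drop or
                              (dia_before - dia_after) >= dia_drop)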
In patent No. US10983593B2, under the title of "Wearable Glasses and Method of Displaying Image via the Wearable Glasses," which was registered on 2015-02-10, a smart eyewear system is introduced that dynamically adjusts visual content based on how the glasses are worn and user-specific physiological feedback. The wearable glasses comprise a display for showing images, a sensor to detect user-specific eye data and wear state information, and a processor that adapts image properties accordingly. The system determines the movement direction and distance of the glasses relative to the user’s eyes using sensor data, and detects changes in pupil size. Based on this real-time information, the processor adjusts both the image size (based on alignment and distance) and brightness (based on pupil dilation). This allows the display to render visuals with optimal clarity and comfort. The invention enables context-aware image rendering by adapting to wearer posture and eye response, enhancing the viewing experience in augmented or wearable display technologies.
In patent No. JP7436059B2, under the title of "Eyeglasses and Optical Elements," which was registered on 2022-02-03, an advanced eyewear system is disclosed incorporating liquid crystal lenses whose optical power is dynamically adjusted based on the wearer’s gaze direction. The eyeglasses utilize a liquid crystal layer sandwiched between unit electrodes and a set of resistive layers, forming a controlled potential gradient. This gradient is used to modulate the refractive properties of the lenses in real time. A key innovation lies in the control unit, which detects eye movement using an eye detection unit and adjusts the lens power accordingly. When the line of sight shifts downward or inward—relative to a predefined reference point—the optical power of the lenses increases to support near or intermediate vision, such as for reading. Each resistive layer is positioned to overlap partially with both the first and second electrodes, creating a refined gradient across the lens surface. The resistance values are engineered to be higher than that of the electrodes but lower than insulating materials, enabling precise voltage control without signal degradation. Additionally, the design can include two optical elements per eye—one for variable focal length adjustment and the other for directional deflection—providing a multifocal or dynamically deflective visual enhancement. This innovation offers personalized visual support in real-time, adapting automatically to gaze shifts and enhancing optical comfort and functionality across various activities.
In patent No. US11983317B2, under the title of "Using a Camera to Compensate for Sensor-Shift in a Photosensor-Based Facial Expression Detector," which was registered on 2023-02-04, a hybrid system is introduced for accurate facial landmark detection using both photosensors and camera-based imaging. The invention centers on a head-mounted device that integrates discrete light sources and photosensors to measure light reflected from a region of the user’s face. A distinctive aspect of the system is its dual-modality approach: while the photosensors, distributed over a span exceeding 2 cm, provide high-speed reflection data, a camera is used to capture broader facial images. A computer processes these camera images to estimate the position and orientation of the wearable device relative to the user’s face. This information is used to correct for sensor shifts, thereby improving the accuracy and consistency of facial landmark detection. The key advantage lies in the asynchronous operation—the system detects facial landmarks at a rate higher than the image capture rate, relying on real-time optical reflections enhanced by camera-derived corrections. This results in a robust facial tracking system suitable for applications in AR/VR devices, emotion recognition, and gesture-based interfaces, where both precision and temporal resolution are critical.
In patent No. US11154203B2, under the title of "Detecting Fever from Images and Temperatures," which was registered on 2020-09-01, a wearable system is introduced that enables real-time fever detection by combining thermal sensing and facial imaging technologies. The system is designed as a head-mounted device and includes at least two temperature sensors and a specialized imaging camera. The first temperature sensor measures skin temperature (Tskin) from a specific region on the user's head, while the second sensor captures the ambient environmental temperature (Tenv). A camera sensitive to sub-1050 nm wavelengths (commonly used in near-infrared imaging) captures facial images of a second region of the user’s face. The system’s embedded computer analyzes these images to calculate hemoglobin concentration values across at least three facial regions—an indicator associated with circulatory and thermal regulation changes that often accompany fever or intoxication. By correlating Tskin, Tenv, and hemoglobin data, the system can accurately detect signs of fever or abnormal physiological states, enhancing both precision and reliability compared to single-sensor approaches. This invention is especially valuable for non-invasive, continuous health monitoring, with applications in telemedicine, public health screening, workplace safety, and wearable diagnostics.
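A minimal fusion rule in this spirit is sketched below; the ambient-compensation coefficient and all thresholds are hypothetical placeholders, and a real system would calibrate them per user:

    def detect_fever(t_skin_c, t_env_c, hb_regions,
                     skin_thresh_c=37.0, hb_baseline=1.0, hb_rise=0.1):
        # Require both an ambient-compensated skin temperature above the
        # threshold and elevated hemoglobin concentration in at least three
        # facial regions relative to a personal baseline.
        t_comp = t_skin_c + 0.1 * (24.0 - t_env_c)  # hypothetical compensation
        elevated = sum(1 for hb in hb_regions if hb > hb_baseline + hb_rise)
        return t_comp > skin_thresh_c and elevated >= 3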
In patent No. US11604367B2, under the title of "Smartglasses with Bendable Temples," which was registered on 2021-01-22, a novel untethered smartglasses design is introduced that integrates wireless connectivity and ergonomic adaptability. The invention centers on a structural configuration where at least one temple of the smartglasses is segmented into three parts. The first portion, which connects to the front frame supporting the lenses, contains key electronic components. The second portion is a bendable segment that includes flexible wiring and is specifically designed to wrap securely around the user’s ear, improving the device’s fit and wearability. The third portion, connected at the far end, houses secondary electronic modules. Unlike the bendable second portion, both the first and third sections are rigid to maintain structural integrity for housing electronics. This segmented and partially flexible design enhances user comfort without sacrificing functionality, making it ideal for wearable technology applications such as augmented reality, communications, and health monitoring.
In patent No. US12135432B2, under the title of "System and Method Directed to an Eyeglasses-Type Wearable Device," which was registered on 2023-08-17, a wearable system is introduced that allows users to interactively customize visual display settings for enhanced user experience. The invention comprises an eyeglasses-type device attachable to a display, which presents a projected visual image. The system includes an optical component set integrated with lenses and two user-operable adjustment members. The first member enables the user to modify the projection angle of the visual image, allowing the display position to be tailored based on the shape or size of the user’s head. This ensures ergonomic alignment and optimal viewing comfort. The second adjustment member controls visual image parameters such as color tone and brightness, offering adaptive customization based on lighting conditions or user preference. This dual-control mechanism enhances both the functionality and personalizability of head-mounted displays in wearable applications.
In patent No. US11880508B2, under the title of "Eyeglasses-Type Wearable Device and Method Using the Same," which was registered on 2021-09-29, an innovative augmented reality (AR) eyeglasses system is introduced that combines gesture detection and eye motion tracking for interactive data input. The device includes right and left eye frames aligned with the user's eyes, as well as nose pads equipped with eye motion detection electrodes. These electrodes function as sightline detection sensors to track eye movements, including direction and winks. Furthermore, transmitter/receiver electrodes placed on parts of the eye frames act as gesture detectors to sense the user's hand or finger movements. The system features a display capable of projecting a three-dimensional AR image that overlaps the real-world view, with separate right and left images adjusted by convergence angle to create depth. The angle of convergence is limited to 20 degrees or less for visual comfort. This wearable device uses two types of inputs: input A (hand/finger gesture detected by the gesture detector) and input B (eye motion detected by the eye motion detector), allowing users to interact with AR content intuitively. The processor controls the AR interface by recognizing a user’s gaze and gesture, dynamically modifying the AR image accordingly and enabling actions such as cursor movement triggered by a detected wink. This fusion of eye and gesture control results in a highly responsive and immersive user experience in wearable AR systems.
In patent No. US10813559B2, under the title of "Detecting Respiratory Tract Infection Based on Changes in Coughing Sounds," which was registered on 2020-04-21, a system is introduced for identifying the onset and monitoring the progression of respiratory tract infections (RTIs) such as COVID-19 by analyzing coughing patterns. The invention employs smart glasses embedded with an acoustic sensor and a movement sensor, both mounted at fixed positions relative to the user's head to ensure consistent data collection. The system operates by capturing real-time acoustic and motion data associated with coughing episodes. These new data samples are then compared to previously recorded baseline measurements taken when the user had a known extent of RTI. The system specifically analyzes changes in cough-related sounds and head movements during these episodes. A computer analyzes both sets of data — the current coughing-related measurements and the baseline reference — to detect deviations. These deviations indicate changes in the condition of the user's respiratory system, providing an early warning of infection, and also allowing for ongoing tracking of disease progression or improvement. This approach enables continuous, non-invasive monitoring using wearable technology, supporting early diagnosis, remote health surveillance, and potentially reducing the need for more invasive or clinical evaluations in the early stages of respiratory infections.
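One simple way to quantify the deviation between a new cough and the stored baseline is a distance between normalized magnitude spectra; this sketch is a generic approach, not the patented analysis:

    import numpy as np

    def cough_spectral_deviation(current, baseline):
        # Compare normalized magnitude spectra of a new cough recording and
        # a baseline recorded at a known infection stage; a larger L1
        # distance suggests a change in respiratory condition.
        s_now = np.abs(np.fft.rfft(np.asarray(current, float)))
        s_ref = np.abs(np.fft.rfft(np.asarray(baseline, float)))
        n = min(len(s_now), len(s_ref))
        s_now = s_now[:n] / s_now[:n].sum()
        s_ref = s_ref[:n] / s_ref[:n].sum()
        return float(np.abs(s_now - s_ref).sum())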
In patent No. CN111949131B, under the title of "Eye Movement Interaction Method, System and Equipment Based on Eye Movement Tracking Technology," which was registered on 2020-08-17, an advanced system is introduced for enabling intuitive user interaction based on real-time eye movement tracking. The invention focuses on enhancing the accuracy and stability of eye-based control systems, which are increasingly used in wearable technologies and smart devices. The method establishes sensing areas or effective clicking zones corresponding to interactive targets on a display. When a fixation cursor, guided by the user’s gaze, moves into a sensing area, the system determines whether the fixation is stable enough to indicate intentional selection. This is achieved by analyzing factors such as eye jitter, glancing distance, and dwell time to decide if the system should actively "adsorb" or lock the cursor onto the target. To further improve interaction reliability, the system employs machine learning algorithms to process and learn from users’ eye movement behavior over time. These algorithms help model the user’s subjective eye movement intention, allowing the system to predict and adapt to each user's unique interaction style. As a result, it reduces unintended selections and enhances the responsiveness and user experience of eye-controlled interfaces. This invention is particularly valuable for assistive technologies, augmented reality (AR), and head-mounted displays, where hands-free and precise control is critical. By combining real-time tracking, behavioral modeling, and adaptive interaction logic, the system significantly advances the field of eye-tracking-based human-machine interaction.
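The dwell-and-jitter decision can be sketched as a simple gating function; the sampling rate, dwell time, and jitter bound below are assumed values, and the learned behavioral model described in the patent is omitted:

    def should_adsorb(gaze_xy, target_xy, radius_px,
                      dwell_s=0.5, jitter_max_px=8.0, sample_hz=60):
        # Lock ("adsorb") the cursor onto a target when recent gaze samples
        # stay inside its sensing area for the dwell time with low jitter.
        needed = int(dwell_s * sample_hz)
        recent = gaze_xy[-needed:]
        if len(recent) < needed:
            return False
        cx, cy = target_xy
        inside = all((x - cx) ** 2 + (y - cy) ** 2 <= radius_px ** 2
                     for x, y in recent)
        xs = [x for x, _ in recent]
        ys = [y for _, y in recent]
        jitter = max(max(xs) - min(xs), max(ys) - min(ys))
        return inside and jitter <= jitter_max_px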
In patent No. US8477425B2, under the title of "See-Through Near-Eye Display Glasses Including a Partially Reflective, Partially Transmitting Optical Element," registered on 2012-03-25, an interactive head-mounted eyepiece is disclosed for displaying digital content over real-world views. The system includes an integrated processor and image source that directs light through a curved polarizing film onto a reflective display. A key feature is the optical assembly containing a partially reflective, partially transmitting element that reflects image light while allowing ambient scene light to pass through. This creates a combined image of both digital content and the user’s environment, enabling seamless augmented reality experiences in a compact, wearable form—suitable for navigation, real-time guidance, or mixed-reality applications.
In patent No. US12078870B2, under the title of "Eyewear Housing for Charging Embedded Battery in Eyewear Frame," registered on 2018-11-07, a system is disclosed that integrates electrical components into eyewear without compromising its aesthetic design. The invention includes a rechargeable battery, controller, sensor, and circuit board, all fully embedded within the eyewear frame—specifically inside the arm of the glasses. A conductive pad, partially exposed on the frame, allows electrical contact with a charging connector when placed in a charging apparatus. This enables convenient recharging of the internal battery. The system supports seamless integration or after-market enhancements and allows the electronics to function independently or in combination with other components, offering flexibility and modular design. The approach preserves traditional form while enhancing functionality, enabling a wide range of smart eyewear applications such as sensing, data collection, and wireless connectivity.
In patent No. US10595730B2, under the title of "Physiological Monitoring Methods," registered on 2018-05-01, a wearable earpiece-based system is introduced for real-time health and motion monitoring of a subject. The system integrates motion sensors and physiological sensors within an earpiece, such as a headset or hearing aid, to collect head motion, footstep motion, and pulse rate data. Using an embedded processor, the system analyzes this data to detect events such as falls or immobility, and filters out motion artifacts from the physiological readings. The processed data is then transmitted to a remote device, which can provide corrective instructions directly to the user or notify a third party for assistance. This multi-sensor system enables accurate health monitoring and real-time response, supporting applications in elder care, emergency response, and continuous wellness tracking, all through a discreet, wearable form factor.
In patent No. CN109946853B, under the title of "Intelligent Glasses," registered on 2019-03-26, a smart eyewear system is presented that optimizes power efficiency through user-intent recognition. The glasses comprise a main frame body with lenses, a first and second leg, two sensors, and a circuit. The first and second sensors are integrated into the respective legs of the glasses, and the circuit is connected to both. These sensors detect usage-related data—such as whether the glasses are currently worn—enabling the circuit to automatically adjust the operational state. If the sensors detect that the user is not wearing the glasses, the system enters a low-power consumption mode, significantly reducing energy usage and extending the device's standby time. This intelligent control mechanism enhances both usability and battery longevity, making the eyewear more efficient for everyday use.
In patent No. US10451420B2, under the title of "Non-interferometric Optical Gyroscope Based on Polarization Sensing," registered on 2017-11-20, a novel rotation sensing device is introduced that operates without traditional optical interferometry. Instead, the system leverages changes in optical polarization to determine angular rotation. The device includes a Wollaston prism to split incoming light into two orthogonally polarized beams, which are then guided through polarization-maintaining fibers into a common optical loop. These beams propagate in opposite directions and exit the loop via the same optical path, allowing polarization changes induced by rotation to be analyzed. A polarization analyzer calculates Stokes parameters (s₂ and s₃) corresponding to sine and cosine functions of the phase shift (Δϕ) caused by rotation. A processing unit interprets these parameters to determine the precise rotation rate. This design provides a compact, interference-free gyroscopic solution, enhancing stability and accuracy in motion detection applications.
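The phase recovery from the two Stokes parameters reduces to a two-argument arctangent; which parameter carries the sine and which the cosine is an assumption here, as is the device-specific scale factor:

    import math

    def rotation_phase(s2, s3):
        # Assuming s2 is proportional to cos(delta_phi) and s3 to
        # sin(delta_phi), atan2 recovers the rotation-induced phase shift
        # unambiguously over the full 2*pi range.
        return math.atan2(s3, s2)

    def rotation_rate(s2, s3, scale):
        # A device-specific calibration factor (hypothetical) converts the
        # phase shift into an angular rotation rate.
        return scale * rotation_phase(s2, s3)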
In patent No. US10852137B2, under the title of "Multilayer Waveguide Optical Gyroscope," registered on 2018-09-26, an advanced optical gyroscope system is introduced featuring a compact multilayer waveguide rotation sensor built on a substrate. The innovation centers on multiple vertically separated spiraling waveguides, which are non-intersecting and stacked to reduce optical cross-coupling, allowing for dense coil integration in a small volume while preserving signal integrity. These layered waveguides are interconnected via a vertical waveguide coupler and are optically linked to integrated gyroscope components, including a light source, a detector, and a lithium niobate phase modulator chip. The phase modulator is embedded in a substrate cavity and precisely aligned with the multilayer waveguides, both horizontally and vertically, through sidewall exposure. This design significantly increases the coil length and sensitivity of the gyroscope within a miniaturized footprint, improving performance for inertial navigation and motion sensing applications.
In patent No. US20240180424A1, under the title of "Systems and Methods for Monitoring Eye Health," which was published on 2024-01-17, a system is presented for evaluating ocular health by measuring scleral strain. This measurement can be achieved through various monitoring platforms, including an implantable monitor, a wearable monitor embedded in eyeglasses, or an external monitor integrated with a portable tablet device. The core functionality centers on detecting and analyzing strain patterns in the scleral tissue of the eye, which may indicate pathological conditions or changes in intraocular pressure. Notably, the system is versatile: the strain monitor can also be adapted for non-medical applications, such as structural monitoring of surfaces like buildings or bridges. This multi-use strain detection system enables non-invasive, continuous eye health tracking, offering potential for early diagnosis and real-time health analytics in both clinical and personal health environments.
This invention addresses multiple technical challenges related to ocular health, adaptive vision, and emergency detection. Existing eyewear can neither detect sudden hemodynamic events, fatigue, or balance loss, nor respond dynamically to focal distance shifts or varying light intensities. To resolve these limitations, the present invention introduces a sophisticated pair of smart eyeglasses equipped with multilayer adaptive lenses, electrochromic gel, and advanced sensors. The system employs visible and ultraviolet sensors to independently measure environmental light, dynamically adjusting lens opacity for glare protection and UV filtration. A micro-pump modulates electrochromic gel volume to alter the lens’ internal curvature, simulating natural focal adaptation based on object distance and pupil behavior. Micro-cameras and infrared modules collect ocular imagery, while embedded prisms allow triangulated vision and wave diffraction for real-time analysis of pupil diameter, eye motion, and fatigue.
For health monitoring, ultrasonic imaging and optical ptychography provide deep-layer eye scanning to detect retinal or neurological complications, such as ocular stroke. A gyroscope-based fall detection algorithm monitors acceleration and triggers safety verification or emergency alerts via smartphone or vehicle ECU integration. Wireless communication modules enable proactive interventions, like slowing a vehicle or alerting emergency services. Machine learning algorithms personalize visual responses by analyzing user-specific pupillary dynamics, light sensitivity, and visual habits. The processor is equipped with optical ports, eliminating extra emitters, and supports low-latency control of the entire sensory array. Overall, this invention combines optics, sensor fusion, AI learning, and biomedical diagnostics into a wearable solution, enabling non-invasive monitoring and dynamic adaptation for vision optimization and safety.
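As a rough illustration of the gyroscope-based fall detection path, the two-phase heuristic below is a common approach in the literature rather than the invention's exact algorithm; the g-force thresholds are assumed:

    import math

    def detect_fall(accel_xyz_g, free_fall_g=0.4, impact_g=2.5):
        # A near-free-fall dip in total acceleration magnitude followed by
        # an impact spike triggers the safety-verification / alert path.
        saw_dip = False
        for ax, ay, az in accel_xyz_g:
            mag = math.sqrt(ax * ax + ay * ay + az * az)
            if mag < free_fall_g:
                saw_dip = True
            elif saw_dip and mag > impact_g:
                return True
        return False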
The technical problem this invention addresses is the early detection of sudden motor, balance, and hemodynamic disorders in the wearer, identified by analyzing pupil diameter and processing unusual glare responses. The glasses also measure several vital indicators to diagnose fatigue and the level of alertness, based on the particle diffraction velocity parameter and on scanning the superficial, semi-deep, and deep layers of the eye. To handle ultraviolet radiation and visible light separately, a dedicated detector for each band controls the amount of abnormally intense light entering the eye. A further problem addressed is the fixed focal length of conventional spectacle lenses; in this invention, the focal length can be changed while the hard contact surfaces of the lenses retain their rigidity.
The claimed multimodal smart eyeglasses (1), for early detection of sudden motor, balance, and hemodynamic disorders, with the ability to adapt to light and focal angle, are structurally similar to conventional glasses in terms of a hard frame (2) and arms (13) designed to rest on the ears. However, substantial modifications have been made to the lens (3) and frame (2), including the integration of a light level detection sensor (11) and an ultraviolet radiation level detection sensor (10). The arms (13) also house the batteries (9), a charger input port (12), an electrochromic gel volume control micropump (7), and a movable hinge (8).
The lens within the hard frame is surrounded by a cord (6). The design of the glasses in this invention incorporates multi-modal sensory and adaptive systems into a compact wearable form factor. The light level detection sensor (11) and ultraviolet radiation level detection sensor (10) provide real-time photometric analysis, enabling the system to differentiate between visible-spectrum intensity and high-energy UV radiation. This dual-sensor arrangement facilitates intelligent regulation of incoming light via modulation of the electrochromic gel (21), a substance whose optical transmittance adjusts in response to electrical stimuli.
The electrochromic gel volume control micropump (7) functions as a microfluidic actuator, altering the curvature or refractive index of the lens by displacing or retracting volumes of gel between two rigid lens layers. The battery (9) and charger input port (12), embedded within the arms (13), provide a continuous power supply while distributing electronic components ergonomically to maintain user comfort. The movable hinge (8) serves both structural and functional purposes by embedding microtubing and conductive wiring to enable electrochemical modulation and high-speed data transfer, while allowing sufficient flexibility for head movement.
The surrounding cord (6) is composed of viscoelastic composite materials with frequency-dependent stiffness properties. When exposed to high-frequency ultrasonic piezoelectric oscillations, molecular alignment induces temporary rigidity, maintaining optical structural integrity during dynamic motion. In contrast, under low-frequency biomechanical stresses (<1 Hz) such as prolonged eye strain or gradual head tilting, the material becomes pliable, allowing controlled deformation of the lens (3) to adjust focal depth accordingly.
The fixed hinge (5) acts as a reinforced structural anchor designed to evenly distribute stress generated by peripheral optical modules and motion sensors. It interfaces with the movable hinge (8) through a hollow shaft (4), which serves as a multi-channel conduit for gel injection lines, optical fibers, and power wiring required for active optical components. The piezo oscillator (16), secured by screws (15), functions as an ultrasonic actuator, enabling dynamic wave-based measurements through the inner hard lens layer (23) to support real-time biomechanical and hemodynamic scanning.
This specially designed cord (6) exhibits frequency-dependent elasticity that enables an early response to motor, balance, and hemodynamic disturbances. When subjected to low-amplitude, high-frequency ultrasonic piezoelectric vibrations, it behaves as a rigid material, stabilizing the hard outer layer of the lens (22). However, under low-frequency fluctuations of less than one hertz, it acts like elastic rubber, enabling the lens (3) to change in thickness and thereby adapt its focal properties. The fixed hinge (5), which is connected to the frame (2), is deliberately dimensioned to support and distribute the weight of the integrated components as effectively as possible.
The movable hinge (8) is connected to the fixed hinge (5) via a hollow shaft (4). In addition to transmitting mechanical power, these hinges are designed to facilitate the flow of electrochromic gel (21) and house electrical wiring for powering the optical and sensory components. The structure is secured by a bottom screw (14) connected to the shaft (4), ensuring mechanical integrity and containment of the fluidic and electrical pathways. On the inner view of the glasses, positioned within the peripheral region of the hard frame (2) surrounding the lens (3), the piezo oscillator (16) is mounted in contact with the hard-inner lens layer (23), and fixed in place using screws (15).
The piezo oscillator (16) functions by generating mechanical vibrations in the inner lens layer (23), enabling real-time measurement and monitoring of parameters related to balance, movement, and hemodynamic responses. These measurements are derived by evaluating the velocity and frequency-phase changes of the returning waves, which are directly influenced by the biomechanical state of the eye. In the lower half of the frame (2), three prisms (17) are embedded to support the wave emission and reflection system. Their placement in the lower region is intentional, to avoid interference from the upper eyelid (32) and upper eyelashes (31), which may obstruct sensor line-of-sight from above.
To determine the spatial coordinates of an observed object, at least nine sensors are needed, strategically co-located to triangulate accurate measurements of length, width, and height. The three prisms (17) re-emit waves at a frequency of 1 GHz and serve as a lightweight and compact alternative to deploying multiple cameras or infrared emitters. This approach reduces the overall weight and electronic density of the glasses, minimizing wiring complexity and the risk of component failure while maintaining high-resolution spatial detection.
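The triangulation itself can be sketched as a linearized least-squares solve over range measurements from known emitter positions (at least four non-coplanar anchors for a 3-D fix); this is standard trilateration, offered only as an illustration:

    import numpy as np

    def trilaterate(anchors, distances):
        # Subtract the first range equation from the others to linearize,
        # then solve for the 3-D point with least squares.
        a0 = np.asarray(anchors[0], float)
        d0 = float(distances[0])
        rows, rhs = [], []
        for ai, di in zip(anchors[1:], distances[1:]):
            ai = np.asarray(ai, float)
            rows.append(2.0 * (ai - a0))
            rhs.append(d0 ** 2 - float(di) ** 2 + ai @ ai - a0 @ a0)
        point, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
        return point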
The piezo oscillator (16) emits controlled ultrasonic pulses in the MHz range, specifically tuned to elicit responses from biological tissues such as the cornea, aqueous humor, and iris. By analyzing return velocities and phase shifts of these pulses, the system calculates key biomechanical metrics including intraocular pressure, micro-oscillatory movements, and vascular pulsatility. These indicators are critical for early detection of ocular hemodynamic anomalies such as ischemic micro-events or early-stage retinal or optic nerve strokes.
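One plausible reading of the velocity analysis is the standard Doppler relation; the tissue sound speed and normal-incidence assumption below are generic, not taken from the disclosure:

    def reflected_wave_velocity(f_emitted_hz, f_returned_hz, c_mps=1540.0):
        # Continuous-wave Doppler: v = c * df / (2 * f0), assuming a sound
        # speed of ~1540 m/s in ocular tissue and reflection back along the
        # beam axis (cos(theta) = 1).
        return c_mps * (f_returned_hz - f_emitted_hz) / (2.0 * f_emitted_hz)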
The three prisms (17), integrated into the lower frame, optimize angular reflection while avoiding occlusion from anatomical structures like the upper eyelid (32) and eyelashes (31). This optical configuration supports accurate triangulation of both optical and acoustic return signals. The use of optical multiplexing within each prism module allows three-directional wave emission and detection using a single component, enhancing spatial resolution without requiring additional emitter-receiver pairs. This design significantly reduces power consumption and wiring requirements while preserving full field-of-view tracking.
Direct image observation is impractical in these glasses due to spatial constraints; therefore, reflective imaging methods are employed. Of the two options—mirrors and prisms—prisms are preferred due to their optical clarity, durability, and longevity. The three-point vision micro-camera (18), mounted within the movable hinge (8) inside the arm, enables full coverage of the eye surface. A separate camera is provided for each eye, with optical pathways (33) guiding the view through the lens assembly and prisms for real-time monitoring.
The micro-camera (18) is developed by modifying existing wide-angle imaging hardware. Adaptations to the image sensor—whether CCD or CMOS—are applied to suit biomedical imaging needs. These modifications are similar to those seen in 180-degree automotive cameras, optimized here for close-range, high-resolution capture. Additionally, an infrared transmitter and receiver (19) are installed in close proximity to the micro-camera, allowing high-fidelity imaging and reflective wave analysis. Reflections are captured from three angles using the prisms (17), facilitating wave diffraction-based biometric assessments. For these processes to function reliably, unobstructed wave return paths (34) are ensured through careful spatial engineering of the internal optical geometry.
Given the limited spatial envelope of the glasses, direct imaging is replaced with indirect prism-based imaging, utilizing internal reflectance mechanisms. The choice of prisms over mirror reflectors stems from their superior optical clarity, durability, and low chromatic dispersion. The three-point micro-camera (18), embedded within the movable hinge (8), provides stereoscopic imaging and real-time ocular monitoring through angular imaging paths (33). The stereo-pair configuration allows simultaneous capture of both front-facing environmental views and inward-facing ocular surfaces (e.g., sclera, pupil, iris).
This method eliminates the need for additional mirrors or prisms. As mentioned earlier, changing the shape of the lens can be used to modulate the focal length, and there are two ways to achieve this. The first is to change the thickness by applying lateral pressure perpendicular to the optical axis of the lens: as this pressure increases, the thickness increases at the center while remaining relatively constant at the periphery. This method is suitable for transparent, flexible lenses. However, the claimed glasses require hard surfaces on both the outer (22) and inner (20) lens layers, so changing the lens diameter would induce uncontrolled refraction. Instead, by adjusting the pressure—and consequently the volume—of the electrochromic gel (21), the inter-laminar distance between the lens layers (22 and 20) is modulated, enabling image clarity at different focal depths inside the eye.
Furthermore, the micro-camera (18), equipped with high dynamic range (HDR) CMOS or CCD sensors, operates with customized firmware capable of recognizing pupillary behavior, saccadic movements, and corneal reflectivity. The system works in tandem with the infrared transceiver (19) to emit and detect low-intensity near-infrared (NIR) signals. These signals are critical for tracking pupil dynamics, mapping ocular surface topography, and analyzing microvascular features using wave diffraction analysis—especially via Fresnel and Fourier ptychography techniques. The embedded prism optics split and redirect light beams at fixed angles, simulating multi-perspective detection paths. Return paths (34) are strategically designed to minimize signal loss during propagation and to enhance resolution, particularly for micromovements and fatigue diagnostics.
The modulation of the lens's focal length is based on variable pressure applied to deform the internal gel layer without compromising the rigidity of the external surfaces. This pressure-based approach enables localized zonal deformation of the internal refractive profile while preserving alignment along the optical axis. The electrochromic gel (21) is actuated via micro-injected voltages from the control circuit, dynamically adjusting the thickness between the outer (22) and inner (20) hard lens plates. This adaptive optical response mimics the human eye's natural accommodation mechanism, allowing for real-time focal adjustments across variable viewing distances.
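As a simplified model of how the inter-laminar distance enters the optical power, the thick-lens form of the lensmaker's equation can be evaluated while sweeping the gel thickness; the refractive index and surface radii below are hypothetical placeholders, and in practice the pressure-induced curvature change would contribute as well.

```python
def thick_lens_focal_length(n, r1, r2, d):
    """Lensmaker's equation for a thick lens: index n, signed surface
    radii r1 and r2 (meters), axial thickness d (meters)."""
    inv_f = (n - 1.0) * (1.0 / r1 - 1.0 / r2
                         + (n - 1.0) * d / (n * r1 * r2))
    return 1.0 / inv_f

# Hypothetical biconvex geometry; only the thickness d is varied here.
for d_mm in (0.5, 1.0, 2.0):
    f = thick_lens_focal_length(n=1.45, r1=0.08, r2=-0.08, d=d_mm / 1000.0)
    print(f"gel thickness {d_mm} mm -> focal length {f * 100:.2f} cm")
```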
The hard inner layer of the lens may be engineered either as a flat surface (20) or a concave form (23), and in both cases, the object (30) is projected into the eye (32) with sufficient clarity at both long (31) and short (29) distances. The selected shape—flat or concave—modulates the convergence of incident light rays on the retina. A key problem addressed by these glasses is the regulation of light entering the eye. The photocell (11) measures ambient light levels (24) and, by analyzing pupil diameter in conjunction with either preset or user-specific configurations, the system determines the opacity level of the electrochromic gel (21).
This intelligent light adaptation system enables users with refractive conditions such as astigmatism, presbyopia, or hyperopia to experience improved visual acuity without changing glasses. The photocell (11) continuously monitors environmental light intensity and dynamically adjusts the gel's opacity. This adjustment is informed by real-time analysis of pupillary diameter captured by the micro-camera (18). Together, these form a closed-loop bio-optical feedback system, offering personalized light filtering and enhanced comfort based on each user’s individual visual sensitivity.
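A minimal sketch of such a closed-loop blend of photocell and pupillometry inputs, assuming a log-compressed luminance term plus a pupil correction; every constant and name below is an illustrative placeholder rather than a disclosed calibration.

```python
import math

def target_opacity(ambient_lux, pupil_mm,
                   lux_comfort=300.0, pupil_bright=2.0, pupil_dark=7.0):
    """Blend photocell luminance and camera-measured pupil diameter into
    a gel opacity command in [0, 1]."""
    # Log-compressed luminance: 0 at the comfort level, ~1 two decades above.
    lux_term = max(0.0, math.log10(max(ambient_lux, 1.0) / lux_comfort)) / 2.0
    # A pupil still wide under bright light suggests sluggish constriction,
    # so extra opacity is added in proportion.
    pupil_term = max(0.0, (pupil_mm - pupil_bright) / (pupil_dark - pupil_bright))
    opacity = lux_term * (0.6 + 0.4 * pupil_term)
    return min(1.0, max(0.0, opacity))

# Example: bright daylight with a still-dilated pupil.
print(round(target_opacity(30000.0, 6.0), 2))  # -> 0.92
```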
In this configuration, the incoming ambient light directed toward the eyes via the optical interface (25) is dynamically regulated to preserve optimal retinal comfort and function. The eyepiece (28), which functions as the anatomical aperture control (i.e., the pupil), inherently adjusts to light stimuli; however, its reactivity can be sluggish or insufficient in certain scenarios. To complement and enhance this natural modulation, the real-time pupil diameter—continuously tracked by the embedded micro-camera (18)—is transmitted to the processor, where it undergoes digital analysis. This biometric feedback, combined with ambient luminance measurements captured by the photocell (11), enables the glasses to generate a personalized adaptive lighting profile for each user. This results in a finely tuned electrochromic gel modulation tailored to the user's current environment, biological state, and learned historical responses. One notable limitation in natural vision is the eye’s inability to recognize harmful ultraviolet (UV) radiation, particularly in low-light or indirect-light conditions, which this system is designed to overcome.
The ultraviolet sensor (10), isolated from the visible spectrum detection systems, functions as a critical protective component. It autonomously detects UV radiation in real time—even when the user’s pupils are dilated due to darkness—and sends immediate corrective signals to the processor. Upon detection of excessive UV exposure, the system initiates two coordinated safety responses: (1) it triggers the electrochromic gel to increase opacity, effectively attenuating UV wave transmission, and (2) it simulates pupillary constriction by influencing the inner lens curvature, thereby narrowing the path of light entry. These actions protect the retina and prevent photochemical or thermal damage to the optic nerve (26). This mechanism is particularly effective in hazardous exposure scenarios, such as welding, where traditional photoprotective reactions are inadequate due to simultaneous pupil dilation and high UV flux.
In such high-risk conditions—most notably in industrial environments with strong UV sources—this intelligent feedback mechanism preserves ocular integrity. When a user views a welding arc in a darkened room, the absence of visible light causes natural pupil dilation, inadvertently allowing high UV flux into the eye. The glasses' independent UV detection and compensatory darkening prevent this otherwise unmitigated retinal exposure. In parallel, the glasses assess the internal optical behavior of the user’s eye lens (27), whose adaptive thickness serves as a biological marker for near or far focus. The inability of the eye lens to sufficiently thicken during near-vision tasks, such as in presbyopia, leads to optical blur and strain. The system measures this lens behavior using real-time imaging and interferometric feedback.
By quantifying the eye lens thickness variations during visual accommodation attempts, the system learns the user's natural focusing behavior across a range of object distances. This is achieved using non-invasive, multi-frequency ultrasound scanning and time-resolved infrared reflectometry. These tools measure minute shifts in focal plane length and surface curvature of the biological lens (27). For individuals with accommodation deficiencies—e.g., in age-related presbyopia—the intelligent glasses adjust their own lens configuration to compensate for the insufficient thickening or flattening of the user's internal lens, restoring clarity at varying distances.
The focal adaptation system employs a machine-learning model trained on biometric data—including lens thickness changes, saccadic patterns, and blink frequency—to create a personalized response matrix. This data-driven adaptation eliminates the need for progressive or bifocal lenses by enabling continuous and automatic modulation of focal power. In addition to standard refractive correction, the system calibrates for each user’s tolerance to luminance shifts and contrast changes. This becomes especially important in occupations that involve rapid alternations between dark and bright environments, such as welding, long-haul driving, or screen-intensive tasks. Here, intelligent gel modulation ensures visual comfort, fatigue prevention, and long-term retinal protection.
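One plausible reading of the "personalized response matrix" is a small regularized regression from logged biometric features to the user's preferred settings; the sketch below assumes ridge regression and hypothetical feature choices, nothing more specific than the text states.

```python
import numpy as np

class ResponseMatrix:
    """Toy per-user model mapping features such as [ambient_lux, pupil_mm,
    blink_rate, lens_thickness] to a preferred gel opacity in [0, 1],
    refit periodically from logged data."""
    def __init__(self, n_features, ridge=1e-2):
        self.ridge = ridge
        self.w = np.zeros(n_features)

    def fit(self, X, y):
        X = np.asarray(X, dtype=float)
        y = np.asarray(y, dtype=float)
        A = X.T @ X + self.ridge * np.eye(X.shape[1])
        self.w = np.linalg.solve(A, X.T @ y)

    def predict(self, features):
        raw = float(np.asarray(features, dtype=float) @ self.w)
        return min(1.0, max(0.0, raw))
```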
The focal-length adaptation is executed by modulating the electrochromic gel’s distribution between rigid lens surfaces through microfluidic actuation. The lens curvature is altered without deforming the external surfaces, preserving physical protection while enabling internal optical reframing. Due to the compact and lightweight design requirements, the processor integrated into the system (36), housed in module (35), must be capable of extremely low-latency processing of biological and environmental inputs. It interfaces directly with infrared-based fiber optic lines through port (39) and connectors (44), eliminating the need for external emitters, photodiodes, or analog converters. This design ensures fast bidirectional transmission of infrared and optical signals with minimal energy loss or signal noise.
The embedded processor (36) is a quantum microcontroller designed for edge-computing applications with high-throughput signal analysis. It handles simultaneous optical, ultrasonic, and gyroscopic data flows. Port (39) enables direct communication over multiple optical fibers, allowing high-speed processing of multispectral wave data with precision and real-time responsiveness. The processor supports multisensory fusion—integrating visual cues, photometric inputs, acoustic mapping, and motion analysis—to ensure accurate contextual awareness, enabling both diagnostic and preventive functionalities in dynamic environments.
The system also features wireless connectivity via port (38) and the wireless module (37), enabling communication with external smart platforms such as vehicles (49), smartphones, or brain-integrated computing devices (51). In scenarios involving fatigue, visual impairment, or neurological disturbance, this connectivity allows the glasses to transmit biometric and geolocation data to external systems. This integration is particularly important for safety-critical applications, such as driver monitoring. If signs of fatigue or distress are detected, the glasses trigger protocols within the vehicle ECU (53), which may include automatic speed reduction, flasher activation, or steering adjustment to guide the vehicle to a safe halt.
In a real-world scenario, if a user operating a vehicle exhibits prolonged blink duration, slowed saccadic responses, or low variability in head movement, the glasses detect this pattern as cognitive or physical fatigue. Waves (50) are emitted to a connected dongle (52), which relays the signal to the vehicle’s onboard control unit (53). The vehicle may autonomously initiate safety procedures, such as reducing throttle input, activating warning systems, or pulling to the shoulder. Alternatively, if the user experiences a health event while alone, the glasses communicate through the paired smartphone to contact emergency services and transmit GPS location, physiological parameters, and a brief health log.
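A toy illustration of how the cues named here might be fused into a single fatigue score; the thresholds and weights are invented for the example and are not clinically validated values from the disclosure.

```python
def fatigue_score(blink_ms, saccade_latency_ms, head_motion_var):
    """Heuristic 0-1 fatigue score from prolonged blink duration, slowed
    saccadic responses, and low head-movement variability."""
    blink = min(blink_ms / 400.0, 1.0)              # blinks near 400 ms read as drowsy
    saccade = min(saccade_latency_ms / 300.0, 1.0)  # slowed saccadic latency
    stillness = max(0.0, 1.0 - head_motion_var / 0.5)  # unusually low variability
    return (blink + saccade + stillness) / 3.0

def should_alert_vehicle(score, threshold=0.7):
    """Gate for relaying a fatigue flag to the dongle (52) / ECU (53)."""
    return score >= threshold
```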
The detection algorithm for dangerous conditions is driven by multi-parametric inputs: gyroscope signals from within the processor (36), infrared tracking from the emitter (19), and image analytics from the micro-camera (18). This system fuses movement data from the embedded gyroscope (67)—including angular velocity, acceleration vectors, and orientation shifts—with biometric and optical inputs. The fusion algorithm uses temporal modeling to distinguish between normal activity and potential emergencies (e.g., syncope, stroke, fall). It functions as a continuous health monitor that evaluates the kinetics and postural stability of the user.
In practice, if the gyroscope (67) detects rapid acceleration exceeding safety thresholds (e.g., from falling or collision), the system activates a consciousness check protocol. This involves a looped verification prompt to the user through auditory or visual signals. If no response is detected within a defined timeframe, the glasses automatically initiate an emergency alert via the connected smartphone, transmitting distress signals, live biometric data, and location. This closed-loop system continues monitoring and awaits user acknowledgment to resume normal operations or remains in alert mode until external intervention.
As illustrated in Fig. 15, the proposed algorithm governs emergency condition recognition by continuously evaluating acceleration data from the internal gyroscope. Let "A" represent the real-time acceleration value reported by the gyroscope system (67), and "D" denote the predefined maximum acceleration threshold considered safe. The algorithm operates on a temporal loop that executes once per second, assessing whether A exceeds D. If A surpasses D, indicating a potential emergency event such as a fall, abrupt impact, or unexpected head motion, the system sends a query to the user's smartphone or linked device to confirm consciousness and well-being. A response window is initiated, allowing the user a short duration to confirm they are alert and unharmed. If the user responds, the monitoring loop continues normally. However, failure to respond within the allotted time triggers an automatic emergency protocol—sending an alert message with geolocation data to emergency contacts or medical services. This failsafe algorithm ensures autonomous detection of life-threatening situations without external input and is particularly effective for the elderly or individuals at high risk of syncope or neurological impairment.
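The loop just described maps directly onto a few lines of code. In this sketch the sensor and messaging callbacks, the threshold value, and the acknowledgment window are placeholders standing in for the actual subsystems.

```python
import time

def emergency_monitor(read_accel, prompt_user, wait_for_ack, send_alert,
                      d_threshold=3.0 * 9.81, ack_window_s=15.0):
    """One-second loop: A = live acceleration, D = safe threshold.
    Exceeding D triggers a consciousness check; a missed acknowledgment
    escalates to an alert with geolocation and biometrics."""
    while True:
        a = read_accel()                  # "A": current acceleration (m/s^2)
        if a > d_threshold:               # "D": predefined safe maximum
            prompt_user("Are you OK? Please confirm.")
            if not wait_for_ack(timeout_s=ack_window_s):
                send_alert(reason="no response after over-threshold event",
                           payload=("geolocation", "live biometrics"))
                # stays in alert mode until externally reset (not shown)
        time.sleep(1.0)                   # loop executes once per second
```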
Integration with a brain-connected electronic interface (brain chip 51) enhances this system by introducing bidirectional cognitive and perceptual feedback. The connection enables the claimed multimodal smart eyeglasses (1) to send processed visual and environmental data to the brain chip (51) in real time via directed waves (50). Simultaneously, the brain chip returns neural feedback signals (54) to the glasses, potentially adjusting image processing parameters, contrast, or brightness based on cognitive interpretation and visual demand. This dynamic interaction allows the system to align vision augmentation with neurocognitive processing, optimizing the user experience in real time.
Nevertheless, the intelligent processing algorithm embedded in the processor (36) is designed to override brain feedback in the presence of hazardous environmental factors. For example, when ultraviolet (UV) exposure is abnormally high but the brain chip signals a demand for increased light due to perceived darkness, the glasses' logic prioritizes ocular safety. In such scenarios, processor (36) rejects the request and increases the electrochromic gel opacity to attenuate UV transmission, thereby protecting the retina and optic nerve from phototoxic damage despite neural override attempts. This hierarchical control structure ensures fail-safe operation by deferring to physiological protection protocols whenever sensory and cognitive commands conflict.
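The override reduces to a simple arbitration rule, sketched below with hypothetical opacity and UV scales: the neural request is honored unless the UV reading demands a darker protective setting.

```python
def arbitrate_opacity(neural_request, uv_level,
                      uv_safe_limit=0.5, protective_opacity=0.9):
    """Safety-first arbitration: above the UV limit, the protective
    opacity floor overrides any brain-chip request for more light."""
    if uv_level > uv_safe_limit:
        return max(neural_request, protective_opacity)  # protect the retina
    return neural_request
```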
The processor (36) supports multiple I/O ports to manage all subsystems of the glasses. Port (41) connects to the micro-camera (18) through connector (48), providing real-time ocular surface and environmental image input. Port (42) interfaces with the photocell (11) and ultraviolet sensor (10) via connector (47), facilitating continuous ambient light and UV level measurement. Port (43) is linked to the internal rechargeable battery (9), supplying regulated power to all active elements of the system. To preserve processor integrity, a dedicated operational amplifier (45) is housed externally and connected via port (40) to amplify analog signals—particularly for driving the ultrasonic piezo oscillator (vibrator)—whose output is made accessible via connector (46).
The glasses' ultrasonic and optical wave systems leverage frequency-specific behavior for diagnostic imaging. At frequencies exceeding 1 GHz—where wave characteristics begin to resemble those of visible light—wavelength tuning is hardware-limited. That is, a light-emitting diode (LED) or other wave source designed for a specific wavelength (e.g., 650 nm) cannot generate other wavelengths (e.g., 600 nm) without replacing the emitter. In contrast, ultrasonic wave modulation allows variable-frequency operation (e.g., 5 MHz, 6 MHz, or 7 MHz) using the same output device by simply adjusting the oscillation rate from the processor (36). This flexible modulation facilitates multi-depth scanning without hardware redundancy.
Ultrasonic imaging is performed by emitting directed waves (55) toward the ocular structures. These waves reflect back (56) and are interpreted by return-phase analysis. Short-wavelength ultrasound (59) reflects from superficial structures such as the cornea and sclera. Mid-range frequencies (58) penetrate deeper into mid-layer regions, such as the ciliary body or choroid. Low-frequency, long-wavelength ultrasound (57) achieves maximum penetration, mapping the posterior retina and optic nerve with minimal attenuation. The varied use of frequency-specific penetration enables stratified imaging and facilitates layered ocular reconstruction for diagnostic use.
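A small sketch of the resulting frequency plan, assuming a mean sound speed of 1540 m/s in ocular tissue; the layer-to-frequency pairings mirror the text, while the exact numbers are illustrative.

```python
C_EYE = 1540.0  # m/s, assumed mean speed of sound in ocular tissue

# Higher frequency for superficial structures, lower for posterior ones.
SCAN_PLAN = {
    "cornea/sclera (59)": 7e6,         # short wavelength, shallow reflection
    "ciliary body/choroid (58)": 6e6,  # mid-range penetration
    "retina/optic nerve (57)": 5e6,    # long wavelength, deepest penetration
}

for layer, f in SCAN_PLAN.items():
    wavelength_um = C_EYE / f * 1e6
    print(f"{layer}: {f / 1e6:.0f} MHz, wavelength ~{wavelength_um:.0f} um")
```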
Within processor (36), a quantum microcontroller (66) performs high-speed computation and real-time analysis while maintaining ultra-low power consumption and heat generation. This microcontroller is optimized for simultaneous processing of visual data from the micro-camera (18) and signal data from the ultrasonic and optical systems. Transistors (60) at the output of port (42), paired with impedance-matching resistors (61), ensure precise signal modulation and minimize distortion. Current-limiting resistors (65) at port (40) protect processor (36) from electrical surges. The voltage stabilizer (62) regulates supply voltages, while port (39) connects to optical emitters and receivers—specifically LEDs (63) and phototransistors (64)—ensuring high-fidelity light transmission and detection over optical fibers.
The intelligent glasses utilize advanced multi-frequency ultrasound imaging in conjunction with optical ptychography to perform comprehensive non-invasive eye scanning. This hybrid system builds a volumetric 3D model of the eye, layer by layer. High-frequency ultrasound maps anterior regions like the cornea and aqueous humor, while lower frequencies penetrate the retina, choroid, and optic nerve head. These layers are then fused with visual data from the micro-camera (18) and infrared reflectometry to produce a unified, highly resolved image profile. Such multimodal scanning enables early detection of disorders such as ocular stroke, retinal ischemia, optic neuritis, and fluid-related abnormalities.
The ptychographic scanning methodology uses ultra-short IR pulses generated by the transceiver (19), which reflect at varying depths depending on tissue density and absorption characteristics. These reflected signals are analyzed for return time and phase shift, from which the depth, structure, and optical properties of each layer are inferred. This data is then geometrically modeled, and images captured by the micro-camera (18) supplement this model by providing surface fidelity and optical continuity, especially for flatter surfaces that reflect IR uniformly.
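For the time-of-flight part of this analysis, depth follows from the round-trip delay corrected for the refractive index of the ocular media; the constants below are textbook approximations rather than disclosed parameters.

```python
C_LIGHT = 2.998e8   # m/s, speed of light in vacuum
N_OCULAR = 1.34     # assumed mean refractive index of ocular media

def ir_echo_depth_m(return_time_s, n=N_OCULAR):
    """One-way depth from round-trip optical time of flight:
    d = c * t / (2 * n)."""
    return C_LIGHT * return_time_s / (2.0 * n)

# A ~214 ps round trip corresponds to roughly the 24 mm axial length of
# an adult eye, illustrating the picosecond timing such ranging needs.
print(f"{ir_echo_depth_m(214e-12) * 1000:.1f} mm")  # -> 23.9 mm
```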
For enhanced spatial and depth accuracy, three prisms embedded in the lower frame are used to refract and redirect both incoming and outgoing wave signals, ensuring comprehensive optical coverage despite anatomical obstructions such as the upper eyelid or brow ridge. The triangulation data from these prisms supports precise depth perception and coordinate localization of ocular abnormalities. Besides that, ultrasonic waves are employed to cross-check and refine these distance measurements between each tissue layer, enhancing the accuracy of 3D reconstruction. This rigorous scanning methodology provides a diagnostic-grade platform for continuous ocular monitoring.
By changing their focal length, these glasses eliminate the need for multiple pairs of glasses, and by measuring ultraviolet intensity independently of visible light, they minimize the damage caused by ultraviolet radiation in low-light environments. The ability to detect sudden motor, balance, and hemodynamic disorders such as stroke and ocular stroke is another advantage. Furthermore, collecting data for the diagnosis and treatment of optic nerve problems and other optical problems of the eye, or for measuring reaction speed to environmental stimuli at different times, is a further advantage.
The ability to learn from data and use it to shape the response to a stimulus is another advantage: by repeatedly increasing and decreasing the light while measuring light intensity and pupil diameter, the degree of color change of the electrochromic gel is personalized for the wearer across other environments. The same applies to the degree of focal-length change. The ability to communicate with cars, mobile phones, and brain chips is a further advantage of this invention, which helps to drastically reduce the risks of fatigue, stroke, or loss of consciousness and keeps equipment performance within safe limits.
Fig. 1: Depicting a general perspective of the implemented multimodal smart eyeglasses, with its main parts identified.
Fig. 2: Demonstrating more details of the claimed multimodal eyewear from three different points of view. The top representation shows the top view, the middle schematic shows the front view, and the bottom representation demonstrates the back view of the claimed multimodal glasses from the inner section.
Fig. 3: Illustrating a cut and the location of the schematic section, which was conducted from the inside area of the frame.
Fig. 4: Showing a schematic cut of the claimed multimodal glasses and the location of the section, which was applied by passing through the lens.
Fig. 5: Representing a half-cut of the referred glasses, in which removal of the handle allows a proper magnification.
Fig. 6: Illustrating two different schematic representations: one relates to a lens with a curved back surface, shown on the right, and the other to a lens with a flat back surface, shown on the left.
Fig. 7: Depicting a clearer representation of the lens by removing the inner part of the frame and the ultrasonic vibrator.
Fig. 8: Demonstrating two modes of light passing through the lens. Above, with the lens and the electrochromic gel inside it functioning, the amount of light passing through is reduced; below, without the gel functioning, the incoming light completely reaches the surface of the eye.
Fig. 9: Illustrating the alteration in focal length in two 2D schematics. In the upper representation, the distance between the object, the lens, and the eye is large, and the lens has increased in thickness to make the image appear clearer in the eye; in the lower image, a close object is being viewed and the thickness of the lens has decreased.
Fig. 10: Displaying the eyeball and the surface of the eyelashes and eyelids to show the reason for the placement of the prisms in the lower half of the frame.
Fig. 11: Representing the path and image reception from the installed prisms, on the left, and the paths of the coherent light from the source and then their return, on the right.
Fig. 12: Depicting the optical/electronic processor, along with other parts of the electronic module.
Fig. 13: Showing the connection of the claimed multimodal intelligent glasses with the vehicle processor, on the top, and the connection of the glasses with the brain chip, on the bottom of the figure.
Fig. 14: Illustrating the method of sending and receiving ultrasonic waves to the eye surfaces schematically, on the top, and the penetration rate of each wavelength, on the bottom of the represented figure.
Fig. 15: Depicting the algorithm for the operation of the gyroscope sensor.
Fig. 16: Demonstrating the proposed circuit for building a quantum processor for simultaneous operation by electronic and optical currents.
Fig. 1: This figure depicts a general perspective view of the implemented multimodal smart eyeglasses system from the lateral direction, in alignment with three coordinate axes, and is represented in an isometric shape with hidden lines removed. The figure illustrates the complete and assembled view of the glasses, detailing the spatial configuration and arrangement of key integrated modules. The scale of the drawn figure is 1 to 1.25 real scale, and all dimensions are represented in centimeters. Referred components include the intelligent eyeglass system (1), the hard frame (2), the adaptive multilayer lens (3), the hollow shaft (4), the fixed hinge (5), the elastic string surrounding the lens (6), the micropump for electrochromic gel modulation (7), the movable hinge (8), the embedded battery (9), the ultraviolet radiation sensor (10), the visible light detection photocell (11), the charging port (12), and the structural arms (13).
Fig. 2: This figure demonstrates three different schematic representations of the claimed multimodal eyewear from the lateral, frontal, and upper directions, aligned with the three coordinate axes, and shown in a 2D shape with hidden lines removed. These orthographic views collectively present a comprehensive spatial orientation of the structural and functional elements of the eyewear from external perspectives. The scale of the drawn figures is 1 to 1.25 real scale, and all dimensions are given in centimeters. Referred components include the screw assembly securing the internal modules (14), the ultrasonic actuator frame (15), the piezo oscillator (16), the integrated optical prisms (17), the three-point micro-camera (18), and the infrared transceiver unit (19).
Fig. 3: This figure presents two different schematic views of the claimed multimodal smart eyewear, shown from the frontal and side directions, aligned with the X and Z axes, rendered in a 2D shape with hidden lines removed. These views illustrate the precise location and cross-sectional alignment of the schematic section conducted through the inner region of the eyewear frame. The visual representations are intended to support the structural and functional interpretation of the interior arrangement of the lens assembly. The scale of the drawn figures is 1 to 1 real scale, and the dimensions shown are in centimeters.
Fig. 4: This figure presents two different schematic views of the claimed multimodal smart eyewear, shown from the frontal and lateral directions, aligned with the X and Z axes, rendered in a 2D shape with hidden lines removed. This figure specifically illustrates a schematic cut passing directly through the lens, capturing the internal structural alignment and functional layering of the adaptive lens mechanism and electrochromic gel system. This contrasts with Fig. 3, which depicts a section through the inner area of the frame. The scale of the drawn figures is 1 to 1 real scale, and the dimensions shown are in centimeters. Referred components include the inner hard lens surface (20), the electrochromic gel (21), and the outer hard lens surface (22).
Fig. 5: This figure represents two various sectional cuts of the claimed multimodal smart eyewear, shown from the frontal and lateral directions, aligned with the X and Z coordinate axes, in a 2D shape with hidden lines removed. The schematic cuts reveal internal structural details within the lens area, particularly emphasizing the arrangement of the outer and inner hard lens surfaces and the electrochromic gel chamber situated between them. The scale of the drawn figures is 1.25 to 1 real scale, and the dimensions shown are in centimeters.
Fig. 6: This figure presents two various schematic representations of the lens design configuration of the claimed multimodal smart glasses, observed from the lateral direction, aligned with the X and Z axes of the coordinate system, and displayed in a 2D shape with hidden lines visible to reveal internal contours of the lens structure. The left schematic illustrates the lens with a flat back surface, while the right schematic shows a lens with a curved (concave) back surface—both contributing to adjustable focal performance. The scale of the drawn figures is 4 to 1 real scale, and the dimensions are provided in centimeters. Referred component includes the concave inner lens layer (23).
Fig. 7: This figure demonstrates two different schematics focusing on the internal lens architecture of the claimed multimodal smart eyewear viewed from the lateral direction, aligned with the X and Z axes of the coordinate system, and depicted in a 2D shape with hidden lines visible to showcase internal structural detail. In this view, the inner part of the frame and the ultrasonic vibrator have been removed to enhance the visibility of the lens configuration and interlayer spacing. This representation assists in understanding the placement and interaction of the electrochromic gel between lens layers. The scale of the drawn figures is 4 to 1 real scale, and the dimensions are provided in centimeters.
Fig. 8: This figure displays two different schematic representations of the claimed multimodal smart glasses, specifically illustrating the mechanism of light modulation through the lens structure. Both views are presented from the lateral direction, aligned with the X and Z axes of the coordinate system, in a 2D shape, with the upper schematic shown in hidden lines visible format and the lower schematic in hidden lines removed format to distinguish the functional states of the electrochromic gel. The upper schematic demonstrates the operational mode where the electrochromic gel (21) is actively modulating light, thus reducing the intensity of incoming rays (24), while the lower schematic shows the passive mode, where light fully passes through the lens to the surface of the eye (25). These drawings help clarify the bio-adaptive mechanism for controlling ocular light exposure. The scale of the figures is 1.5 to 1 real scale, and dimensions are in centimeters. Referred components include the measured incoming light (24), the corrected transmitted light (25), the optic nerve (26), the natural eye lens (27), and the pupillary region or eyepiece (28).
Fig. 9: This figure provides two different schematic representations of the claimed multimodal smart eyewear, illustrating the adaptive focal length modulation mechanism based on object distance. Both illustrations are presented from the lateral direction, aligned with the X and Z coordinate axes, in a 2D shape, with the upper schematic drawn in hidden lines visible format and the lower schematic in hidden lines removed format. The upper image depicts a scenario in which the observed object (30) is located at a greater distance, prompting the lens to increase in thickness, thereby modifying the focal length so that a clear image is formed in the eye (32). In contrast, the lower image shows the system in a near-object scenario, where the lens thickness decreases, facilitating near vision without visual distortion. These schematics demonstrate how the lens dynamically alters its geometry to simulate the natural accommodation function of the human eye. The drawings are scaled at 1.5 to 1 real scale, and all dimensions are provided in centimeters. Referred components include the distant or close object being observed (30), the upper eyelash structure (31), and the eye (32).
Fig. 10: This figure presents a detailed representation of the anatomical eye region and the influence of lens and eyeglass placement on the field of view and signal transmission. It is depicted from the lateral direction, aligned with the X and Z axes of the coordinate system, in a 2D shape, incorporating both hidden lines visible and hidden lines removed formats for enhanced structural clarity. This figure illustrates how the upper eyelid and eyelashes can obstruct the upper portion of the frame, thereby justifying the strategic positioning of visual prisms and sensors in the lower section of the eyeglass frame. The schematic offers insight into biological interaction with the optical components, which ensures unimpeded signal redirection, image collection, and light pathway management. The drawing is rendered at a 3 to 1 real scale, and the dimensions are shown in centimeters.
Fig. 11: This figure illustrates the optical path and image acquisition system of the claimed multimodal smart eyeglasses, with a focus on both the prism-based image redirection and the coherent light transmission and return mechanisms. It is depicted from the frontal direction, aligned with the X and Z axes of the coordinate system, in a 2D shape with hidden lines removed to clearly demonstrate internal optical routing. The left portion of the figure shows the image reception paths from the installed prisms (33), while the right portion presents the coherent light emission and return paths (34) used for wave-based diagnostic imaging and spatial mapping. This setup enables multipoint detection and accurate triangulation without requiring bulky sensor arrays. The figure is drawn at a 3 to 1 real scale, and all dimensions are shown in centimeters. Referred components include image transmission paths (33) and return wave paths (34).
Fig. 12: This figure depicts a detailed schematic of the optical/electronic processor circuit integrated within the claimed multimodal smart eyeglasses, viewed from the frontal direction, aligned with the X and Z axes of the coordinate system. It illustrates the internal architecture of the main processing module, encompassing both optical and electronic subsystems, which coordinate data from sensors, cameras, and external communication modules. The schematic includes optical fiber ports, wireless communication interfaces, signal amplification elements, and power distribution components essential for the real-time operation of adaptive vision and health monitoring. The figure is illustrated in a 2D shape with hidden lines removed for maximum clarity of the circuit layout. The drawing is rendered at a 3 to 1 real scale, and all dimensions are provided in centimeters. Referred components include the main processor module (35), quantum microcontroller (36), wireless communication module (37), wireless interface port (38), optical fiber interface port (39), main signal output port (40), micro-camera connector port (41), photocell and UV sensor connector port (42), battery and power port (43), optical fiber connectors (44), external operational amplifier (45), ultrasonic vibrator output connector (46), photocell and UV sensor connector (47), and micro-camera connector (48).
Fig. 13: This figure demonstrates two schematics of various applications for the claimed multimodal smart eyewear, from lateral and upper perspectives, aligned with the three axes of the coordinate system, represented in a 3D shape, and shown in a combination of hidden lines visible and hidden lines removed. The scale of the drawn figures is 1 to 3 real scale, and the dimensions shown are in centimeters. Referred components include the connected vehicle (49), the transmitted wave to the vehicle (50), the brain chip or connected neuro-interface (51), the vehicle communication dongle (52), the vehicle’s ECU or processor (53), and the returned waves from the brain chip (54).
Fig. 14: This figure illustrates two diverse schematic representations of the claimed multimodal smart eyewear’s ultrasonic imaging functionality, both from a lateral point of view, aligned with the X and Z axes of the coordinate system, and depicted in a 2D shape with hidden lines removed. The upper schematic, drawn at a 5 to 1 real scale, demonstrates the transmission of ultrasonic waves (55) and the corresponding ultrasonic return waves (56) after interaction with ocular tissue. The lower schematic, drawn at a 7 to 1 real scale, represents the penetration depth of various ultrasonic wavelengths within the eye structure, showing long wavelength ultrasound for deep tissue penetration (57), medium wavelength ultrasound for sub-plane scanning (58), and short wavelength ultrasound for surface scanning (59). Referred components include transmitted ultrasonic waves (55), ultrasonic return waves (56), long wavelength ultrasound penetrating deep (57), medium wavelength ultrasound for sub-plane scanning (58), and short wavelength ultrasound for surface scanning (59).
Fig. 15: This figure illustrates a schematic representation of the gyroscope-based emergency analysis and response algorithm embedded within the claimed multimodal smart eyewear. The diagram outlines the decision-making logic triggered by anomalous gyroscopic acceleration data, including continuous monitoring, user response verification, and escalation to emergency protocols if no feedback is received. The reported parameters are acceleration data monitoring (A), maximum allowable acceleration threshold (D), response timeout logic, and escalation pathway to mobile communication.
Fig. 16: This figure presents a detailed schematic representation of the quantum processor circuit integrated into the claimed multimodal smart eyewear. The illustration is rendered from a lateral point of view, in alignment with the X and Z axes of the coordinate system, and shown in a 2D shape with hidden lines removed. The schematic includes all critical electrical pathways and protective mechanisms enabling the processor’s hybrid optical-electronic functionality. Referred components include: current gain-increasing transistors (60), impedance-matching resistors (61), voltage stabilizer (62), LED emitters (63), phototransistors (64), current-limiting resistors (65), quantum microcontroller processor (66), and the gyroscope sensor system (67).
Examples
To manufacture these glasses, a processor with the possibility of direct connection to optical fiber must be designed, which can be implemented in the microelectronics industry. This quantum processor enables the manufacture of the glasses in the optics industry. In addition to the electronic module, manufacturing the multilayer lens is another challenge, which lens manufacturers can overcome with minor changes to existing processes. Manufacturing the micropump, which belongs to the field of mechatronics, and manufacturing the frame are both feasible with current production methods. For the infrared emitter, because the lens, cornea, and retina are vulnerable to unwanted temperature increases and protein denaturation, the intensity and duration of radiation are kept as short as possible.
In the Physiological Monitoring & Hemodynamic Abnormality Diagnostics domain, one of the critical applications includes real-time monitoring of the pupil diameter and microvascular changes in the sclera and retina. For instance, during a syncopal event or pre-stroke condition, the quantum processor can analyze irregular pupillary behavior or hemodynamic inconsistencies in ocular blood vessels using reflected infrared and ultrasonic signals. As an example, during fatigue-induced ocular drift or hypotensive events, the system alerts based on deviation from baseline waveform frequencies captured through ultrasonic scanning.
In the Optical & Light Management section, controlling focal length is achieved by deforming the lens structure using the electrochromic gel, activated via internal micropumps. This mechanism reacts to ambient lighting conditions, as well as ocular feedback (e.g., eyelid squinting under bright conditions). For instance, in environments with alternating light patterns such as welding facilities or surgical operation rooms, the system responds within milliseconds to prevent retinal strain or overexposure, dynamically modifying lens curvature and opacity.
The infrared penetration depth will be between 1.2 and 3 micrometers, and the radiation time is between 0.25 and 10 seconds. The important point is that radiation longer than 3 seconds is not needed, which allows power to be increased without damaging the eyes. In manufacturing these glasses, a very short-range ptychography algorithm must also be considered: at large distances, such as in observing planets, distance and shape are determined by measuring the speed of the return signal, whereas in eye scanning the shape and depth of an object are measured by sending short pulses and analyzing the change in the return phase. The ultrasonic scanning method requires the highest practical ultrasound frequency, and piezoelectric sensors in the range of 5 to 7 MHz—comparable to those in echography devices—perform this task.
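The phase-based measurement mentioned here resolves displacements far below one wavelength. A minimal sketch, again assuming a 1540 m/s tissue sound speed:

```python
import math

def displacement_from_phase(delta_phi_rad, f_hz, c=1540.0):
    """Surface displacement from a round-trip phase change:
    delta_d = delta_phi * wavelength / (4 * pi)."""
    wavelength = c / f_hz
    return delta_phi_rad * wavelength / (4.0 * math.pi)

# At 7 MHz (wavelength ~220 um in tissue), a 0.1 rad phase shift
# corresponds to a sub-2-micrometer displacement.
print(f"{displacement_from_phase(0.1, 7e6) * 1e6:.2f} um")  # -> 1.75 um
```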
In terms of Vision & Imaging, the embedded three-point micro-cameras and infrared emitters work together with the prism system to reconstruct high-resolution 3D models of the internal eye structure. For example, early-stage retinal detachment can be detected through minor displacements visualized using interferometric fringe pattern variations, analyzed with Fourier transform models and ptychographic phase reconstruction. This technique mimics satellite-based topography used in telescopic imaging, adapted here to millimeter and micrometer scales in biomedical contexts.
The Audio & Ultrasonic Techniques integrated in the glasses offer multipurpose functionality. Firstly, they provide internal ocular layer scanning using varying-frequency ultrasonic signals. Secondly, external environmental ultrasonic mapping (via echolocation) helps detect surrounding objects and alert visually impaired individuals. For example, while walking through a crowded or unfamiliar environment, the glasses emit low-amplitude ultrasonic pulses, and their return is interpreted to construct a proximity map, alerting the user with gentle haptic or auditory cues. Similarly, in case of a suspected fall (e.g., sudden gyroscopic spike), the onboard piezoelectric sensors verify impact angle and force through bone-conduction echo variance.
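A sketch of the echolocation-based proximity mapping, assuming airborne pulses travelling at roughly 343 m/s and a hypothetical warning radius; the haptic or auditory output is abstracted away.

```python
def proximity_alerts(echo_times_s, bearings_deg, c_air=343.0, warn_m=1.5):
    """Convert airborne ultrasonic echo delays into per-bearing distances
    and flag anything inside the warning radius."""
    alerts = []
    for t, bearing in zip(echo_times_s, bearings_deg):
        dist = c_air * t / 2.0            # round trip in air
        if dist < warn_m:
            alerts.append((bearing, round(dist, 2)))
    return alerts

# An echo after 5 ms at bearing 30 deg -> obstacle at ~0.86 m.
print(proximity_alerts([0.005, 0.02], [30, -45]))  # -> [(30, 0.86)]
```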
In the domain of Electronics & Embedded Systems, a quantum-class microcontroller ensures computational processing with low latency and minimal power draw. Advanced cooling and energy optimization allow the device to operate under heavy computational load, such as real-time pattern recognition for emergency diagnostics or fatigue evaluation. For instance, this includes activating fallback protocols when overcurrent is detected, such as redirecting power loads via alternate circuit paths embedded in the frame arms and engaging current-limiting resistors around sensitive transistors.
From the standpoint of Communication & External Integration, the smart glasses utilize a dedicated emergency algorithmic protocol embedded within the processor’s firmware. In case of potential stroke or blackout, based on data from the gyroscope and micro-camera, the device immediately sends alerts to a registered smartphone or connected car interface (via Bluetooth or optical communication). A practical example is when the driver exhibits ocular fatigue patterns and delayed blink response; the glasses can command the car’s ECU to reduce speed, issue auditory alerts, or automatically signal emergency lights, facilitating a safe stop.
In terms of Artificial Intelligence & Learning, the smart glasses feature adaptive personalization through continuous real-time machine learning algorithms. For example, the system learns and models the pupil’s adaptive behavior over days or weeks in different lighting conditions and emotional states. Using predictive modeling and feedback-based adjustments, it becomes possible to pre-adjust lens thickness and tint based on historical data and live contextual cues, such as time of day, location (e.g., indoors vs. outdoors), or user’s routine. AI modules also track eye accommodation patterns, differentiating between voluntary squinting and pathological myopia progression, enabling early-stage correction or optometric referrals.
Each of these domains contributes to the holistic function of the smart glasses, ensuring accurate diagnostics, visual clarity, protective adaptability, and real-time responsiveness—all while maintaining feasibility within current industrial manufacturing limits. These examples demonstrate how complex interdisciplinary techniques merge into a compact, ergonomic, and life-saving wearable medical technology.
The applications of this invention can be categorized into several cases. People with image-accommodation difficulties, such as presbyopia, can overcome their problem by using these glasses. People at risk of loss of consciousness, such as drivers, can use these glasses and connect them to the car, so that fatigue and drowsiness data are transferred to the car's electronic processors and corrections or restrictions are applied to the driving style. These glasses are also useful for people who are exposed to high acceleration with a risk of loss of consciousness, such as pilots, or who need to change viewing distance quickly. Another application is for people at risk of exposure to high ultraviolet radiation, such as during hunting or in excessively bright environments. People with a history of certain eye diseases can also wear these glasses for a short time to record data, so that the problem can be better diagnosed by analyzing the recorded data. In one application, these glasses serve military personnel, guards, or nature hikers who need near and far vision without using a camera and in environments with different light intensities. Finally, another embodiment of the claimed invention's applications in the field of monitoring and diagnosis is the detection of various sudden motor, balance, and hemodynamic disorders such as stroke, ocular stroke, seizure, and fainting.

Claims (36)

  1. The design and implementation of a Multimodal Smart Eyewear system for Adaptive Vision, Predictive Ocular and Hemodynamic Monitoring, and Emergency Response is claimed, comprising the following components:
    • A smart eyewear frame with integrated arms and movable hinges designed to house internal wiring, tubing, and functional elements
    • A pair of multilayer adaptive lenses, each consisting of hard outer surfaces and either a flat or concave inner surface, dynamically modulated by electrochromic gel and surrounded by an elastic viscoelastic structure enabling lens deformation for adaptive focal length control
    • A dual-sensor optical monitoring system comprising a visible light intensity detector and an ultraviolet radiation detector, configured for independent light environment analysis and automatic visual response
    • A consolidated electronic architecture embedded within the eyewear, including microcontrollers, piezoelectric and ultrasonic transducers, gyroscopes, infrared transceivers, image sensors, optical ports, wireless communication modules, and an emergency detection and response system
    • A rechargeable battery unit and power supply interface integrated within the arms of the eyewear for sustained energy support
    • A tri-prism wave redirection module and stereo micro-camera system configured for multipath imaging, 3D reconstruction, and ocular surface scanning
    • An optical-electronic processor module configured to receive input from biometric sensors, optical channels, and external interfaces for real-time data processing and emergency intervention
  2. According to claim 1, the multilayer adaptive lenses comprise a structural configuration of at least two rigid optical surfaces enclosing a chamber filled with electrochromic gel. The gel volume is actively modulated using a microfluidic actuator to alter the optical focal length of the lens in response to visual demands. The outer surfaces retain their mechanical rigidity to provide stable refraction, while internal curvature is dynamically adjusted via pressure-controlled deformation.
  3. According to claim 2, the adaptive lenses are surrounded by a viscoelastic elastic ring that functions as a frequency-sensitive interface. Under low-frequency biomechanical signals, such as slow eye movements or changes in pupil diameter, the ring allows flexible deformation. Under high-frequency stimuli, such as ultrasonic piezoelectric vibration, the ring stiffens to maintain structural integrity during dynamic conditions, enabling stable visual performance during motion or tremors.
  4. According to claim 1, the system includes a dual-sensor optical monitoring module comprising a visible light intensity detector and an ultraviolet radiation sensor, configured to operate independently. The visible light sensor provides real-time luminance data, enabling the electrochromic gel to adjust lens opacity and reduce light-induced visual strain, while the ultraviolet detector triggers safety protocols to block harmful UV radiation and prevent retinal damage, even under dim visible lighting.
  5. According to claim 4, the ultraviolet sensor operates autonomously and prioritizes protective responses by overriding visual optimization settings. In cases where ultraviolet exposure is high but ambient visible light is low — such as in welding environments — the sensor signals immediate darkening of the lens and restricts incoming light regardless of user preference, safeguarding the retina and optic nerve from UV-induced injury.
  6. According to claim 1, the eyewear includes an integrated electronic architecture comprising a dedicated quantum microcontroller, optical-electronic processor, signal amplifiers, and current stabilizers. These components coordinate the operation of biometric sensing modules, visual modulation systems, wave-based imaging subsystems, wireless communication interfaces, and emergency alert mechanisms in real time.
  7. According to claim 6, the processor is configured with optical fiber ports for high-speed infrared and light-based signal exchange, enabling real-time image capture, wave response mapping, and signal processing without the need for external emitters or additional sensors. This design reduces system complexity, energy consumption, and structural size.
  8. According to claim 6, the electronic system is further equipped with gyroscopic sensors and accelerometers, integrated with an emergency detection algorithm that continuously monitors user motion for signs of abnormal acceleration, falls, or seizures. Upon detecting irregularities, the system issues a consciousness verification prompt, and in the absence of user input, it transmits location and emergency alerts to connected devices.
  9. According to claim 1, the eyewear includes a wave redirection and 3D imaging system, comprising triangular prisms installed in the lower region of the frame and a three-point micro-camera array. These components enable multi-perspective wave emission and reception using infrared or ultrasonic signals to generate accurate surface topography and internal structure mapping of the eye.
  10. According to claim 9, the prisms are designed to eliminate visual obstruction caused by the upper eyelid and eyelashes, allowing wave transmission and reception from a lower vantage point. They are optimized for high-frequency wave diffraction and triangulation, enabling precise ocular scanning and image reconstruction using reflected signals from the sclera, pupil, and lens surfaces.
  11. A multimodal smart eyewear system for adaptive vision modulation, predictive ocular and hemodynamic monitoring, and real-time emergency response, comprising a combination of biometric, optical, and electronic subsystems integrated into a wearable frame, configured to:
    • Dynamically adjust the focal length of multilayer lenses based on biometric signals from the user's eyes;
    • Modulate incoming light and ultraviolet radiation using electrochromic gel, controlled in real time by photometric and UV sensors;
    • Perform multi-layer scanning of the eye’s anatomical structures using ultrasonic, infrared, and wave diffraction techniques;
    • Analyze ocular motion, pupillary diameter, lens thickness, and corneal behavior using micro-camera feedback and wave-based imaging to predict conditions such as fatigue, presbyopia, and retinal stress;
    • Detect abnormal gyroscopic movement or physiological anomalies to issue autonomous emergency alerts;
    • Communicate wirelessly with external devices such as vehicles, smartphones, and brain-machine interfaces to initiate behavioral or operational modifications in high-risk scenarios;
    • Operate continuously via a quantum microcontroller that processes optical and ultrasonic input, stabilizes circuit power, and coordinates device-wide feedback mechanisms for safe, adaptive, and responsive vision enhancement.
  12. According to claim 11, the system’s adaptive focal modulation function is achieved by varying the inter-laminar distance between two hard lens surfaces through microfluidic displacement of electrochromic gel, enabling the lens to simulate natural accommodation for near and far vision in real time.
  13. According to claim 1, the multilayer adaptive lenses are configured to dynamically regulate their focal distance by altering the spatial thickness between the inner and outer lens layers. The electrochromic gel is injected or withdrawn via a micro-pump system, modulating the refractive index and lens curvature in real-time in response to incoming visual stimuli and detected object distance, without mechanical translation of the entire lens.
  14. According to claim 13, the outer hard surfaces of the lens remain structurally fixed, while the electrochromic gel is expanded or compressed by the microfluidic actuation system, resulting in central thickening or thinning of the lens body. This thickness change is executed without affecting the peripheral zones, thereby controlling light refraction at the central axis and modulating optical focus along the primary line of sight.
  15. According to claim 11, the system continuously evaluates ambient light intensity and UV exposure using an integrated dual-sensor module. Based on sensor input and individual user calibration, the processor adjusts the opacity of the electrochromic gel and provides personalized luminance regulation and UV filtration.
  16. According to claim 1, the embedded light intensity sensor and ultraviolet detection sensor are electronically interfaced with the central processor and configured to independently detect visible and ultraviolet spectral bands. The collected photonic data is used to calculate light thresholds and trigger automatic control over the optical density of the electrochromic gel situated between the multilayer lenses.
  17. According to claim 16, the sensor-derived data is continuously transmitted to the processor, which regulates the degree of opacity or transparency of the electrochromic gel through controlled electrical stimuli, based on predefined threshold levels and real-time light measurements, ensuring precise modulation of incoming light intensity and ultraviolet wave exposure.
  18. According to claim 1, the micro-camera and infrared emitter-receiver units are mounted along the lens frame and configured to collect ocular surface imagery, scan pupil diameter, track eye movement, and map eye surface geometries. The camera modules operate in coordination with prism reflectors to acquire visual data from multiple angles simultaneously.
  19. According to claim 18, the captured visual and infrared signals are transmitted to the internal processor, where diffraction-based wavefront analysis is used to identify the orientation, position, and physiological state of ocular components, enabling continuous biometric data extraction, including the detection of pupil diameter fluctuations and eye positioning coordinates.
  20. According to claim 11, the system detects eye fatigue and visual strain by tracking rapid and prolonged changes in pupillary size, frequency of eyelid closure, saccadic eye movements, and blink duration through micro-camera imaging and AI-assisted pattern recognition.
  21. According to claim 11, the system employs ultrasonic imaging and ptychography for real-time mapping of ocular layers. By transmitting ultrasonic waves at multiple frequencies, the system measures return time, intensity, and phase shift to construct a 3D image of eye structures, enabling early detection of pathologies like retinal detachment, ischemic stroke, and optic nerve compression.
  22. According to claim 1, the ultrasonic wave generation system is configured to emit oscillating pressure waves at variable frequencies. These waves are directed toward the eye and surrounding biological structures, and the reflections are captured by internal receivers. The time delay and return wave characteristics are analyzed to generate real-time measurements of ocular layer displacement, mechanical response, and tissue structure continuity.
  23. According to claim 11, the system features a gyroscope-driven emergency alert protocol that detects abnormal acceleration patterns associated with seizures, syncope, or falls. When thresholds are exceeded, the system initiates a safety verification prompt and, if no response is received, automatically transmits distress signals and location data via a connected smartphone or vehicle ECU.
  24. According to claim 11, the smart eyewear communicates with external systems, including autonomous vehicles, smartphones, and neural interfaces. When ocular fatigue or disorientation is detected, the eyewear instructs vehicles to reduce speed, activates emergency lighting, and triggers contact with emergency services using stored geo-coordinates.
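Claims 23 and 24 describe an escalation path: prompt the wearer, and act only on silence. The sketch below shows that control flow with stub communication links, since the actual smartphone and vehicle-ECU transports, command names, and verification gesture are not specified in the claims.

```python
class StubLink:
    """Stand-in for a paired smartphone or vehicle ECU channel; the real
    transport (Bluetooth, CAN bridge, etc.) is not fixed by the claims."""
    def __init__(self, name):
        self.name = name
    def send(self, command):
        print(f"[{self.name}] {command}")

def escalate(wearer_responded, phone, ecu, gps=(35.6892, 51.3890)):
    """Claim 23/24 escalation: verify consciousness, act only on silence."""
    if wearer_responded:  # e.g. a frame tap within the verification window
        print("wearer confirmed OK; standing down")
        return
    ecu.send("REDUCE_SPEED")      # claim 24: instruct the vehicle to slow
    ecu.send("HAZARD_LIGHTS_ON")  # and activate emergency lighting
    phone.send(f"DISTRESS lat={gps[0]} lon={gps[1]}")  # claim 23: distress + location

escalate(wearer_responded=False,
         phone=StubLink("smartphone"), ecu=StubLink("vehicle ECU"))
```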
  25. According to claim 11, the system incorporates biometric learning algorithms that adjust its visual response parameters over time by evaluating the user’s typical pupillary responses, light sensitivity, preferred focal ranges, and environmental adaptation history. These personalized profiles ensure that visual and protective behaviors are uniquely optimized for each individual.
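One plausible realization of claim 25's biometric learning is an exponential moving average over each stored preference, so that recent behaviour gradually reshapes the profile. The smoothing factor and starting baselines below are assumptions for illustration.

```python
class AdaptiveProfile:
    """Per-user baseline as in claim 25: each new observation nudges the
    stored preference toward the wearer's actual behaviour."""
    def __init__(self, alpha=0.05):  # alpha: illustrative smoothing factor
        self.alpha = alpha
        self.baseline = {"pupil_mm": 3.5, "comfort_lux": 1500.0, "focal_m": 1.0}

    def update(self, key, observation):
        old = self.baseline[key]
        self.baseline[key] = (1 - self.alpha) * old + self.alpha * observation
        return self.baseline[key]

profile = AdaptiveProfile()
for lux in (900, 950, 880, 920):  # this wearer keeps choosing dimmer scenes
    profile.update("comfort_lux", lux)
print(f"learned comfort level: {profile.baseline['comfort_lux']:.0f} lx")
```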
26. A processor system embedded within the smart eyeglasses, configured to perform parallel processing of multispectral sensory inputs, comprising a custom-designed quantum microcontroller equipped with direct optical fiber interfaces, eliminating the need for additional emitters or photodetectors.
27. According to claim 26, said processor module further comprises multiple ports for integrated connection to micro-cameras, infrared transceivers, photocells, ultraviolet sensors, and operational amplifiers, and supports internal real-time analog-to-digital conversion.
  28. According to claims 26 and 27, the processor system is configured to drive electrochromic gel modulation based on environmental inputs and user-specific physiological signals, facilitating dynamic lens transmittance and focus adjustment.
  29. According to claim 26, the processor features internal heat-reduction logic and clock-frequency optimization protocols to ensure low thermal output during real-time, high-throughput image and wave signal processing.
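Claims 26 through 29 call for parallel processing of multispectral sensory inputs. A minimal sketch of one concurrent acquisition pass follows, with placeholder channel readers standing in for the camera, infrared, photocell, and ultraviolet ports of claim 27.

```python
import random
from concurrent.futures import ThreadPoolExecutor

CHANNELS = ["micro_camera", "ir_transceiver", "photocell", "uv_sensor"]

def read_channel(name):
    # Placeholder driver: a real port read would return the internally
    # digitised sample described in claim 27
    return name, round(random.uniform(0.0, 1.0), 3)

def acquisition_cycle():
    """One parallel pass over all sensory inputs (claim 26): every channel
    is sampled concurrently, then returned as a single fused frame."""
    with ThreadPoolExecutor(max_workers=len(CHANNELS)) as pool:
        return dict(pool.map(read_channel, CHANNELS))

print(acquisition_cycle())
```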
  30. An integrated wireless communication module configured to interface the eyeglasses with peripheral equipment, including smartphones, vehicle ECU systems, and brain-machine interfaces, enabling bidirectional data exchange for safety intervention and adaptive control.
  31. According to claim 30, the communication module is capable of transmitting emergency alerts based on processor-analyzed biometric, optical, or motion-based abnormalities and includes GPS-based geolocation tagging for emergency response deployment.
32. According to claim 30, said wireless system is configured to interact with external control modules (e.g., vehicle ECUs) to adjust the user's interaction environment in cases of medical fatigue, motion instability, or visual degradation, such as by reducing vehicle speed or activating emergency flashers.
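A sketch of the distress message of claim 31, assuming a JSON payload handed to the paired smartphone for forwarding; the field names and vitals are illustrative rather than a protocol defined by the specification.

```python
import json
import time

def build_distress_packet(event, lat, lon, vitals):
    """Assemble claim 31's alert: the processor-detected anomaly plus a
    GPS geolocation tag for emergency response deployment."""
    return json.dumps({
        "type": "DISTRESS",
        "event": event,  # e.g. "fall", "ocular_fatigue", "motion_instability"
        "timestamp": int(time.time()),
        "location": {"lat": lat, "lon": lon},
        "vitals": vitals,
    })

packet = build_distress_packet("fall", 35.6892, 51.3890,
                               {"pupil_mm": 6.1, "blink_rate_per_min": 4})
print(packet)  # handed to the paired smartphone, which forwards it onward
```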
  33. A fall and shock detection module integrated into the processor system, comprising a gyroscopic sensor array and acceleration monitoring circuit configured to trigger a user alert and verify consciousness, based on continuously monitored real-time body movement data.
  34. According to claim 33, if no response is received within a predefined period after an anomalous acceleration is detected, the processor transmits a distress signal to a paired mobile device, which forwards emergency notifications to pre-defined contacts or medical services.
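Complementing the escalation sketch after claim 24, claims 33 and 34 also require detecting the anomaly itself. A common heuristic, shown below with illustrative thresholds, looks for a near-weightless phase followed by an impact spike in the accelerometer magnitude trace.

```python
G = 9.81  # m/s^2

def detect_fall(acc_magnitudes, dt_s=0.02):
    """Free-fall-then-impact detector over an accelerometer trace; claim 33
    fixes no numeric thresholds, so 0.3 g, 3 g, and 0.5 s are assumptions.
    Returns the sample index of the suspected impact, or None."""
    for i, a in enumerate(acc_magnitudes):
        if a < 0.3 * G:  # near-weightless phase suggests a fall has begun
            horizon = min(i + int(0.5 / dt_s), len(acc_magnitudes))
            for j in range(i + 1, horizon):
                if acc_magnitudes[j] > 3.0 * G:  # impact spike
                    return j
    return None

# Synthetic trace: steady wear, brief free fall, impact, then rest
trace = [G] * 20 + [1.5] * 15 + [35.0] + [G] * 10
hit = detect_fall(trace)
if hit is not None:
    print("impact at sample", hit)  # would start claim 34's response window
else:
    print("no fall detected")
```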
35. A brain-machine interface protocol for bidirectional communication between the eyeglasses and a brain-embedded chip, wherein visual information and focus cues are transmitted to the brain while neural feedback signals are interpreted by the eyeglasses processor to optimize visual presentation.
  36. According to claim 35, the glasses processor retains override capability in instances where brain-derived feedback contradicts environmental safety inputs, such as requests for increased brightness under excessive UV exposure, prioritizing retinal protection over user intention.
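The override rule of claim 36 reduces to a safety-bounded arbitration between neural feedback and environmental limits. In the sketch below, the transmittance cap under high UV is a hypothetical stand-in for the retinal-protection logic.

```python
def arbitrate(requested_brightness, uv_index, uv_safe_limit=3.0):
    """Claim 36: environmental safety outranks brain-derived requests.
    Values are normalised 0..1 lens transmittance; the 0.3 cap is assumed."""
    if uv_index >= uv_safe_limit:
        granted = min(requested_brightness, 0.3)
        return granted, "capped: retinal protection overrides user intention"
    return requested_brightness, "granted"

print(arbitrate(requested_brightness=0.9, uv_index=7.5))
# -> (0.3, 'capped: retinal protection overrides user intention')
```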

Priority Applications (1)

Application Number: PCT/IB2025/053330
Priority Date: 2025-03-30
Filing Date: 2025-03-30
Title: Multimodal smart eyeglasses for adaptive vision, predictive ocular and hemodynamic monitoring, and emergency response

Publications (1)

Publication Number: WO2025163631A1
Publication Date: 2025-08-07

Family ID: 96589684

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140340630A1 (en) * 2013-05-17 2014-11-20 Johnson & Johnson Vision Care, Inc. System and method for a processor controlled ophthalmic lens
TW201544865A (en) * 2014-05-30 2015-12-01 Super Electronics Co Ltd Smart glasses
CN107854288A * 2017-11-01 2018-03-30 Jinan University Ocular disorders monitoring and rehabilitation training glasses based on digital intelligent virtual three-dimensional stereopsis technology

Legal Events

Code 121 (EP): the EPO has been informed by WIPO that EP was designated in this application
Ref document number: 25748579
Country of ref document: EP
Kind code of ref document: A1