WO2024249115A1 - Torch mode setting based on facial recognition - Google Patents
Torch mode setting based on facial recognition
- Publication number
- WO2024249115A1 (PCT/US2024/029880)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- mobile device
- led array
- leds
- illuminate
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H05—ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
- H05B—ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
- H05B45/00—Circuit arrangements for operating light-emitting diodes [LED]
- H05B45/10—Controlling the intensity of the light
- H05B45/12—Controlling the intensity of the light using optical feedback
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B15/02—Illuminating scene
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B15/02—Illuminating scene
- G03B15/03—Combinations of cameras with lighting apparatus; Flash units
- G03B15/05—Combinations of cameras with electronic flash apparatus; Electronic flash units
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B30/00—Camera modules comprising integrated lens units and imaging units, specially adapted for being embedded in other devices, e.g. mobile phones or vehicles
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/74—Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
-
- H—ELECTRICITY
- H05—ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
- H05B—ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
- H05B47/00—Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
- H05B47/10—Controlling the light source
- H05B47/105—Controlling the light source in response to determined parameters
- H05B47/115—Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
- H05B47/125—Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by using cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/56—Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
Definitions
- the present disclosure relates to light emitting diode (LED) arrays.
- embodiments are directed to mobile devices that use torch mode for illumination by the LED arrays.
- torch mode, also called flashlight mode
- While this mode may be useful on a number of occasions, there are also a number of circumstances in which limiting use of the torch mode is desirable. In these circumstances, use of the torch mode may be detrimental and even dangerous.
- FIG. 1 shows an illumination apparatus, in accordance with some examples.
- FIG. 2 illustrates a method of using a torch mode, according to some embodiments.
- FIG. 3 illustrates a segmented LED array, according to some embodiments.
- FIG. 4 illustrates an example of a general device in accordance with some embodiments.
- FIG. 5 illustrates a cross-sectional view of a single-die package architecture, in accordance with some examples.
- FIG. 6 illustrates an example hardware arrangement for implementing the above disclosed subject matter, according to some embodiments.
- FIG. 7 illustrates a top plan view of an example array suitable for implementing embodiments described herein.
- FIG. 8 illustrates an example lighting device, according to some embodiments.
- FIG. 9 illustrates an example lighting system, according to some embodiments.
- an illumination system, apparatus, and method of controlling torch mode are described.
- the various embodiments enable the current flow between one or more specific segments of a monolithic segmented LED array to be adjusted when specific objects are detected, e.g., using facial recognition.
- the IEC/EN 62471 standard provides guidance for evaluating the photobiological safety of illumination systems, which includes luminaires. Specifically, the IEC/EN 62471 standard defines exposure limits, references measurement techniques, and the classification scheme for the evaluation and control of photobiological hazards from all electrically powered incoherent broadband sources of optical radiation, including LEDs (but excluding lasers), in the wavelength range from 200 nm through 3000 nm.
- the IEC/EN 62471 standard considers the biological effects on both the eyes and skin, classifying the light source into several groups including safe, low risk, moderate risk, and high risk, through a risk assessment. The classification is based on the emission limit as well as the permissible exposure time before the hazard is exceeded. To determine the risk group of a source, its spectral irradiance or radiance is measured at a specified distance and subsequently weighted with action spectra and the maximum allowed exposure time, which is compared to different exposure limits. For continuous sources, the exposure time limits for blue light sources are safe (10000 s), low risk (100 s), moderate risk (0.25 s), and high risk (0.0 s).
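The blue-light exposure-time grouping above can be sketched as a small classifier. This is a minimal illustration: only the time thresholds come from the figures quoted above; the function name and returned labels are not from the standard or the patent.

```python
def blue_light_risk_group(max_exposure_s: float) -> str:
    """Classify a continuous blue-light source by the permissible
    exposure time before its hazard limit is exceeded (thresholds
    taken from the IEC/EN 62471 figures quoted above)."""
    if max_exposure_s >= 10_000:
        return "safe"
    if max_exposure_s >= 100:
        return "low risk"
    if max_exposure_s >= 0.25:
        return "moderate risk"
    return "high risk"

# A source that stays within limits for 500 s falls in the low-risk group:
print(blue_light_risk_group(500))  # low risk
```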
- FIG. 1 shows an illumination apparatus 100, in accordance with some examples.
- the illumination apparatus 100 may have other components that are not shown for convenience.
- the illumination apparatus 100 may be, for example, a smart phone or standalone camera.
- the illumination apparatus 100 may include both a light source 110 and a camera 120.
- the camera 120 may capture an image of an individual or parts thereof during an exposure duration of the camera 120, whether or not the scene 104 is illuminated by the light source 110.
- a processor 130 may be used to control various functions of the light source 110 and the camera 120, including whether or not a shutter is open in an opening 108 of a housing of the illumination apparatus 100, driving of the light source 110, and detection by the camera 120.
- the opening 108 may be a single opening as shown in FIG. 1 or may include multiple separate openings.
- the shutter may be a single shutter that covers both the light source 110 and the camera 120 or may include multiple separate shutters that each cover only the light source 110 (or a pixel within the light source) or the camera 120 and are individually controllable by the processor 130.
- a layer of material may be disposed on at least one of the light sources 110 that is electrically controllable between an opaque state and a transmissive state.
- the illumination apparatus 100 may include one or more LED arrays 112.
- Each of the one or more LED arrays 112 may include a plurality of LEDs 114 that may produce light during at least a portion of the exposure duration of the camera 120.
- Each of the one or more LED arrays 112 may contain segmented LEDs 114 in which the LEDs 114 are divided into a grid of light emitting areas and non-light emitting areas (between the LEDs 114).
- Each of the LEDs 114 may be formed from one or more inorganic materials (e.g., binary compounds such as gallium arsenide (GaAs) or gallium nitride (GaN), ternary compounds such as aluminum gallium arsenide (AlGaAs), quaternary compounds such as indium gallium arsenide phosphide (InGaAsP), or other suitable materials), usually either III-V materials (defined by columns of the Periodic Table, such as InGaN) or II-VI materials.
- Each of the LEDs 114 may emit light in the visible spectrum (about 400 nm to about 800 nm) or may also emit light in the infrared spectrum (above about 800 nm).
- one or more other layers such as a phosphor layer may be disposed on each of the one or more LED arrays 112 to convert the light from the LEDs 114 into white (or another color) light.
- LEDs 114 in a particular LED array 112 that emit light in the infrared spectrum may be, for example, interspersed with LEDs 114 that emit light in the visible spectrum, or each type of LED (visible emitter/infrared emitter) may be disposed on different sections of the particular LED array 112.
- each LED array 112 may emit light in only the visible spectrum or only the infrared spectrum; separate (one or more) LED arrays may be used to emit light in the infrared spectrum, with each individual LED array 112, LED 114, and/or LED segment controllable by the processor 130.
- Each of the one or more LED arrays 112 may include anywhere from several individual LEDs 114 to millions of microLEDs 114 that emit light and that may be individually controlled or controlled in groups of pixels (e.g., 5x5 groups of pixels).
- the microLEDs are small (e.g., less than 0.01 mm on a side) and may provide monochromatic or multi-chromatic light, typically red, green, or blue, using inorganic semiconductor material such as that indicated above.
- the light source 110 may include at least one lens 116 and/or other optical elements such as reflectors to reflect emitted light towards an emission surface of the light source 110.
- the lens 116 and/or other optical elements may direct the light emitted by the one or more LED arrays 112 toward the scene 104 as illumination 102.
- the lens 116 and/or other optical elements may be stationary or may be adjustable.
- the camera 120 may sense light at at least the wavelength or wavelengths emitted by the one or more LED arrays 112. Similar to the light source 110, the camera 120 may include optics (e.g., at least one camera lens 122) that are able to collect reflected light 106 of the illumination 102 that is reflected from and/or emitted by the scene 104.
- the camera lens 122 may direct the reflected light 106 onto a multi-pixel sensor 124 (also referred to as a light sensor) to form an image of the scene 104 on the multi-pixel sensor 124. Pixels of the multi-pixel sensor 124 may be calibrated such that the light on each pixel corresponds to illumination by a specific LED (or LEDs) of the light source 110.
- the processor 130 may receive a data signal that represents the image of the scene 104.
- the processor 130 may additionally control and drive the LEDs 114 in the one or more LED arrays 112 via one or more drivers 132.
- the processor 130 may optionally control one or more LEDs 114 in the one or more LED arrays 112 independent of another one or more LEDs 114 in the one or more LED arrays 112, so as to illuminate the scene in a specified manner.
- one or more detectors 126 may be incorporated in the camera 120. In other embodiments, instead of being incorporated in the camera 120, the one or more detectors 126 may be incorporated in one or more different areas, such as the light source 110 or elsewhere close to the camera 120.
- the one or more detectors 126 may include multiple different sensors to sense visible and/or infrared light (e.g., from the scene 104), and may further sense the ambient light and/or variations/flicker in the ambient light in addition to reception of the reflected light from the LEDs 114.
- the multi-pixel sensor 124 of the camera 120 may be of higher resolution than the sensors of the one or more detectors 126 to obtain an image of the scene with a desired resolution.
- the sensors of the one or more detectors 126 may have one or more segments (that are able to sense the same wavelength/range of wavelengths or different wavelengths/ranges of wavelengths), similar to the one or more LED arrays 112. In some embodiments, if multiple detectors are used, one or more of the detectors may detect visible wavelengths and one or more of the detectors may detect infrared wavelengths; like the one or more LED arrays 112, the one or more detectors 126 may be individually controllable by the processor 130.
- In some embodiments, instead of, or in addition to, being provided in the camera 120, one or more of the sensors of the one or more detectors 126 may be provided in the light source 110.
- the light source 110 and the camera 120 may be integrated in a single module, while in other embodiments, the light source 110 and the camera 120 may be separate modules that are disposed on a printed-circuit board (PCB). In other embodiments, the light source 110 and the camera 120 may be attached to different PCBs. In the latter embodiment, multiple openings may be present in the housing, at least one of which may be eliminated with the use of an integrated version of the light source 110 and camera 120.
- the LEDs 114 may be driven in an analog or digital manner, e.g., using a direct current (DC) driver or pulse width modulation (PWM). As shown, one or more drivers 132 may be used to drive the LEDs 114 in the one or more LED arrays 112, as well as other components, such as the actuators. In torch mode, the LEDs 114 may be driven using the DC driver to enable maximum illumination from the one or more LED arrays 112. The LED array 112 may be driven at higher and higher current densities without surpassing the thermal limitation imposed by the generally small built-in volume. The peak illuminance of the LED array 112 when in torch mode can be significantly increased by choosing a smaller LED segment, thereby allowing the torch light to illuminate a farther distance.
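The DC-versus-PWM distinction above can be illustrated by the average current each scheme delivers; DC torch drive is simply the 100%-duty-cycle case. This is a generic sketch, not the patent's driver interface, and the current values used are arbitrary.

```python
def average_current_ma(peak_ma: float, duty_cycle: float) -> float:
    """Average current delivered by a PWM driver; DC drive is the
    duty_cycle == 1.0 special case used for torch mode."""
    if not 0.0 <= duty_cycle <= 1.0:
        raise ValueError("duty cycle must be in [0, 1]")
    return peak_ma * duty_cycle

print(average_current_ma(350.0, 1.0))   # DC torch drive: 350.0
print(average_current_ma(350.0, 0.25))  # PWM dimming: 87.5
```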
- the illumination apparatus 100 may also include an input device 134, for example, a user-activated input device such as a button that is depressed to take a picture.
- the light source 110 and camera 120 may be disposed in a single housing.
- LEDs can be used to form different types of displays, LED matrices and light engines including automotive adaptive headlights, augmented-, virtual-, mix-reality (AR/VR/MR) headsets, smart glasses and displays for mobile phones, smart watches, monitors and TVs.
- the individual LED pixels in these architectures may, as above, have an area of a few square millimeters down to a few square micrometers, depending on the matrix or display size and pixels-per-inch requirements.
- One common approach is to create a monolithic array of LED pixels on an epitaxial wafer and later transfer and hybridize the LED arrays to a backplane to allow individual control of the pixels, as described in more detail below.
- Other embodiments, which may contain some or all of the components shown in FIG. 1 (as well as additional components not shown), include displays, automotive adaptive headlights, augmented-, virtual-, and mixed-reality (AR/VR/MR) headsets, smart glasses, and displays for mobile phones, smart watches, monitors, and TVs.
- the LEDs may be formed by combining n- and p-type semiconductors (e.g., the III-V semiconductors above) on a substrate of sapphire (aluminum oxide, Al2O3) or silicon carbide (SiC), among others.
- various layers are deposited and processed on the substrate during fabrication of the LED.
- the surface of the substrate may be pretreated to anneal, etch, polish, etc. the surface prior to deposition of the various layers.
- the LED light may be used in a torch mode to illuminate the environment to replace the use of a separate flashlight.
- FIG. 2 illustrates an example method of using a torch mode, according to some embodiments. Not all of the operations may be undertaken in the method 200, and/or additional operations may be present. The operations may occur in a different order from that indicated in FIG. 2.
- the torch mode may be activated on a mobile device.
- the torch mode may be activated manually by the user, for example, using an app, a soft button, or a pulldown menu.
- the LEDs, as above, may be disposed in a segmented LED array.
- strobe light functionality, in which some or all of the LEDs are alternately off and on (usually driven at full power)
- an LED light may be continuously powered on to illuminate the scene when recording a video.
- FIG. 3 illustrates a segmented LED array, according to some embodiments.
- the segmented LED array 300 may contain individual LEDs 302 that are electrically isolated by trenches 304.
- the trenches 304 may be partially or completely filled with one or more dielectric layers (e.g., to form a distributed Bragg reflector at the wavelength emitted by the LEDs 302) and/or metal layers that provide optical isolation between pixels of the LED array 300 and increase the light output of the LEDs 302.
- the LEDs 302 may be individually driven to provide illumination, allowing an increase in power provided to each LED 302.
- Each LED 302 may emit light of the same wavelength (in other embodiments, the light emitted by at least some of the LEDs 302 may be different, e.g., R, G, Y).
- the LEDs 302 may be formed from, for example, doped GaN with a multiple quantum well active region.
- the LEDs 302 may emit, for example, blue light, which may be converted to white light using a phosphor layer disposed above each of the LEDs 302. Although a 7x7 segmented LED array 300 is shown in FIG. 3, the segmented LED array 300 may be of any size (e.g., the segmented LED array 300 may contain LEDs of any size, such as microLEDs) and may be independently addressable (drivable) or drivable in predetermined sets of LEDs 302 (e.g., a set of 3x3 LEDs).
- the device may be used to illuminate a scene.
- optics in the device may be used to collimate the light from the LEDs 302 to illuminate the scene.
- Each optic may cover a single LED 302 or a set of the LEDs 302, or a single optic may cover all of the LEDs 302.
- the controller in the mobile device may cause the driver to supply a predetermined amount of current to some or all of the LEDs 302.
- the radiance emitted by the segmented LED array 300 may be above the safety standard limit, which in turn may trigger activation of the camera to screen the field of view of illumination of the driven LEDs 302 for faces (specifically, human faces or eyes).
- if the radiance emitted by the segmented LED array 300 is below the safety standard limit (as determined by the processor), the camera may not be activated and/or the remaining operations shown in FIG. 2 may not be performed (i.e., the LEDs 302 may be driven using the unmodified torch mode current).
- an initial condition is that the processor may determine whether the light from the segmented LED array 300 exceeds a standard safety limit when the mobile device is in the torch mode, and, if not, illuminate the scene without determining whether the light illuminates a face in the scene and without modifying the torch mode current to be supplied to the LEDs 302.
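The safety gate described above can be sketched as a simple decision function. All names and the returned dictionary shape are hypothetical; only the below-limit/above-limit branching comes from the text.

```python
def torch_gate(radiance: float, safety_limit: float) -> dict:
    """Decide whether the camera must screen for faces before the
    unmodified torch current may be used (the gating step of FIG. 2
    as described in the text)."""
    if radiance <= safety_limit:
        # Below the safety limit: no camera activation, no face check.
        return {"screen_for_faces": False, "torch_current": "unmodified"}
    # Above the limit: activate the camera and screen for faces first.
    return {"screen_for_faces": True, "torch_current": "pending screening"}

print(torch_gate(0.5, 1.0))
print(torch_gate(2.0, 1.0))
```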
- the processor may determine whether the device will be illuminating one or more faces when in torch mode. More specifically, one or more cameras in the device may be used to capture images of the scene. Under normal conditions, image capture may be deactivated during torch mode as the device is essentially being used as a flashlight. However, in some embodiments, one or more cameras in the device may be used to take images of the scene prior to and/or during illumination during torch mode.
- the image may be captured using the LEDs 302 of the segmented LED array 300 under the same or reduced lighting conditions (e.g., using white light as above), or may be captured using IR radiation to illuminate the scene (in which case, the segmented LED array 300 may contain LEDs that emit IR or a separate segmented LED array 300 in the device may be used).
- the images may be captured and processed periodically during use in torch mode.
- the images may be processed by a processor of the device.
- the processor may use artificial intelligence (AI)/machine learning (ML) to determine whether one or more of the LEDs 302 is illuminating one or more faces (or in some embodiments, eyes of the one or more faces).
- AI/ML may include both a training mode to train an AI/ML model (using a predetermined set of images of typical faces and/or eyes, for example) and an inference mode to use once the AI/ML model is sufficiently trained. The training may occur after the device has been sold and/or prior to sale of the device using a software upload.
- the AI/ML model may be an artificial neural network (ANN).
- the image(s) may instead be transmitted to an external processing device that uses the AI/ML model. For example, edge and/or cloud devices with which the device is in wired or wireless communication may provide the processing.
- Training can be performed on one or more local or remote processors.
- initial training may be performed remotely using a variety of faces under uniform lighting conditions, and the initial ANN parameters transferred to local processor within the device for updating when new images are obtained in the inference mode.
- initial remote training of the model may produce an initial remotely trained model, which may be followed by local use (and perhaps training) of the initial remotely trained model.
- the initial remote training may be used to obtain an acceptable model accuracy, and the model may subsequently be used locally when the initial accuracy is satisfactory (e.g., above 90%).
- one or more predetermined faces whose characteristics are known may be obtained from a memory associated with the processor.
- the faces may be collected by taking images using the device and extracting the faces or using a preset group of faces.
- the ANN may enter training mode and be trained.
- the training mode may be operated at predetermined intervals in which images of the scene (or faces) are batched and run when processing resources become available.
- the ANN is a neural network in which multiple layers exist: the first (input) layer, intermediate (hidden) layers, and the last (output) layer.
- Each of the layers in the ANN contains nodes (neurons) that process the data in the ANN through a sum and transfer function.
- the prediction accuracy of the ANN depends on the number of nodes in the hidden layers.
- the ANN receives input data, which is processed through the layers, before an output is generated and fed back as training feedback to the ANN for further changes to the parameters of the ANN. Training may be easier than under other training conditions, as the illumination conditions (full driving of the LED array) as well as the environmental parameters are known.
- the environmental parameters may include optics specifics, camera/sensor specifics (e.g., exposure time, dynamic range of exposure, gain factor), noise reduction, and ambient light/flicker in the environment based on information from one or more separate sensors.
- the model may be supplied to the device. That is, the parameters trained for the ANN may be supplied, via a wired or wireless connection, to the memory in the device.
- the local processing resources may use the trained ANN as an inference engine and may or may not continue to train the ANN to further improve the accuracy using new incoming data/images (online training). In other embodiments, the training may instead be performed locally.
- operation 206 may be performed once, prior to initiation of the torch mode (after activation by the user). Alternatively, or in addition, operation 206 may be performed intermittently, e.g., every few seconds. In some embodiments, the frequency at which operation 206 is performed may be adjusted dependent on whether or not a face has been previously detected within a predetermined amount of time. In this case, the frequency of performance of operation 206 may decrease with increasing time since the last time a face has been previously detected. In addition, the frequency may be dependent on other aspects of the scene as determined by the AI/ML model.
- the AI/ML model may determine to decrease the frequency of performance of operation 206.
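The adaptive scheduling of operation 206 described above can be sketched as a back-off policy: check frequently right after a face was seen, and progressively less often as time since the last detection grows. The function name and all constants here are illustrative, not from the patent.

```python
def detection_interval_s(seconds_since_last_face: float,
                         base_interval_s: float = 2.0,
                         max_interval_s: float = 30.0) -> float:
    """Back off the face-detection rate: double the interval for every
    10 s in which no face has been detected, up to a ceiling."""
    doublings = int(seconds_since_last_face // 10)
    return min(base_interval_s * (2 ** doublings), max_interval_s)

print(detection_interval_s(0))    # 2.0  (face seen recently: check often)
print(detection_interval_s(35))   # 16.0
print(detection_interval_s(300))  # 30.0 (capped)
```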
- the processing resources may determine whether the segmented LED array 300 is illuminating a face. If not, the torch mode may continue (or start) to illuminate the scene. If so, the processor may determine at operation 208 specifically which LEDs 302 are going to illuminate a face(s) (or more specifically eyes). That is, the processor may use the AI/ML model to determine whether any one or more of the LEDs 302 is illuminating the face(s).
- the inference mode may eliminate pixels of the camera that are below a predetermined amount of light being reflected (e.g., even if a face is present, it is too far away to be appreciably affected); the LEDs 302 corresponding to these pixels may be eliminated from consideration prior to using the AI/ML model to determine whether the remaining pixels are illuminating a face.
- only driving of (e.g., the current or voltage provided to) the particular segmented LEDs 302a determined during operation 208 may be reduced, as shown in FIG. 3.
- This reduction in driving may be partial (e.g., the driving current for the LED 302a is about 25% of the maximum driving current in torch mode) or full (e.g., the driving current for the LED 302a is about 0).
- the reduction may be predetermined or may be based on characteristics determined by the AI/ML model (such as distance from the array 300 to the face). If multiple LEDs 302a are determined by the AI/ML model to illuminate a face, the reduction for each LED 302a may be the same.
- the reduction may be dependent on distance of a particular LED 302 from a center (or other location) LED 302 determined to illuminate the face. For example, if a 3x3 grid of LEDs 302a is determined by the AI/ML model to illuminate a face, the driving current of the center LED 302a may be reduced to about 0, the driving current of abutting LEDs 302a adjacent to the center LED 302a may be reduced by about 90%, and the driving current of the LEDs 302a adjacent to the abutting LED 302a may be reduced by about 75%. Note that the percentages given here are only one example, and may be adjusted to any percentage as desired.
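The graded reduction in the example above (center LED off, abutting LEDs reduced by about 90%, the next ring by about 75%) can be expressed as a per-LED dimming mask. The Chebyshev-distance metric and the function names are assumptions; only the reduction percentages come from the text.

```python
def dimming_factor(row: int, col: int, face_row: int, face_col: int) -> float:
    """Fraction of the torch current to keep for the LED at (row, col),
    graded by Chebyshev distance from the LED over the detected face:
    distance 0 -> off, 1 -> 90% reduction, 2 -> 75% reduction,
    farther LEDs -> unchanged."""
    d = max(abs(row - face_row), abs(col - face_col))
    return {0: 0.0, 1: 0.10, 2: 0.25}.get(d, 1.0)

def dimming_mask(size: int, face_row: int, face_col: int) -> list:
    """Full per-LED mask for a size x size segmented array."""
    return [[dimming_factor(r, c, face_row, face_col) for c in range(size)]
            for r in range(size)]

# 7x7 array (as in FIG. 3) with a face detected under the LED at (3, 3):
for row in dimming_mask(7, 3, 3):
    print(row)
```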
- the same current used to drive the reduced number of LEDs 302 may be used to drive all of the LEDs 302 in the array 300.
- driving of the LEDs 302 detected to illuminate the face may be partially or fully reduced, while other LEDs 302 of the LED array 300 that were previously not driven (and determined by the AI/ML model to not illuminate the detected face) may instead be driven.
- the AI/ML model may estimate the distance between the array 300 and the detected face.
- the distance may be estimated using angle of arrival or time of flight data of the reflected light, for example.
- the amount of reduction may be based solely on one or more of the above and may be independent of the estimated distance. In other embodiments, in addition, or instead, the amount of reduction may be based on the estimated distance.
- the AI/ML model may calibrate the distance vs luminescence intensity when in torch mode for array 300 and decrease the intensity reduction proportional to the distance (or with distance squared, as the reduction in intensity from a point source).
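The distance-squared scaling mentioned above can be sketched as follows. The `safe_distance_m` reference point and the assumption that irradiance scales linearly with drive current are mine, not the patent's.

```python
def distance_scaled_current(torch_ma: float, distance_m: float,
                            safe_distance_m: float) -> float:
    """Scale the drive current with the square of the distance to the
    detected face (point-source 1/d^2 falloff), so the irradiance at
    the face never exceeds what full torch current would produce at
    safe_distance_m. Assumes irradiance is proportional to current."""
    if distance_m >= safe_distance_m:
        return torch_ma  # far enough away: full torch current
    return torch_ma * (distance_m / safe_distance_m) ** 2

print(distance_scaled_current(350.0, 0.5, 1.0))  # 87.5 (face at half distance)
print(distance_scaled_current(350.0, 2.0, 1.0))  # 350.0
```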
- the device may contain a diffuser that is separate from the LED array 300 or may be a layer formed when fabricating the LED array 300.
- the diffuser may be disposed between the LED array 300 and the shutter or window of the device from which the light of the LED array 300 is emitted. The diffuser may be mechanically moved into place or electrically activated to diffuse light from the LED array 300 in response to at least one of the LEDs 302 being determined by the AI/ML model to illuminate a face.
- the diffuser may be used in addition to, or instead of, adjusting the driving of one or more of the LEDs 302 to adjust the throughput of the light in torch mode. Note that while LED arrays are discussed herein, this may also be applicable to other light emitters, such as a semiconductor laser array. Similarly, a layer with electro- or thermo-optical areas (corresponding to the LEDs 302) may be used to adjust the transparency of each area and thus the light from the corresponding LED 302.
- activation of the torch mode at operation 202 may trigger safety illumination (e.g., low power illumination, IR illumination) initially and subsequently permit high power illumination (LEDs driven using the torch mode current) only after ensuring that no face is present and/or reducing the current to LEDs illuminating a face.
- the embodiments disclosed may also be applied to a strobe mode of the device.
- in strobe mode, some or all of the LEDs of the LED array may be controlled by the processor to rapidly (about once every second or two, or faster) activate (turn on) and deactivate (turn off).
- the LEDs selected may be uniform over time or may change in particular patterns.
- strobe mode illumination using the IR LED array and subsequent detection may occur between strobe flashes.
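One way to realize IR illumination and detection between strobe flashes is a simple event schedule that places an IR capture window in each dark gap. This timing sketch is an assumption, with illustrative period and flash durations; the patent does not specify a scheduling mechanism.

```python
def strobe_schedule(period_s: float, flash_s: float, n_flashes: int) -> list:
    """(event, time_s) tuples: flash the visible LEDs, then open an IR
    capture window in the dark gap before the next flash."""
    events = []
    for i in range(n_flashes):
        t = i * period_s
        events.append(("flash_on", t))
        events.append(("flash_off", t + flash_s))
        events.append(("ir_capture", t + flash_s))  # runs in the dark gap
    return events

for event in strobe_schedule(1.0, 0.1, 2):
    print(event)
```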
- FIG. 4 illustrates an example of a general device in accordance with some embodiments.
- the device 400 may be a mobile device such as a laptop computer (PC), a tablet PC, a smart phone, an augmented reality (AR)/virtual reality (VR) device, or an automotive device, for example.
- Various elements may be provided on the backplane indicated above, while other elements may be local or remote. Examples, as described herein, may include, or may operate on, logic or a number of components, modules, or mechanisms.
- Modules and components are tangible entities (e.g., hardware) capable of performing specified operations and may be configured or arranged in a certain manner. In an example, circuits may be arranged (e.g., internally or with respect to external entities such as other circuits) in a specified manner as a module.
- the whole or part of one or more computer systems may be configured by firmware or software (e.g., instructions, an application portion, or an application) as a module that operates to perform specified operations.
- the software may reside on a machine readable medium.
- the software when executed by the underlying hardware of the module, causes the hardware to perform the specified operations.
- module (and “component”) is understood to encompass a tangible entity, be that an entity that is physically constructed, specifically configured (e.g., hardwired), or temporarily (e.g., transitorily) configured (e.g., programmed) to operate in a specified manner or to perform part or all of any operation described herein.
- each of the modules need not be instantiated at any one moment in time.
- where the modules comprise a general-purpose hardware processor configured using software, the general-purpose hardware processor may be configured as respective different modules at different times.
- Software may accordingly configure a hardware processor, for example, to constitute a particular module at one instance of time and to constitute a different module at a different instance of time.
- the electronic device 400 may include a hardware processor (or equivalently processing circuitry) 402 (e.g., a central processing unit (CPU), a GPU, a hardware processor core, or any combination thereof), a memory 404 (which may include main and static memory), some or all of which may communicate with each other via an interlink (e.g., bus) 408.
- the memory 404 may contain any or all of removable storage and non-removable storage, volatile memory or non-volatile memory.
- the electronic device 400 may further include a display/light source 410 such as the LEDs described above, or a video display, an alphanumeric input device 412 (e.g., a keyboard), and a user interface (UI) navigation device 414 (e.g., a mouse).
- the display/light source 410, input device 412 and UI navigation device 414 may be a touch screen display.
- the electronic device 400 may additionally include a storage device (e.g., drive unit) 416, a signal generation device 418 (e.g., a speaker), a network interface device 420, one or more cameras 428, and one or more sensors 430, such as a global positioning system (GPS) sensor, compass, accelerometer, or other sensor such as those described herein.
- the electronic device 400 may further include an output controller, such as a serial (e.g., universal serial bus (USB), parallel, or other wired or wireless (e.g., infrared (IR), near field communication (NFC), etc.) connection to communicate or control one or more peripheral devices (e.g., a printer, card reader, etc.).
- the storage device 416 may include a non-transitory machine readable medium 422 (hereinafter simply referred to as machine readable medium) on which is stored one or more sets of data structures or instructions 424 (e.g., software) embodying or utilized by any one or more of the techniques or functions described herein.
- the instructions 424 may also reside, completely or at least partially, within the memory 404 and/or within the hardware processor 402 during execution thereof by the electronic device 400.
- while the machine readable medium 422 is illustrated as a single medium, the term "machine readable medium" may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) configured to store the one or more instructions 424.
- machine readable medium may include any medium that is capable of storing, encoding, or carrying instructions for execution by the electronic device 400 and that cause the electronic device 400 to perform any one or more of the techniques of the present disclosure, or that is capable of storing, encoding or carrying data structures used by or associated with such instructions.
- Non-limiting machine-readable medium examples may include solid-state memories, and optical and magnetic media.
- machine-readable media may include: non-volatile memory, such as semiconductor memory devices (e.g., Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM)) and flash memory devices; magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; Random Access Memory (RAM); and CD-ROM and DVD-ROM disks.
- the instructions 424 may further be transmitted or received over a communications network using a transmission medium 426 via the network interface device 420 utilizing any one of a number of wireless local area network (WLAN) transfer protocols or a SPI or CAN bus.
- Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone (POTS) networks, and wireless data networks.
- LAN local area network
- WAN wide area network
- POTS Plain Old Telephone
- Communications over the networks may include one or more different protocols, such as the Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards known as Wi-Fi, the IEEE 802.16 family of standards known as WiMax, the IEEE 802.15.4 family of standards, a Long Term Evolution (LTE) family of standards, a Universal Mobile Telecommunications System (UMTS) family of standards, peer-to-peer (P2P) networks, and next generation (NG)/6th generation (6G) standards, among others.
- the network interface device 420 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the transmission medium 426.
- circuitry refers to, is part of, or includes hardware components such as an electronic circuit, a logic circuit, a processor (shared, dedicated, or group) and/or memory (shared, dedicated, or group), an Application Specific Integrated Circuit (ASIC), a field-programmable device (FPD) (e.g., a field-programmable gate array (FPGA), a programmable logic device (PLD), a complex PLD (CPLD), a high-capacity PLD (HCPLD), a structured ASIC, or a programmable SoC), digital signal processors (DSPs), etc., that are configured to provide the described functionality.
- the circuitry may execute one or more software or firmware programs to provide at least some of the described functionality.
- the term “circuitry” may also refer to a combination of one or more hardware elements (or a combination of circuits used in an electrical or electronic system) with the program code used to carry out the functionality of that program code. In these embodiments, the combination of hardware elements and program code may be referred to as a particular type of circuitry.
- the term "processor circuitry" or "processor" as used herein thus refers to, is part of, or includes circuitry capable of sequentially and automatically carrying out a sequence of arithmetic or logical operations, or recording, storing, and/or transferring digital data.
- the term "processor circuitry" or "processor" may refer to one or more application processors, one or more baseband processors, a physical central processing unit (CPU), a single- or multi-core processor, and/or any other device capable of executing or otherwise operating computer-executable instructions, such as program code, software modules, and/or functional processes.
- the camera 428 may sense light of at least the wavelength or wavelengths emitted by the LEDs.
- the camera 428 may include optics (e.g., at least one camera lens) that are able to collect reflected light of illumination that is reflected from and/or emitted by an illuminated region.
- the camera lens may direct the reflected light onto a multi-pixel sensor (also referred to as a light sensor) to form an image on the multi-pixel sensor.
- Multiple cameras may be used, including, for example, low power motion-triggered cameras.
- the processor 402 may control and drive the LEDs via one or more drivers.
- the processor 402 may optionally control one or more LEDs in LED arrays independent of another one or more LEDs in the LED arrays, so as to illuminate an area in a specified manner.
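Independent per-LED control to illuminate an area in a specified manner might look like the following sketch, assuming (purely for illustration) a linear relationship between drive current and output, which real LED drivers only approximate.

```python
def drive_pattern(target, max_ma=350):
    """Map a normalized per-LED illuminance target (0.0-1.0) to
    per-LED drive currents in milliamps.

    Assumes a hypothetical linear LED response; each entry of the
    result would be handed to a driver channel for one LED (or one
    independently controllable group of LEDs).
    """
    return [round(level * max_ma) for level in target]
```

For example, a pattern that dims the center of the field while fully driving the edges is just a target list with lower values in the middle.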
- the sensors 430 may be incorporated in the camera 428 and/or the light source 410.
- the sensors 430 may sense visible and/or infrared light, and may further sense the ambient light and/or variations/flicker in the ambient light in addition to reception of the reflected light from the LEDs.
- the sensors may have one or more segments (that are able to sense the same wavelength/range of wavelengths or different wavelength/range of wavelengths), similar to the LED arrays.
- FIG. 5 illustrates a cross-sectional view of a single-die package architecture, in accordance with some examples.
- the package architecture 500 illustrates only a single LED die 510 for clarity.
- the LED die 510 may contain a semiconductor stack fabricated by combining n-type and p-type semiconductors (e.g., the above III-V semiconductors) on a substrate such as sapphire or silicon carbide (SiC).
- Various layers may be deposited and processed on the substrate during fabrication of the LED as described above.
- the surface of the substrate may be pretreated to anneal, etch, polish, etc. the surface prior to deposition of the various layers.
- the LED die 510 may also contain contacts fabricated on the semiconductor stack to make electrical contact with different layers of the semiconductor stack.
- the LED die 510 may be electrically coupled to a cathode under bump metallization (UBM) (nUBM) 512a and an anode UBM (pUBM) 512b.
- the nUBM 512a and the pUBM 512b may be patterned and formed from a metal, such as copper (Cu), nickel (Ni), gold (Au), silver (Ag), and/or titanium (Ti), for example, which may be deposited on the LED die 510.
- the nUBM 512a and the pUBM 512b may be electrically coupled to a PCB 520 through a patterned tile metallization 514 that is disposed on a tile 522 (also referred to as a submount).
- the electrical connection may be formed by direct contact (e.g., thermocompression bonding) or through a solder and reflow process whereby the solder wets to both metal interfaces and forms a solid joint upon cooling.
- the tile metallization 514 may be formed from a metal, such as Cu, which may be the same as, or different from, the material(s) used to form the nUBM 512a and the pUBM 512b.
- the tile metallization 514 may entirely overlap the nUBM 512a and the pUBM 512b to ensure electrical contact therebetween.
- the tile 522 may be formed from FR4, a ceramic, or aluminum nitride (AlN), for example.
- the tile 522 may provide mechanical support for the LED die 510.
- the tile 522 may be disposed on a Thermal Interface Material (TIM)/electrode layer 524, which may include a metal, such as those above, and may further include thermal epoxy or grease, for example.
- TIM/electrode layer 524 may act as an electrode layer, connecting the tile 522 to a heat sink 526 formed, for example, from Al.
- a light-converting layer 530 containing phosphor particles may be disposed on the LED die 510.
- the light-converting layer 530 may convert light emitted by the LEDs of the LED die 510 to white light, for example.
- the light-converting layer 530 may be a continuous layer or may be segmented to be disposed only on the LEDs.
- a lens 532 and/or other optics may be disposed over the entire LED die 510 as shown. In other embodiments, individual lenses may be disposed over each LED or over sets of LEDs. In some cases, optical efficiency of the illuminance within each segment of the LED die 510 may be affected due to the geometry of the lens 532 and the LED die 510, and thus the relative position of the LEDs of the segment with respect to the lens 532.
- the LEDs and circuitry supporting the LED array can be packaged and include a submount or PCB for powering and controlling light production by the LEDs.
- the PCB supporting the LED array may include electrical vias, heat sinks, ground planes, electrical traces, and flip chip or other mounting systems.
- the submount or PCB may be formed of any suitable material, such as ceramic, silicon, aluminum, etc. If the submount material is conductive, an insulating layer may be formed over the substrate material, and a metal electrode pattern formed over the insulating layer for contact with the micro-LED array.
- the submount can act as a mechanical support, providing an electrical interface between electrodes on the LED array and a power supply, and also provide heat sink functionality.
- LED arrays may include stand-alone applications to provide general illumination (e.g., within or external to a room or vehicle) or to provide specific images.
- the system may be used to provide AR and VR-based applications.
- Visualization systems such as VR and AR systems, are becoming increasingly more common across numerous fields such as entertainment, education, medicine, and business.
- Various types of devices may be used to provide AR/VR to users, including headsets, glasses, and projectors.
- Such an AR/VR system may include components similar to those described above: the microLED array, a display or screen (which may include touchscreen elements), a micro-LED array controller, sensors, and a controller, among others.
- the AR/VR components can be disposed in a single structure, or one or more of the components shown can be mounted separately and connected via wired or wireless communication.
- Power and user data may be provided to the controller.
- the user data input can include information provided by audio instructions, haptic feedback, eye or pupil positioning, or connected keyboard, mouse, or game controller.
- the sensors may include cameras, depth sensors, audio sensors, accelerometers, two or three axis gyroscopes and other types of motion and/or environmental/wearer sensors that provide the user input data.
- control input can include detected touch or taps, gestural input, or control based on headset or display position.
- an estimated position of the AR/VR system relative to an initial position can be determined.
- the controller may control individual micro-LEDs or one or more groups of LEDs to display content (AR/VR and/or non- AR/VR) to the user while controlling other LEDs and sensors used in eye tracking to adjust the content displayed.
- Content display LEDs may be designed to emit light within the visible band (approximately 400 nm to 780 nm) while LEDs used for tracking may be designed to emit light in the IR band (approximately 780 nm to 2,200 nm).
- the tracking LEDs and content LEDs may be simultaneously active.
- the tracking LEDs may be controlled to emit tracking light during a time period that content LEDs are deactivated and are thus not displaying content to the user.
- the AR/VR system can incorporate optics, such as those described above, and/or an AR/VR display, for example to couple light emitted by LED array onto the AR/VR display.
- the AR/VR controller may use data from the sensors to integrate measurement signals received from the accelerometers over time to estimate a velocity vector and integrate the velocity vector over time to determine an estimated position of a reference point for the AR/VR system.
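The double integration described here can be sketched as follows; the 1-D simplification, the fixed sampling step, and rectangular integration are assumptions for illustration (a real AR/VR controller fuses three-axis data and corrects for drift).

```python
def estimate_position(accel_samples, dt):
    """Dead-reckon a 1-D position from acceleration samples.

    Integrates acceleration once to get velocity, then integrates
    velocity to get displacement of the reference point, using
    simple rectangular integration with step dt (seconds).
    Drift accumulates quickly in practice, which is why depth
    sensors or camera views are used to re-anchor the estimate.
    """
    velocity = 0.0
    position = 0.0
    for a in accel_samples:
        velocity += a * dt      # v <- v + a*dt
        position += velocity * dt  # x <- x + v*dt
    return position
```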
- the reference point used to describe the position of the AR/VR system can be based on depth sensor, camera positioning views, or optical field flow.
- the system controller can send images or instructions to the light emitting array controller. Changes or modifications to the images or instructions can also be made by user data input or automated data input.
- a display, in general, can present to a user a view of a scene, such as a three-dimensional scene.
- the user can move within the scene, such as by repositioning the user’s head or by walking.
- the VR system can detect the user’s movement and alter the view of the scene to account for the movement. For example, as a user rotates the user’s head, the system can present views of the scene that vary in view directions to match the user’s gaze. In this manner, the VR system can simulate a user’s presence in the three- dimensional scene.
- a VR system can receive tactile sensory input, such as from wearable position sensors, and can optionally provide tactile feedback to the user.
- the display can incorporate elements from the user’s surroundings into the view of the scene.
- the AR system can add textual captions and/or visual elements to a view of the user’s surroundings.
- a retailer can use an AR system to show a user what a piece of furniture would look like in a room of the user’s home, by incorporating a visualization of the piece of furniture over a captured image of the user’s surroundings.
- the visualization accounts for the user’s motion and alters the visualization of the furniture in a manner consistent with the motion.
- the AR system can position a virtual chair in a room.
- the user can stand in the room on a front side of the virtual chair location to view the front side of the chair.
- the user can move in the room to an area behind the virtual chair location to view a back side of the chair.
- the AR system can add elements to a dynamic view of the user’s surroundings.
- FIG. 6 shows a block diagram of an example of a system, according to some embodiments.
- the system 600 may provide AR/VR functionality using microLEDs.
- the system 600 can include a wearable housing 612, such as a headset or goggles.
- the housing 612 can mechanically support and house the elements detailed below.
- one or more of the elements detailed below can be included in one or more additional housings that can be separate from the wearable housing 612 and couplable to the wearable housing 612 wirelessly and/or via a wired connection.
- a separate housing can reduce the weight of wearable goggles, such as by including batteries, radios, and other elements.
- the housing 612 can include one or more batteries 614, which can electrically power any or all of the elements detailed below.
- the housing 612 can include circuitry that can electrically couple to an external power supply, such as a wall outlet, to recharge the batteries 614.
- the housing 612 can include one or more radios 616 to communicate wirelessly with a server or network via a suitable protocol, such as WiFi.
- the system 600 can include one or more sensors 618, such as optical sensors, audio sensors, tactile sensors, thermal sensors, gyroscopic sensors, time-of-flight sensors, triangulation-based sensors, and others.
- one or more of the sensors can sense a location, a position, and/or an orientation of a user.
- one or more of the sensors 618 can produce a sensor signal in response to the sensed location, position, and/or orientation.
- the sensor signal can include sensor data that corresponds to a sensed location, position, and/or orientation.
- the sensor data can include a depth map of the surroundings.
- one or more of the sensors 618 can capture a real-time video image of the surroundings proximate a user.
- the system 600 can include one or more video generation processors 620.
- the one or more video generation processors 620 can receive scene data that represents a three-dimensional scene, such as a set of position coordinates for objects in the scene or a depth map of the scene. This data may be received from a server and/or a storage medium.
- the one or more video generation processors 620 can receive one or more sensor signals from the one or more sensors 618. In response to the scene data, which represents the surroundings, and at least one sensor signal, which represents the location and/or orientation of the user with respect to the surroundings, the one or more video generation processors 620 can generate at least one video signal that corresponds to a view of the scene.
- the one or more video generation processors 620 can generate two video signals, one for each eye of the user, that represent a view of the scene from a point of view of the left eye and the right eye of the user, respectively. In some examples, the one or more video generation processors 620 can generate more than two video signals and combine the video signals to provide one video signal for both eyes, two video signals for the two eyes, or other combinations.
- the system 600 can include one or more light sources 622 that can provide light for a display of the system 600.
- Suitable light sources 622 can include the microLEDs above, for example.
- the one or more light sources 622 can include light-producing elements having different colors or wavelengths.
- a light source can include a red light-emitting diode that can emit red light, a green light-emitting diode that can emit green light, and a blue light-emitting diode that can emit blue light.
- the red, green, and blue light combine in specified ratios to produce any suitable color that is visually perceptible in a visible portion of the electromagnetic spectrum.
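The combination in specified ratios might be sketched as below. The linear scaling to 8-bit channel values is a deliberate simplification; real color mixing depends on the LEDs' actual chromaticities and on gamma encoding.

```python
def mix_rgb(red, green, blue):
    """Combine normalized per-channel intensities (0.0-1.0) into
    an 8-bit RGB triple.

    Purely illustrative: assumes each channel's perceived
    contribution scales linearly with drive level.
    """
    return tuple(round(c * 255) for c in (red, green, blue))
```

For instance, full red with half green and no blue yields an orange hue.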
- the system 600 can include one or more modulators 624.
- the modulators 624 can be implemented in one of at least two configurations.
- the modulators 624 can include circuitry that can modulate the light sources 622 directly.
- the light sources 622 can include an array of LEDs, and the modulators 624 can directly modulate the electrical power, electrical voltage, and/or electrical current directed to each light-emitting diode in the array to form modulated light.
- the modulation can be performed in an analog manner and/or a digital manner.
- the light sources 622 can include an array of red light-emitting diodes, an array of green light-emitting diodes, and an array of blue light-emitting diodes.
- the modulators 624 can directly modulate the red light-emitting diodes, the green light-emitting diodes, and the blue light-emitting diodes to form the modulated light to produce a specified image.
- the modulators 624 can include a modulation panel, such as a liquid crystal panel.
- the light sources 622 can produce uniform illumination, or nearly uniform illumination, to illuminate the modulation panel.
- the modulation panel can include pixels. Each pixel can selectively attenuate a respective portion of the modulation panel area in response to an electrical modulation signal to form the modulated light.
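Per-pixel attenuation of a uniform backlight, as described above, reduces to an element-wise product; this sketch assumes a normalized attenuation factor per pixel (1.0 fully transmissive, 0.0 fully blocked), which is an idealization of a real liquid crystal panel.

```python
def modulate(backlight, attenuation):
    """Per-pixel output of a modulation panel.

    backlight: the uniform (or nearly uniform) illumination level
        reaching the panel.
    attenuation: per-pixel transmission factors in [0.0, 1.0],
        set by the electrical modulation signal.
    """
    return [backlight * a for a in attenuation]
```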
- the modulators 624 can include multiple modulation panels that can modulate different colors of light.
- the modulators 624 can include a red modulation panel that can attenuate red light from a red light source such as a red light-emitting diode, a green modulation panel that can attenuate green light from a green light source such as a green light-emitting diode, and a blue modulation panel that can attenuate blue light from a blue light source such as a blue light-emitting diode.
- the modulators 624 can receive uniform white light or nearly uniform white light from a white light source, such as a white-light light-emitting diode.
- the modulation panel can include wavelength-selective filters on each pixel of the modulation panel.
- the panel pixels can be arranged in groups (such as groups of three or four), where each group can form a pixel of a color image.
- each group can include a panel pixel with a red color filter, a panel pixel with a green color filter, and a panel pixel with a blue color filter.
- Other suitable configurations can also be used.
- the system 600 can include one or more modulation processors 626, which can receive a video signal, such as from the one or more video generation processors 620, and, in response, can produce an electrical modulation signal.
- the electrical modulation signal can drive the light sources 622.
- the modulators 624 include a modulation panel
- the electrical modulation signal can drive the modulation panel.
- the system 600 can include one or more beam splitters 628 (also known as beam combiners), which can combine light beams of different colors to form a single multi-color beam.
- the system 600 can include one or more wavelength-sensitive (e.g., dichroic) beam splitters 628 that can combine the light of different colors to form a single multi-color beam.
- the system 600 can direct the modulated light toward the eyes of the viewer in one of at least two configurations.
- the system 600 can function as a projector, and can include suitable projection optics 630 that can project the modulated light onto one or more screens 632.
- the screens 632 can be located a suitable distance from an eye of the user.
- the system 600 can optionally include one or more lenses 634 that can locate a virtual image of a screen 632 at a suitable distance from the eye, such as a close-focus distance, such as 500 mm, 750 mm, or another suitable distance.
- the system 600 can include a single screen 632, such that the modulated light can be directed toward both eyes of the user.
- the system 600 can include two screens 632, such that the modulated light from each screen 632 can be directed toward a respective eye of the user.
- the system 600 can include more than two screens 632.
- the system 600 can direct the modulated light directly into one or both eyes of a viewer.
- the projection optics 630 can form an image on a retina of an eye of the user, or an image on each retina of the two eyes of the user.
- the system 600 can include at least a partially transparent display, such that a user can view the user’s surroundings through the display.
- the AR system can produce modulated light that corresponds to the augmentation of the surroundings, rather than the surroundings itself.
- the AR system can direct modulated light, corresponding to the chair but not the rest of the room, toward a screen or toward an eye of a user.
- FIG. 7 illustrates a top plan view of an example array suitable for implementing embodiments described herein.
- the example hybridized device illustrated in FIG. 7 includes an LED die 710 that includes LEDs 712, such as those described herein. Projected patterned light may define images that may include light emitted from the LEDs 712.
- Each LED 712 (or group of LEDs) of the array may correspond to a projector picture element or projector pixel.
- Suitable hybridized devices may include monolithic LED arrays, micro LED arrays, etc.
- Each LED 712 in LED die 710 may be individually addressable. Alternatively, groups or subsets of LEDs 712 may be addressable. In embodiments described herein, each array may comprise micro LEDs.
- Each LED 712 may have a size in the range of micrometers (i.e., between 1 micrometer (µm) and 100 µm). For example, an LED 712 may have dimensions of approximately 40 µm by 40 µm (to within 10 µm by 10 µm) in some embodiments. An LED 712 may have a lateral dimension of less than 100 µm in some embodiments.
- LEDs 712 may be arranged as a matrix comprising one or more rows and one or more columns to define a rectangle. In other embodiments, LEDs 712 may be arranged to define other shapes. An LED die 710 that includes a micro-LED array may encompass thousands or millions of projector pixels or LEDs. For example, an LED die 710 that contains a µLED array may include 5,000 pixels, 20,000 pixels, or more, such as millions of pixels. Each pixel may include an emitter. An LED die 710 that contains the µLED array can support high-density pixels having a lateral dimension less than 150 µm by 150 µm. In some embodiments, a µLED die can have dimensions of about 50 µm in diameter or width. In some embodiments, the height dimension of an array including the LEDs 712, their supporting substrate and electrical traces, and associated micro-optics may be less than 5 millimeters.
- Sub-array 716 may include LEDs 712, each defined by a width w1.
- width w1 can be approximately 100 µm or less (e.g., 40 µm).
- lanes 714 may be defined extending horizontally and vertically to define rows and columns of LEDs 712.
- Lanes 714 between the LEDs 712 may have a width w2.
- the width w2 may be approximately 20 µm or less (e.g., 5 µm).
- the width w2 may be as small as 1 µm.
- the lanes 714 may provide an air gap between adjacent emitters or may contain other material.
- a distance d1 from the center of one LED 712 to the center of an adjacent LED 712 may be approximately 120 µm or less (e.g., 45 µm). It will be understood that the widths and distances provided herein are examples of one of many possible embodiments in which widths and/or other dimensions may vary.
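For a uniform grid like the one described, the center-to-center pitch follows directly from the LED and lane widths, which can be checked against the example values in the text (40 µm LEDs with 5 µm lanes give the quoted 45 µm pitch).

```python
def pitch_um(led_width_um, lane_width_um):
    """Center-to-center pitch d1 of a uniform LED grid:
    one LED width w1 plus one lane width w2."""
    return led_width_um + lane_width_um
```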
- LEDs 712 that are included in the LED die 710 are depicted herein as having a rectangular shape.
- LED die 710 is depicted in FIG. 7 as a symmetric matrix of LEDs 712.
- LED die 710 may be suitable for implementing embodiments described herein, depending on application and design considerations.
- LED die 710 can comprise a linear array of LEDs 712, and in other implementations a rectangular array of LEDs 712.
- the LED die 710 can comprise a symmetric or asymmetric matrix of LEDs 712.
- LED die 710 can comprise an array or matrix defined by a dimension or order that differs from the array dimensions or orders depicted herein.
- the LED die 710 depicted in FIG. 7 may include over 20,000 LEDs 712 in asymmetric or symmetric arrangements in a wide range of array dimensions and orders (e.g., a 200x100 array, a symmetric matrix, or a non-symmetric matrix).
- two or more LED dies 710 can be stacked such that LEDs 712 are arranged to define rows and columns that extend in three spatial directions or dimensions.
- the LED die 710 can itself be a subarray of a larger array (not shown) of LEDs 712.
- LED die 710 may have a surface area of 90 mm² or greater and may require significant power to drive the array. In some applications, this power can be as much as 60 watts or more.
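Using the example figures in the preceding paragraph (a 90 mm² die driven at up to 60 W), the implied power density can be estimated; this is illustrative arithmetic only, not a value stated in the disclosure:

```python
# Power-density estimate from the example figures above: a die area of
# 90 mm^2 driven at up to 60 W. Illustrative arithmetic only.
AREA_MM2 = 90.0
POWER_W = 60.0

power_density_w_per_mm2 = POWER_W / AREA_MM2
print(f"{power_density_w_per_mm2:.2f} W/mm^2")  # 0.67 W/mm^2
```

A density on this order is one reason the submount and PCB described below are expected to provide heat-sink functionality.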
- the LED die 710 may include hundreds, thousands, or even millions of LEDs or emitters arranged within a centimeter-scale area substrate or smaller.
- a micro LED may include an array of individual emitters provided on a substrate, or may be a single silicon wafer or die partially or fully divided into light-emitting segments that form the LEDs 712. In some embodiments, the emitters may have distinct non-white colors. For example, at least four of the emitters may be RGBY groupings of emitters.
- FIG. 8 illustrates an example lighting device 800, according to some embodiments. As above, some of the elements shown in the exemplary lighting device 800 may not be present, while other additional elements may be disposed in the lighting device 800.
- the lighting device 800 may include controller electronics 802, one or more LED arrays 804, and one or more optics 806 contained within a housing 810.
- the controller electronics 802 may include, among others, one or more PCBs to control the LEDs of the one or more LED arrays 804, drivers to drive the LEDs using one or more channels, and WiFi or other communication modules to communicate with a remote controller.
- the controller electronics 802 may be disposed in one or more locations within the lighting device 800 and may be different from that shown in FIG. 8.
- the LEDs and circuitry supporting the LED array can be packaged and include a submount, PCB, and/or CMOS backplane for powering and controlling light production by the LEDs.
- the PCB supporting the LED array may include electrical vias, heat sinks, ground planes, electrical traces, and flip chip or other mounting systems.
- the submount or PCB may be formed of any suitable material, such as ceramic, silicon, aluminum, etc. If the submount material is conductive, an insulating layer may be formed over the substrate material, and a metal electrode pattern formed over the insulating layer for contact with the micro-LED array.
- the submount can act as a mechanical support, providing an electrical interface between electrodes on the LED array and a power supply, and also provide heat sink functionality.
- the number of LED arrays 804 may vary from a single array up to a desired number able to be contained within the housing 810.
- the optics 806 may include lenses, reflective elements, and other devices that permit the light from the LEDs to be directed to a particular individual area.
- the shape of the housing 810 may be different from that shown in FIG. 8.
- FIG. 9 illustrates an example lighting system, according to some embodiments. As above, some of the elements shown in the lighting system 900 may not be present, while other additional elements may be disposed in the lighting system 900.
- the lighting system 900 may include a controller 902 that controls illumination using a pixel array 910 that contains multiple individual pixels 912.
- controller 902 may be disposed on a backplane such as, for example, a complementary metal oxide semiconductor (CMOS) backplane.
- the controller 902 may be coupled to or include one or more processors 904.
- the processor 904 may receive image data (in frames) via an interface and may process the image data to control a generator 906a, for example, controlling analog signals or PWM duty cycles and/or turn-on times for causing the lighting system 900 to produce the images indicated by the image data.
- the generator 906a may be controlled by the processor 904 and may produce driving signals in accordance with the indications.
- the generator 906a may be connected to a driver 906b (such as that described in FIG. 2) to drive the pixel array 910 so that the pixels 912 provide desired intensities of light.
- Each pixel 912 may include one or more LEDs 914.
- the LEDs 914 may be different colors and may be controlled individually or in groups.
- each pixel 912 or LED 914 may include a PWM switch and a current source.
- the pixel 912 may be driven by the driver 906b.
- the signal from the generator 906a may cause the switch to open and close in accordance with the value of the signal.
- the signal corresponding to the intensities of light may cause the current source to produce a current flow to cause the pixels 912 to produce the corresponding intensities of light.
- the lighting system 900 may further include a power supply 920.
- the power supply 920 may be a battery that produces power for the controller 902.
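The intensity path described above (processor 904 → generator 906a → driver 906b → pixels 912) reduces, per pixel, to a duty-cycle computation. The sketch below assumes an 8-bit intensity code, a linear mapping, and a 100 µs PWM period; all three are illustrative assumptions, not values from the disclosure:

```python
def duty_cycle(intensity: int, bit_depth: int = 8) -> float:
    """Map an image intensity code to a PWM duty cycle in [0, 1].

    A linear mapping is assumed here; a real generator (906a) might
    also apply gamma correction or per-channel calibration.
    """
    max_code = (1 << bit_depth) - 1
    # Clamp out-of-range codes, then normalize to [0, 1].
    return max(0, min(intensity, max_code)) / max_code

def on_time_us(intensity: int, period_us: float = 100.0) -> float:
    """Turn-on time per PWM period for a given intensity code."""
    return duty_cycle(intensity) * period_us

print(duty_cycle(255))           # 1.0 (fully on)
print(round(on_time_us(64), 2))  # 25.1 (us on per 100 us period)
```

The driver 906b would then open and close each pixel's PWM switch for the computed on-time within every period, so time-averaged light output tracks the intensity code.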
- Example 1 is a method of operating a monolithic segmented light emitting diode (LED) array, the method comprising: activating a torch mode of a mobile device to illuminate a scene; determining whether light from the LED array is configured to illuminate a face in the scene when the mobile device is in the torch mode; and reducing light from at least some of the LEDs of the LED array in response to determining that the light from the LED array is configured to illuminate the face when the mobile device is in the torch mode.
- Example 2 the subject matter of Example 1 includes, determining a first set of LEDs of the LED array that illuminate the face when the mobile device is in the torch mode and reducing a current used to drive the first set of LEDs.
- Example 3 the subject matter of Example 2 includes, determining a second set of LEDs of the LED array that is not configured to illuminate the face when the mobile device is in the torch mode and increasing a current used to drive the second set of LEDs.
- Example 4 the subject matter of Examples 2-3 includes, wherein reducing the current used to drive the first set of LEDs comprises stopping a supply of the current used to drive the first set of LEDs.
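Examples 1-4 together describe a per-segment dimming scheme: a first set of LED segments that illuminates the face is dimmed or shut off, while a second set may be driven harder. A minimal sketch of that control step follows; the array shape, the face mask, and the current values (`REDUCED_MA`, `BOOST_MA`) are illustrative assumptions, not values from the disclosure:

```python
from typing import List

REDUCED_MA = 0.0  # Example 4: stop the current supply to face segments
BOOST_MA = 120.0  # Example 3: drive the remaining segments harder

def segment_currents(face_mask: List[List[bool]]) -> List[List[float]]:
    """Return a per-segment drive-current map for a segmented LED array.

    face_mask[r][c] is True where a segment's beam lands on a detected
    face (the "first set" of LEDs, Example 2); those segments are shut
    off, while the remaining "second set" (Example 3) is boosted to
    preserve overall scene illumination.
    """
    return [
        [REDUCED_MA if on_face else BOOST_MA for on_face in row]
        for row in face_mask
    ]

# Toy 2x3 array with a face detected under the top-middle segment.
mask = [[False, True, False],
        [False, False, False]]
print(segment_currents(mask))
# [[120.0, 0.0, 120.0], [120.0, 120.0, 120.0]]
```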
- Example 5 the subject matter of Examples 1-4 includes, determining whether the light from the LED array exceeds a standard safety limit when the mobile device is in the torch mode; and illuminating the scene without determining whether the light from the LED array is configured to illuminate the face in the scene when the mobile device is in the torch mode and without modifying a torch mode current to be supplied to the LEDs.
- Example 6 the subject matter of Examples 1-5 includes, wherein determining whether the light from the LED array is configured to illuminate the face when the mobile device is in the torch mode comprises activating a camera in the mobile device to detect reflected light from the scene.
- Example 7 the subject matter of Example 6 includes, wherein determining whether the light from the LED array is configured to illuminate the face when the mobile device is in the torch mode comprises illuminating the scene using an infrared (IR) LED array in the mobile device and detecting reflected IR light from the IR LED array.
- Example 8 the subject matter of Examples 6-7 includes, wherein determining whether the light from the LED array is configured to illuminate the face when the mobile device is in the torch mode comprises detecting the reflected light from the scene before activating the torch mode.
- Example 9 the subject matter of Examples 6-8 includes, periodically automatically deactivating the torch mode, wherein determining whether the light from the LED array is configured to illuminate the face when the mobile device is in the torch mode comprises detecting the reflected light from the scene during periodic deactivations of the torch mode.
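Example 9's periodic-deactivation scheme can be sketched as a simple frame schedule in which the torch is briefly turned off at a fixed cadence so the camera can sample reflected light. The 30-frame cadence and the function name are assumptions for illustration only:

```python
def torch_schedule(num_frames: int, sample_every: int = 30):
    """Yield (frame, torch_on) pairs per Example 9: the torch is briefly
    deactivated at a fixed cadence so the camera can sample reflected
    light from the scene without the torch's own contribution.
    """
    for frame in range(num_frames):
        # The last frame of each cadence window is a torch-off sample.
        torch_on = (frame % sample_every) != sample_every - 1
        yield frame, torch_on

off_frames = [f for f, on in torch_schedule(90) if not on]
print(off_frames)  # [29, 59, 89]
```

Face detection would run only on the torch-off frames, after which the per-segment currents can be updated before the torch is reactivated.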
- Example 10 the subject matter of Examples 1-9 includes, activating a diffuser to diffuse the light from the at least some of the LEDs of the LED array in response to determining that light from the LED array is configured to illuminate the face when the mobile device is in the torch mode.
- Example 11 the subject matter of Examples 1-10 includes, wherein an artificial intelligence (AI)/machine learning (ML) model is used by a processor in the mobile device to determine whether the light from the LED array is configured to illuminate the face when the mobile device is in the torch mode.
- Example 12 is a mobile device comprising: a monolithic segmented light emitting diode (LED) array configured to emit light to illuminate a scene; a camera configured to detect light reflected from the scene; a driver configured to individually drive LEDs of the LED array; and a processor configured to: activate a torch mode of the mobile device to illuminate the scene; determine whether the light from the LED array is configured to illuminate a face in the scene when the mobile device is in the torch mode; and control the driver to reduce light from at least some of the LEDs of the LED array in response to a determination that the light from the LED array is configured to illuminate the face when the mobile device is in the torch mode.
- Example 13 the subject matter of Example 12 includes, wherein the processor is configured to determine a first set of LEDs of the LED array that illuminate the face when the mobile device is in the torch mode and control the driver to reduce a current used to drive the first set of LEDs.
- Example 14 the subject matter of Example 13 includes, wherein the processor is configured to determine a second set of LEDs of the LED array that is not configured to illuminate the face when the mobile device is in the torch mode and control the driver to increase a current used to drive the second set of LEDs.
- Example 15 the subject matter of Examples 13-14 includes, wherein to reduce the current used to drive the first set of LEDs, the processor is configured to control the driver to stop a supply of the current used to drive the first set of LEDs.
- Example 16 the subject matter of Examples 12-15 includes, wherein the processor is configured to determine, prior to activation of the torch mode, that the light from the LED array will exceed a standard safety limit when the mobile device is in the torch mode.
- Example 17 the subject matter of Examples 12-16 includes, wherein the processor is configured to determine whether the light from the LED array is configured to illuminate the face when the mobile device is in the torch mode through activation of the camera.
- Example 18 the subject matter of Example 17 includes, infrared (IR) LED array, wherein the processor is configured to determine whether the light from the LED array is configured to illuminate the face when the mobile device is in the torch mode through illumination of the scene using the IR LED array and detection of reflected IR light from the IR LED array.
- Example 19 the subject matter of Examples 17-18 includes, wherein the processor is configured to determine whether the light from the LED array is configured to illuminate the face when the mobile device is in the torch mode through detection of the reflected light from the scene before activation of the torch mode.
- Example 20 the subject matter of Examples 17-19 includes, wherein the processor is configured to control the driver to periodically automatically deactivate the torch mode, and determine whether the light from the LED array is configured to illuminate the face when the mobile device is in the torch mode through detection of the reflected light from the scene during periodic deactivations of the torch mode.
- Example 21 the subject matter of Examples 12-20 includes, wherein the processor is configured to activate a diffuser to diffuse the light from the at least some of the LEDs of the LED array in response to a determination that light from the LED array is configured to illuminate the face when the mobile device is in the torch mode.
- Example 22 the subject matter of Examples 12-21 includes, wherein the processor is configured to use an artificial intelligence (AI)/machine learning (ML) model to determine whether the light from the LED array is configured to illuminate the face when the mobile device is in the torch mode.
- Example 23 is at least one machine-readable medium including instructions, which when executed by processing circuitry of a mobile device, cause the processing circuitry to perform operations comprising: activating a torch mode of the mobile device to illuminate a scene; determining whether light from a monolithic segmented light emitting diode (LED) array is configured to illuminate a face in the scene when the mobile device is in the torch mode; and reducing light from at least some of the LEDs of the LED array in response to determining that light from the LED array is configured to illuminate the face when the mobile device is in the torch mode.
- Example 24 the subject matter of Example 23 includes, wherein the instructions further cause the processing circuitry to determine a first set of LEDs of the LED array that illuminate the face when the mobile device is in the torch mode and reduce a current used to drive the first set of LEDs.
- Example 25 the subject matter of Example 24 includes, wherein the instructions further cause the processing circuitry to determine a second set of LEDs of the LED array that is not configured to illuminate the face when the mobile device is in the torch mode and increase a current used to drive the second set of LEDs.
- Example 26 the subject matter of Examples 24-25 includes, wherein the instructions further cause the processing circuitry to reduce the current used to drive the first set of LEDs by stopping a supply of the current used to drive the first set of LEDs.
- Example 27 the subject matter of Examples 23-26 includes, wherein the instructions further cause the processing circuitry to determine, prior to activating the torch mode, that the light from the LED array will exceed a standard safety limit when the mobile device is in the torch mode.
- Example 28 the subject matter of Examples 23-27 includes, wherein the instructions further cause the processing circuitry to determine whether the light from the LED array is configured to illuminate the face when the mobile device is in the torch mode by activating a camera in the mobile device to detect reflected light from the scene.
- Example 29 the subject matter of Example 28 includes, wherein the instructions further cause the processing circuitry to determine whether the light from the LED array is configured to illuminate the face when the mobile device is in the torch mode by illuminating the scene using an infrared (IR) LED array in the mobile device and detecting reflected IR light from the IR LED array.
- Example 30 the subject matter of Examples 28-29 includes, wherein the instructions further cause the processing circuitry to determine whether the light from the LED array is configured to illuminate the face when the mobile device is in the torch mode by detecting the reflected light from the scene before activating the torch mode.
- Example 31 the subject matter of Examples 28-30 includes, wherein the instructions further cause the processing circuitry to periodically automatically deactivate the torch mode, and to determine whether the light from the LED array is configured to illuminate the face when the mobile device is in the torch mode by detecting the reflected light from the scene during periodic deactivations of the torch mode.
- Example 32 the subject matter of Examples 23-31 includes, wherein the instructions further cause the processing circuitry to activate a diffuser to diffuse the light from the at least some of the LEDs of the LED array in response to determining that light from the LED array is configured to illuminate the face when the mobile device is in the torch mode.
- Example 33 the subject matter of Examples 23-32 includes, wherein the instructions further cause the processing circuitry to use an artificial intelligence (AI)/machine learning (ML) model to determine whether the light from the LED array is configured to illuminate the face when the mobile device is in the torch mode.
- Example 34 is at least one machine-readable medium including instructions that, when executed by processing circuitry, cause the processing circuitry to perform operations to implement any of Examples 1-33.
- Example 35 is an apparatus comprising means to implement any of Examples 1-33.
- Example 36 is a system to implement any of Examples 1-33.
- Example 37 is a method to implement any of Examples 1-33.
- a processor configured to carry out specific operations includes both a single processor configured to carry out all of the operations as well as multiple processors individually configured to carry out some or all of the operations (which may overlap) such that the combination of processors carry out all of the operations.
- the term “about x” and similar terms (e.g., substantially) as used herein may be understood to be within 10% of x or otherwise within a range known to one of skill in the art to be within tolerance of the quantity or quality described, unless otherwise indicated.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Studio Devices (AREA)
Abstract
A monolithic segmented light emitting diode (LED) array of a mobile device, and a method of operating the LED array, are described. A torch mode is activated to illuminate a scene. It is determined whether the light from the torch mode used to illuminate, for example, a detected human face exceeds a standard safety limit. A processor uses a trained artificial intelligence (AI)/machine learning (ML) model to determine whether light from the LED array illuminates the face in the scene while the mobile device is in the torch mode. If so, the processor may control a driver to reduce the light from at least some of the LEDs of the LED array.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202363469982P | 2023-05-31 | 2023-05-31 | |
| US63/469,982 | 2023-05-31 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024249115A1 true WO2024249115A1 (fr) | 2024-12-05 |
Family
ID=91616494
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2024/029880 Pending WO2024249115A1 (fr) | 2024-05-17 | Torch mode setting based on facial recognition |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2024249115A1 (fr) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8761594B1 (en) * | 2013-02-28 | 2014-06-24 | Apple Inc. | Spatially dynamic illumination for camera systems |
| US20160337564A1 (en) * | 2015-05-13 | 2016-11-17 | Apple Inc. | Light source module with adjustable diffusion |
| US20190058822A1 (en) * | 2016-08-24 | 2019-02-21 | Samsung Electronics Co., Ltd. | Electronic device including light-emitting elements and method of operating electronic device |
| US20210200064A1 (en) * | 2016-01-20 | 2021-07-01 | Lumileds Llc | Driver for an adaptive light source |
2024
- 2024-05-17 WO PCT/US2024/029880 patent/WO2024249115A1/fr active Pending
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7341202B2 (ja) | Adaptive light source | |
| CN108713316B (zh) | Optoelectronic light-emitting device, method for illuminating a scene, camera, and mobile terminal | |
| WO2024249115A1 (fr) | Torch mode setting based on facial recognition | |
| WO2024249028A1 (fr) | Electronic drive architecture using a BLU controller and a segmented chip | |
| US20230124794A1 (en) | Micro-led with reflectance redistribution | |
| TW202414032A (zh) | Transparent micro-LED display for virtual-reality headsets | |
| WO2024129691A1 (fr) | Polarized light sources for LCOS applications | |
| WO2025117838A1 (fr) | Micro-LED with trapezoidal microstructures | |
| WO2024129716A1 (fr) | Micro-LED with nanopatterned surface | |
| WO2025006184A1 (fr) | LED luminance adjustment to adjust light distribution | |
| US20230122492A1 (en) | Narrowband reflector for micro-led arrays | |
| WO2024249144A2 (fr) | P-side micro-LED structure | |
| EP4634984A1 (fr) | Thin-film LED package without a substrate carrier | |
| EP4635002A1 (fr) | Die metallization for densely packed arrays | |
| WO2025049204A1 (fr) | Micro-LED with apertured current flow | |
| WO2024129339A1 (fr) | Optical layer for pixels of micro light-emitting diode (LED) devices | |
| WO2024129340A1 (fr) | Optical coating for pixels of micro light-emitting diode (LED) devices | |
| EP4457864A1 (fr) | Hybrid bonding with micro light-emitting diode (LED) devices | |
| WO2025117122A1 (fr) | LED array with a continuous semiconductor layer | |
| EP4634990A1 (fr) | Patterned micro light-emitting diode (uLED) pixels and dies | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 24735044 Country of ref document: EP Kind code of ref document: A1 |