
WO2025176821A1 - Method for authenticating a user of a device - Google Patents

Method for authenticating a user of a device

Info

Publication number
WO2025176821A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
user
light
spectral radiance
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/EP2025/054666
Other languages
English (en)
Inventor
Sebastian Valouch
Alexander Stefan RENNER
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
TrinamiX GmbH
Original Assignee
TrinamiX GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by TrinamiX GmbH filed Critical TrinamiX GmbH
Publication of WO2025176821A1


Classifications

    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 - User authentication
    • G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/172 - Classification, e.g. identification
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/40 - Spoof detection, e.g. liveness detection
    • G06V 40/45 - Detection of the body part being alive
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 63/00 - Network architectures or network communication protocols for network security
    • H04L 63/08 - Network architectures or network communication protocols for network security for authentication of entities
    • H04L 63/0861 - Network architectures or network communication protocols for network security for authentication of entities using biometrical features, e.g. fingerprint, retina-scan

Definitions

  • Available authentication systems in mobile devices include receivers, such as at least one camera.
  • Said mobile devices usually have a front display, such as an organic light-emitting diode (OLED) area and/or a quantum-dot light emitting diode (QLED) area.
  • the receiver may be positioned behind said front display.
  • a light emitter, such as a projector, may be used, such as one or more light emitting diodes and/or lasers, and may be positioned behind the display.
  • image capturing for face authentication may suffer from severe artifacts in the image in case a very bright object, e.g. the sun, is present in the image.
  • the display may be or may comprise at least one organic light-emitting diode (OLED) display and/or at least one quantum-dot light emitting diode (QLED) display.
  • the term “organic light-emitting diode” is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
  • the term specifically may refer, without limitation, to a light-emitting diode (LED) in which an emissive electroluminescent layer is a film of organic compound configured for emitting light in response to an electric current.
  • the OLED display may be configured for emitting visible light.
  • the display may be at least partially transparent in at least one continuous area covering a camera, as will be outlined in further detail below.
  • the display may be at least partially transparent in at least one continuous area in a manner that at least one of: the light pattern incident on the continuous areas traverses the display while being illuminated from the pattern illumination source; the flood light incident on the continuous areas traverses the display while being illuminated from the flood illumination source; user light, generated by the light pattern and/or the flood light incident on a user, incident on the continuous areas traverses the display for impinging on the camera.
  • the display may specifically be or may comprise at least one user interface.
  • the term "user interface” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
  • the term may refer, without limitation, to a feature of the device which is configured for interacting with its environment, such as for the purpose of unidirectionally or bidirectionally exchanging information, such as for exchange of one or more of data or commands.
  • the user interface may be configured to share information with a user and to receive information by the user.
  • the user interface may be a feature to interact visually with a user, such as a display, or a feature to interact acoustically with the user.
  • the user interface as an example, may comprise one or more of: a graphical user interface; a data interface, such as a wireless and/or a wire-bound data interface.
  • the term “resource” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
  • the term specifically may refer, without limitation, to one or more functions and/or entities associated with the device.
  • the functions and/or entities associated with the device that require authentication of the user may be pre-defined.
  • allowing the user to access a resource may include allowing the user to perform at least one operation with the device. Additionally and/or alternatively, allowing the user to access the resource may include allowing the user to access an entity.
  • the entity may be physical entity and/or virtual entity.
  • the virtual entity may be a database.
  • the physical entity may be an area with restricted access.
  • the area with restricted access may be one of the following: security areas, rooms, apartments, vehicles, parts of the before mentioned examples, or the like.
  • the device may be locked and may only be unlocked by an authorized user.
  • the term “pattern illumination source” is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
  • the term specifically may refer, without limitation, to an optical device configured for projecting at least one light pattern.
  • the term “projecting” is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
  • the term specifically may refer, without limitation, to the process of providing at least one light beam, in particular a light pattern onto at least one surface.
  • the transfer device may comprise at least one imaging optical device.
  • the transfer device specifically may comprise one or more of: at least one lens, for example at least one lens selected from the group consisting of at least one focus-tunable lens, at least one aspheric lens, at least one spherical lens, at least one Fresnel lens; at least one diffractive optical element; at least one concave mirror; at least one beam deflection element, preferably at least one mirror; at least one beam splitting element, preferably at least one of a beam splitting cube or a beam splitting mirror; at least one multi-lens system; at least one holographic optical element; at least one meta optical element.
  • the transfer device comprises at least one refractive optical lens stack.
  • the transfer device may comprise a multi-lens system having refractive properties.
  • the flood illumination source may illuminate a measurement area, such as the object or a portion of the object, with a substantially constant illumination intensity.
  • the term “constant” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
  • the term specifically may refer, without limitation, to a time aspect during an exposure time. Flood light may vary temporally and/or may be substantially constant over time.
  • the term “substantially constant” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
  • the term specifically may refer, without limitation, to a completely constant illumination and embodiments in which deviations from a constant illumination of ± 10 %, preferably ± 5 %, more preferably ± 2 % are possible.
  • the emitting of the flood light and the illumination of the light pattern may be performed subsequently or at least partially overlapping in time.
  • the flood light and the light pattern may be emitted at the same time.
  • one of the flood light or the light pattern may be emitted with a lower intensity compared to the other one.
  • the image may be generated via a hardware and/or a software interface, which may be considered as the image sensor.
  • the device may comprise the at least one image sensor.
  • the image sensor may comprise at least one optical sensor, in particular at least one pixelated optical sensor.
  • the image sensor may comprise at least one CMOS sensor or at least one CCD sensor.
  • the image sensor may comprise at least one CMOS sensor, which may be sensitive in the infrared spectral range.
  • the terms “image generation”, “generating an image” or simply “imaging” as used herein are broad terms and are to be given their ordinary and customary meaning to a person of ordinary skill in the art and are not to be limited to a special or customized meaning.
  • the term specifically may refer, without limitation, to capturing and/or generating and/or determining and/or recording at least one image by using the image sensor.
  • the image data as generated by the image sensor may be referred to as “generated image”.
  • the image generation may comprise imaging and/or recording the image.
  • the image generation may comprise capturing a single image and/or a plurality of images such as a sequence of images.
  • the capturing and/or generating and/or determining and/or recording of the image may be caused and/or initiated by the hardware and/or the software interface.
  • the image generation may comprise recording continuously a sequence of images, such as a video or a movie.
  • the image generation may be initiated by a user action or may automatically be initiated, e.g. once the presence of at least one object or user within a field of view and/or within a predetermined sector of the field of view of the image sensor is automatically detected.
  • the term “field of view” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
  • the term specifically may refer, without limitation, to an angular extent of the observable world and/or at least one scene that may be captured or viewed by an optical system, such as the image sensor.
  • the field of view may, typically, be expressed in degrees and/or radians, and, exemplarily, may represent the total angle spanned by the image and/or viewable area.
  • the camera may be an internal and/or external camera of the device.
  • the internal and/or external camera of the device may be accessed via a hardware and/or a software interface, which may be considered as the image sensor.
  • the device may be or may comprise a smartphone and the image sensor may be a front camera, such as a selfie camera, and/or back camera of the smartphone.
  • the image may specifically comprise at least one pattern image.
  • the term “pattern image” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
  • the term specifically may refer, without limitation, to an image generated by the image sensor while the object is being illuminated by at least one light pattern.
  • the pattern image may comprise an image showing the object, in particular at least parts of the face of the user, while the user is being illuminated with the pattern, particularly on a respective area of interest comprised by the image.
  • the image sensor may yield an intensity image of the object projected with the pattern.
  • the pattern image may be generated by imaging and/or recording light reflected by an object, which is illuminated by the light pattern.
  • the method further comprises determining if spectral radiance associated with the image of the object is within the at least one predefined range at least within tolerances.
  • the term “determining” is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
  • the term specifically may refer, without limitation, to a process of generating at least one representative result, specifically a numerical representative result, e.g. by evaluating the image as acquired by the camera.
  • the determining if spectral radiance associated with the image of the object is within at least one predefined range comprises determining if the image comprises stray light and/or ambient light.
  • in case the sun is shining, this can lead to an overly bright image, in particular overexposure, when the sun and/or a reflection of the sun is present in the image.
  • the sun, at around 550 nm, may have a maximum radiance (mountain, equator, noontime) of about 13 kW/(sr·m²·nm), wherein at around 940 nm the maximum radiance may be about 8 kW/(sr·m²·nm).
  • Step iv. comprises determining if the spectral radiance associated with the image of the object is within at least one predefined range at least within tolerances.
  • the predetermined range may be defined and/or selected such that image artifacts, in particular caused by for example ambient light, diffraction and/or straylight and the like, are one or more of: prevented, removed or suppressed. This can allow ensuring having a suitable image for further user authentication.
  • the range of spectral radiance values may be defined prior to performing of the method, such as in a calibration procedure and/or an end-of-line test of the device.
  • the predetermined range may be from 13 kW/(sr·m²·nm) to 13 mW/(sr·m²·nm), preferably 12 kW/(sr·m²·nm) to 1 W/(sr·m²·nm), most preferably 10 kW/(sr·m²·nm) to 10 W/(sr·m²·nm).
  • the determining if spectral radiance associated with the image of the object is within at least one predefined range may comprise determining if the image comprises stray light and/or ambient light.
  • the term “stray light” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
  • the term specifically may refer, without limitation, to unintended light in an optical system.
  • stray light may comprise light on the camera arising from unintended sources in the device, such as from diffraction, scattering, light leaks, diffuse scattering on surfaces and/or other undesired effects.
  • stray light may comprise light on the camera arising from diffraction at the display of the device, specifically at the display structure, such as at the OLED display structure and/or the QLED display structure.
  • the term “ambient light” is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
  • the term specifically may refer, without limitation, to light being present in an environment of the device, in particular in the spectral region of light transmitted by the bandpass filter, e.g. light having a wavelength in the infrared spectral region, e.g. of 940 nm.
  • ambient light may refer to light being ambient to one or more of the user, the device and the like.
  • the ambient light may comprise one or both light arising from any natural source or light from any artificial source, such as artificial lighting, or a superposition thereof.
  • the ambient light may be sun light.
  • ambient light caused by sunlight may have a spectral radiance of up to 13 kW/(sr·m²·nm) (e.g. 107527 lux). This can lead to a too bright image, in particular overexposure, when the sun and/or a reflection of the sun is present in the image.
  • Artificial sources may be floodlights or spotlights for security cameras.
  • the determining if spectral radiance associated with the image of the object is within at least one predefined range may comprise determining if an intensity of a frequency associated with a Fourier transform of the image is within a predefined range. Specifically, the Fourier transform of the image may be determined thereby obtaining the image in frequency domain. The frequencies of the image may be evaluated. Specifically, each frequency of the image may be evaluated to be within a predefined range. For example, in case ambient light, e.g. from the sun, is present in the image, the spectral radiance associated with the image of the object may be outside the predetermined range. The presence of the ambient light may be determined according to predefined frequencies in the image having an intensity above the predefined range.
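As an illustration of the frequency-domain check described above, the following minimal sketch (an assumption, not code from the patent; the function name and threshold are hypothetical) tests whether all spatial-frequency magnitudes of an image stay within a predefined bound:

```python
# Sketch: frequency-domain check for ambient/stray light, assuming the image
# is available as a grayscale NumPy array; the threshold is device-specific.
import numpy as np

def frequencies_within_range(image: np.ndarray, max_magnitude: float) -> bool:
    """True if every spatial-frequency magnitude stays below max_magnitude."""
    spectrum = np.fft.fftshift(np.fft.fft2(image.astype(np.float64)))
    magnitude = np.abs(spectrum)
    cy, cx = magnitude.shape[0] // 2, magnitude.shape[1] // 2
    magnitude[cy, cx] = 0.0  # ignore the DC component (overall brightness)
    return bool(np.all(magnitude <= max_magnitude))
```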
  • the determining if spectral radiance associated with the image of the object is within at least one predefined range may comprise determining a number of pixels associated with a spectral radiance deviating from the predefined range. For example, a number of pixels associated with a spectral radiance deviating from the predefined range may be determined and, if the number of pixels is above a predefined threshold value, the image may be evaluated to comprise ambient light and/or stray light.
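A minimal sketch of this pixel-counting variant (names and thresholds are hypothetical placeholders):

```python
import numpy as np

def too_many_outlier_pixels(radiance: np.ndarray, low: float, high: float,
                            max_outliers: int) -> bool:
    """Flag stray and/or ambient light if too many pixels leave [low, high]."""
    outliers = np.count_nonzero((radiance < low) | (radiance > high))
    return outliers > max_outliers
```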
  • the determining if spectral radiance associated with the image of the object is within at least one predefined range may comprise detecting a diffraction pattern and comparing the diffraction pattern to at least one reference diffraction pattern.
  • the reference diffraction pattern may comprise a diffraction pattern associated with the light and electronics of the device comprising the illumination source.
  • the image may be evaluated to comprise ambient light and/or stray light.
  • the manipulating may comprise one or more image transformations, such as filtering, convoluting, scaling, cropping and the like, specifically one or more transformations on pixel coordinates of the image.
  • the result of the manipulating may comprise a processed image.
  • the “manipulated image” may comprise the image as generated by the camera and having performed at least one manipulating operation thereon.
  • Step iv. may comprise identifying stray light and/or ambient light features in the image by comparing the image to at least one stray light and/or ambient light image.
  • the manipulating may further comprise subtracting the stray light and/or ambient light image from the image.
  • the subtracting of the stray light and/or ambient light from the image may result in an elimination of stray light and/or ambient light features.
  • the one or more stray light and/or ambient light features may specifically be identified by comparing the image with the stray light and/or ambient light image.
  • the stray light and/or ambient light image may comprise one or more illumination features resulting from the projection of the light emitted by the illumination source.
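The subtraction described in the preceding items could look like the following sketch (an assumed representation: radiance images as NumPy arrays, with a stored reference stray light and/or ambient light image):

```python
import numpy as np

def subtract_reference_light(image: np.ndarray,
                             reference: np.ndarray) -> np.ndarray:
    """Eliminate stray/ambient light features by subtracting a stored
    stray light and/or ambient light image, clipping negatives to zero."""
    corrected = image.astype(np.float64) - reference.astype(np.float64)
    return np.clip(corrected, 0.0, None)
```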
  • Step iv. may comprise identifying stray light features in the image.
  • the identifying may comprise using a stray light map being indicative of stray light within the image.
  • the stray light map may be generated and/or may be retrieved, e.g. from a local storage of the device and/or via a communication interface of the device.
  • Step iv. may comprise identifying an ambient light source, in particular the sun, within the image.
  • the identifying may comprise identifying pixels exhibiting spectral radiance above at least one threshold value, e.g. relating to overexposure.
  • Step iv. may comprise determining a resulting pattern formed by the sun illuminating the camera.
  • the resulting pattern may be determined based on the position of the sun within the image and information on diffractive optical elements between the camera for recording the image and the emitted light, such as information about the display, e.g. pixel size, pixel density and the like, distances between different components of the device.
  • the resulting pattern may be determined using optical equations, such as the Bragg equation for diffraction effects.
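Since the periodic display structure acts as a diffraction grating for light passing through it, the expected positions of diffraction orders can be estimated; the sketch below is a hypothetical illustration using the grating equation (the patent text itself only names the Bragg equation and display parameters such as pixel size and density):

```python
import math

def diffraction_orders(wavelength_nm: float, pixel_pitch_um: float,
                       incidence_deg: float, max_order: int = 3) -> dict:
    """Angles of propagating diffraction orders for a periodic structure,
    from the grating equation sin(theta_m) = sin(theta_i) + m * lambda / d."""
    lam = wavelength_nm * 1e-9
    d = pixel_pitch_um * 1e-6
    angles = {}
    for m in range(-max_order, max_order + 1):
        s = math.sin(math.radians(incidence_deg)) + m * lam / d
        if abs(s) <= 1.0:  # evanescent orders are discarded
            angles[m] = math.degrees(math.asin(s))
    return angles

# e.g. diffraction_orders(940.0, 50.0, 20.0) for a 940 nm source behind a
# display with an assumed 50 um pixel pitch and 20 degrees of solar incidence
```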
  • Step iv. may further comprise verifying the presence of the ambient light source using information obtained by at least one sensor of the device.
  • step iv. comprises identifying the sun within the image and verifying the presence of the sun.
  • the position of the sun, in particular a relative position of the sun with respect to the device, can be one or more of determined, calculated and/or retrieved.
  • the position of the sun can be calculated and/or is known precisely such that artefacts in the image identified to be caused by the sun can be verified to be caused by the sun.
  • the position of the sun may be calculated using at least one algorithm for determining a position of astronomical objects.
  • the sensor may be at least one sensor selected from the group consisting of: at least one inertial measurement unit (IMU), at least one accelerometer, at least one gyroscope, at least one GPS sensor.
  • the information may be one or more of position information, orientation information, or time.
  • the stray light and/or ambient light features may be identified by identifying an external illumination source of the ambient light, such as the sun, within the image and, optionally in case the sun is identified, verifying the presence of the sun based on at least one of GPS data, time and orientation of the device, e.g. obtained via an integrated inertial measurement unit.
  • the position of the device may be determined using the suitable means of the operating system, e.g. using GPS data and/or information from a further data source e.g. received WLANs.
  • the position may be determined down to a few meters.
  • An angle of inclination of the device may be determined using at least one accelerometer or the like, e.g. with an accuracy of about 0.1 °. Additionally or alternatively, image analysis may be performed for horizon detection.
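A hedged sketch of this verification step, assuming GPS coordinates, a timezone-aware timestamp and a device orientation are available; the third-party pysolar package is an assumed choice for the solar ephemeris, and any equivalent routine would do:

```python
from datetime import datetime, timezone

from pysolar.solar import get_altitude, get_azimuth  # assumed dependency

def sun_position_plausible(lat_deg: float, lon_deg: float, when: datetime,
                           seen_elevation_deg: float, seen_azimuth_deg: float,
                           tol_deg: float = 5.0) -> bool:
    """Verify that the bright source seen by the camera agrees with the
    astronomical sun position for the given place, time and orientation."""
    sun_elev = get_altitude(lat_deg, lon_deg, when)
    sun_az = get_azimuth(lat_deg, lon_deg, when)
    d_az = abs((sun_az - seen_azimuth_deg + 180.0) % 360.0 - 180.0)
    return abs(sun_elev - seen_elevation_deg) <= tol_deg and d_az <= tol_deg

# ok = sun_position_plausible(49.0, 8.4, datetime.now(timezone.utc), 35.0, 180.0)
```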
  • the identifying may comprise generating at least two ambient light images showing at least a part of the ambient light source at different positions and/or the corresponding stray light.
  • a relation between the position of the ambient light source to one or more stray light and/or ambient light features may be used for determining and/or identifying one or more stray light and/or ambient light features.
  • an angle between the illumination source and the device may have an effect on a diffraction pattern in the image.
  • the angle may also be estimated by calculating the center of the light source and using the information of the optical system, such as the field of view and/or lens formulas.
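For the centroid-based angle estimate mentioned above, a pinhole-model sketch (all names are hypothetical; it assumes the horizontal field of view of the optical system is known):

```python
import math
import numpy as np

def source_angle_deg(image: np.ndarray, fov_h_deg: float, threshold: float):
    """Estimate the horizontal/vertical angle of a bright source from the
    centroid of over-threshold pixels, using a simple pinhole model."""
    ys, xs = np.nonzero(image >= threshold)
    if xs.size == 0:
        return None  # no bright source found
    h, w = image.shape
    f_px = (w / 2.0) / math.tan(math.radians(fov_h_deg) / 2.0)  # focal length in pixels
    ang_x = math.degrees(math.atan2(xs.mean() - w / 2.0, f_px))
    ang_y = math.degrees(math.atan2(ys.mean() - h / 2.0, f_px))
    return ang_x, ang_y
```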
  • the method may further comprise authenticating an authorized user.
  • the method may comprise at least one authorization process.
  • the authorization may be performed before step i.
  • the term “processor”, also denoted as “processing unit”, as generally used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
  • the term specifically may refer, without limitation, to an arbitrary logic circuitry configured for performing basic operations of a computer or system, and/or, generally, to a device which is configured for performing calculations or logic operations.
  • the processor may be configured for processing basic instructions that drive the computer or system.
  • the processor may be or may comprise one or more application-specific integrated circuits (ASICs) and/or one or more field-programmable gate arrays (FPGAs) and/or one or more tensor processing units (TPUs) and/or one or more chips, such as a dedicated machine learning optimized chip, or the like.
  • the processor specifically may be configured, such as by software programming, for performing one or more evaluation operations.
  • At least one or any component of a computer program configured for performing the authentication process may be executed by the processing device.
  • the authentication unit may be or may comprise a connection interface.
  • the connection interface may be configured to transfer data from the device to a remote device; or vice versa.
  • At least one or any component of a computer program configured for performing the authentication process may be executed by the remote device.
  • the authentication process may comprise a plurality of steps.
  • the authentication process may comprise performing at least one face detection step.
  • the face detection step may comprise analyzing at least one image of the user, e.g. generated by the camera or a further camera.
  • the image may be a flood image.
  • the term “flood image” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
  • the term specifically may refer, without limitation, to an image generated by the camera while the illumination source is emitting infrared flood light, e.g. on an object and/or a user.
  • the complete pattern image may be used.
  • partial images may be used. Extracting of material data may include generating one or more partial images from the pattern image. For example, different regions of the pattern image may be selected as partial images. The partial images may be different from each other. In particular, the partial images may be non-overlapping. Using partial images may allow having material data from different areas of the object (e.g. having different light conditions) and/or comparing the extracted material data and/or generating of a material map and/or reducing an uncertainty on the obtained material data.
  • the determining if the surface is human skin may comprise comparing the extracted material data with material data relating to skin (skin material data). Comparing the material data with desired material data may comprise determining a similarity of the extracted material data and the skin material data.
  • the skin material data may refer to predetermined material data of skin.
  • the material data may comprise reflectance values, wherein the method may comprise comparing the determined reflectance values for the surface with at least one range of reflectance values for skin.
  • the method may comprise considering tolerances, e.g. of ± 10 %, preferably of ± 5 %, more preferably of ± 1 %.
  • the skin material data may be stored in at least one database and/or may be retrieved from at least one database, e.g. the database may be at least partially cloud based, e.g. via at least one communication interface.
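The reflectance comparison with tolerances could be sketched as follows (the reference range and tolerance values are placeholders, not values from the patent):

```python
def matches_skin(reflectance: float, skin_low: float, skin_high: float,
                 tolerance: float = 0.05) -> bool:
    """Compare a measured reflectance against a reference range for skin,
    widened by a relative tolerance (here 5 %)."""
    return (skin_low * (1.0 - tolerance)
            <= reflectance
            <= skin_high * (1.0 + tolerance))
```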
  • the authentication process may be validated based on the extracted material data.
  • the validating may comprise determining a similarity of the extracted material data and skin, e.g. comparing the extracted material data with the skin material data.
  • a comparison of material data with skin may result in allowing and/or declining the user and/or object to perform at least one operation that requires authentication.
  • for example, skin as desired material data may be compared with non-skin material, such as silicone, as material data, and the result may be a declination, since silicone or other non-skin material differs from skin.
  • the extracting of blood perfusion data may comprise determining a speckle contrast of the image and determining a blood perfusion measure based on the determined speckle contrast.
  • the authentication unit may be further configured for considering additional security features, e.g. extracted from the pattern image.
  • the authentication unit may be further configured for extracting liveness data such as a blood perfusion measure and/or considering the extracted liveness data from the pattern image.
  • the term “blood perfusion measure” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
  • the term specifically may refer, without limitation, to a blood volume flow through a given volume or mass of tissue.
  • the blood perfusion measure may be given in units of ml/ml/s or ml/100 g/min.
  • the term “speckle contrast” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning. The term specifically may refer, without limitation, to a degree of a variation in a speckle pattern generated by coherent light.
  • the speckle pattern may be generated by the transmitter, particularly on the object.
  • a speckle contrast may represent a measure for a mean contrast of an intensity distribution within an area of a speckle pattern.
  • a speckle contrast K over an area of the speckle pattern may be expressed as the ratio of the standard deviation σ to the mean speckle intensity ⟨I⟩, i.e., K = σ/⟨I⟩.
  • Speckle contrast may comprise a speckle contrast value. Speckle contrast values may be distributed between 0 and 1.
  • the blood perfusion measure may be determined based on the speckle contrast. The blood perfusion measure may depend on the determined speckle contrast. If the speckle contrast changes, the blood perfusion measure derived from the speckle contrast may change accordingly.
  • a blood perfusion measure may be a single number or value that may represent a likelihood that the object is a living subject. For monitoring of speckle contrast changes a plurality of pattern images generated at different points in time may be used. For determining the speckle contrast, the complete pattern image may be used. Alternatively, for determining the speckle contrast, a section of the pattern image may be used.
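A direct implementation of the speckle contrast K = σ/⟨I⟩, evaluated either on the complete pattern image or on a section of it:

```python
import numpy as np

def speckle_contrast(patch: np.ndarray) -> float:
    """Speckle contrast K = std / mean over an image area; values lie
    between 0 (no speckle) and about 1 (fully developed speckle)."""
    patch = patch.astype(np.float64)
    mean_intensity = patch.mean()
    return float(patch.std() / mean_intensity) if mean_intensity > 0 else 0.0
```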
  • a data-driven model may be used for determining a blood perfusion measure.
  • the data-driven model may be parametrized and/or trained based on a training data set.
  • the training data set may comprise a pattern image and a blood perfusion measure.
  • the data-driven model may be parametrized and/or trained based on the training data set to output a blood perfusion measure based on receiving a pattern image.
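A hypothetical minimal sketch of such a data-driven model (architecture and framework are assumptions; the patent only states that the model maps a pattern image to a blood perfusion measure):

```python
import torch
import torch.nn as nn

class PerfusionNet(nn.Module):
    """Toy regressor: single-channel pattern image -> scalar perfusion measure."""
    def __init__(self) -> None:
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(8, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(16, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

# Training would regress the output against reference perfusion measures,
# e.g. with nn.MSELoss over labeled pattern images.
```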
  • the determining if an object corresponds to a living organism based on the blood perfusion data may comprise determining if the blood perfusion measure corresponds to a blood perfusion measure of a human being.
  • the estimated distance can be considered for calculating the reflectance.
  • the distance can be considered as a correction value for the reflectance measure.
  • the distance can be used for determining if the distance between the object and the camera is within a working range of the model used for extraction of material data. Distances are also important for face authentication, since face recognition models are associated with a working range.
  • the working range may specify a distance range between the object and the camera where the model works and/or is trained on the image of the user.
  • a working range may specify at least one upper and/or at least one lower boundary for a distance of an object from the camera and/or an illumination source.
  • the working range may be associated with an authentication process.
  • a working range may comprise at least one value.
  • the distance may be calculated using at least one distance determination technique.
  • the distance may be determined using one or more of beam profile analysis, e.g. as described in WO 2018/091649 A1, WO 2018/091638 A1 and WO 2018/091640 A1, the full content of which is included by reference, time-of-flight, triangulation and the like.
  • the distance may be calculated with a distance sensing system of a mobile device which comprises the illumination source and the camera.
  • the authentication process further may comprise determining a distance information of the user and determining if the user is within or outside of a working range of the authentication process by comparing the distance to the working range.
  • the authentication process may comprise allowing the user to access the resource in case the user is determined to be within the working range and otherwise, in case the user is determined to be outside the working range, denying the user to access the resource.
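A trivial sketch of this working-range gate (the boundary values are placeholders, not values from the patent):

```python
def allow_by_distance(distance_m: float,
                      working_range_m: tuple = (0.2, 0.8)) -> bool:
    """Allow access only if the measured user distance lies inside the
    working range of the authentication process."""
    low, high = working_range_m
    return low <= distance_m <= high
```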
  • the device may be used as a field device that is used by the user for generating data required in the authentication process and/or its validation.
  • the device may transmit the generated data and/or data associated to an intermediate step of the authentication process and/or its validation to the remote device.
  • the authentication unit may be and/or may comprise a connection interface configured for transmitting information to the remote device. Data generated by the remote device used in the authentication process and/or its validation may further be transmitted to the device. This data may be received by the connection interface comprised by the device.
  • the connection interface may specifically be configured for transmitting or exchanging information.
  • the connection interface may provide a data transfer connection.
  • the connection interface may be or may comprise at least one port comprising one or more of a network or internet port, a USB-port, and a disk drive.
  • data from the device may be transmitted to a specific remote device depending on at least one circumstance, such as a date, a day, a load of the specific remote device, and so on.
  • the specific remote device may not be selected by the field device. Rather a further device may select to which specific remote device the data may be transmitted.
  • the authentication process and/or the generation of validation data may involve a use of several different entities of the remote device. At least one entity may generate intermediate data and transmit the intermediate data to at least one further entity.
  • the allowing the user to access the resource may comprise authorization of the user.
  • the device may comprise at least one authorization unit configured for allowing the user to perform at least one operation on the device, e.g. unlocking the device, in case of successful authentication of the user or declining the user to perform at least one operation on the device in case of non-successful authentication. Thereby, the user may become aware of the result of the authentication.
  • the authorization unit may be configured for allowing or declining the user to perform at least one operation on the device that requires authentication based on the material data and the identifying e.g. using the flood image.
  • the authorization unit may be configured for allowing or declining the user to access one or more functions associated with the device depending on the authentication or denial.
  • the allowing may comprise granting permission to access the one or more functions.
  • authorization unit is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
  • the term specifically may refer, without limitation, to a unit such as a processor configured for authorization of a user.
  • the authorization unit may be comprised by the processing unit, e.g. by at least one processor, or may be designed as software or application.
  • the authorization unit and the authentication unit may be embodied integral, e.g. by using the same processor or processing unit.
  • the authorization unit may be configured for allowing the user to access the one or more functions, e.g. on the device, e.g. unlocking the device, in case of successful authentication of the user or declining the user to access the one or more functions, e.g. on the device, in case of non-successful authentication.
  • the device e.g. by using a user interface, such as the display of the device, may be configured for displaying a result of the authentication and/or the authorization.
  • the method may comprise determining if the object corresponds to a user and/or a living human from the liveness data and allowing the user to access the resource in response to determining that the user corresponds to a user and/or living human.
  • the method may specifically be computer-implemented.
  • the term “computer-implemented” as used herein is a broad term and is to be given its ordinary and customary meaning to a person of ordinary skill in the art and is not to be limited to a special or customized meaning.
  • the term specifically may refer, without limitation, to a method involving at least one computer and/or at least one computer network.
  • the computer and/or computer network may comprise at least one processor which is configured for performing at least one of the method steps of the method according to the present invention.
  • each of the method steps may be performed by the computer and/or computer network.
  • the method may be performed completely automatically, specifically without user interaction. For example, the illuminating, the generating of images, the determining of the spectral radiance and/or the manipulating may be triggered and/or executed by using at least one processor.
  • a device for authenticating a user of the device to perform at least one operation on the device that requires authentication is disclosed.
  • the device comprises: at least one communication interface configured for receiving at least one request to access at least one resource; at least one illumination source configured for illuminating at least one object with light; at least one camera configured for generating at least one image of the object while the object is being illuminated by the light; at least one processing unit configured for determining if spectral radiance associated with the image of the object is within at least one predefined range at least within tolerances, wherein the processing unit is configured for manipulating spectral radiance, if the spectral radiance deviates from the predefined range, in at least a part of the image associated with the deviation, wherein the processing unit is further configured for determining if the object associated with the image corresponds to a user and/or a living organism, wherein, in case the spectral radiance deviates from the predefined range, the manipulated image is used, or, in case the spectral radiance is within the predefined range, the generated image is used; and at least one authentication unit configured for allowing the user to access the resource based on the determining if the object corresponds to a user and/or a living organism.
  • the device may specifically be configured for performing a method for authenticating a user according to the present invention, such as according to any one of the embodiments disclosed above and/or according to any one of the embodiments disclosed in further detail below.
  • a use of a device according to the present invention for authenticating a user for one or more of: in-car payment; vehicle access; starting a vehicle; access control, such as for at least one resource of a mobile device such as a mobile phone; in-cabin sensing; building access; at least one payment process; unlocking of at least one electronic device.
  • a computer program comprising instructions which, when the program is executed by the device according to the present invention, such as according to any one of the embodiments disclosed above and/or according to any one of the embodiments disclosed in further detail below, cause the device to perform the method according to the present invention, such as according to any one of the embodiments disclosed above and/or according to any one of the embodiments disclosed in further detail below.
  • a computer-readable storage medium comprising instructions which, when the instructions are executed by the device according to the present invention, such as according to any one of the embodiments disclosed above and/or according to any one of the embodiments disclosed in further detail below, cause the device to perform the method according to the present invention, such as according to any one of the embodiments disclosed above and/or according to any one of the embodiments disclosed in further detail below.
  • the term “computer-readable storage medium” specifically may refer to non-transitory data storage means, such as a hardware storage medium having stored thereon computer-executable instructions.
  • the computer-readable storage medium specifically may be or may comprise a storage medium such as a random-access memory (RAM) and/or a read-only memory (ROM).
  • the computer-readable storage medium may be or may comprise a computer-readable data carrier.
  • a non-transient computer-readable medium including instructions that, when executed by one or more processors, cause the one or more processors to perform the method according to the present invention, such as according to any one of the embodiments disclosed above and/or according to any one of the embodiments disclosed in further detail below.
  • a computer program including computer-executable instructions for performing the method according to the present invention in one or more of the embodiments enclosed herein when the program is executed on a computer or computer network.
  • the computer program may be stored on a computer-readable data carrier and/or on a computer-readable storage medium.
  • one, more than one or even all of method steps i. to vi. as indicated above may be performed by using a computer or a computer network, preferably by using a computer program.
  • a computer program with program code means in order to perform the method according to the present invention in one or more of the embodiments enclosed herein when the program is executed on a computer or computer network.
  • the program code means may be stored on a computer-readable data carrier and/or on a computer-readable storage medium.
  • a data carrier having a data structure stored thereon, which, after loading into a computer or computer network, such as into a working memory or main memory of the computer or computer network, may execute the method according to one or more of the embodiments disclosed herein.
  • a computer program product with program code means stored on a machine-readable carrier, in order to perform the method according to one or more of the embodiments disclosed herein, when the program is executed on a computer or computer network.
  • a computer program product refers to the program as a tradable product.
  • the product may generally exist in an arbitrary format, such as in a paper format, or on a computer-readable data carrier and/or on a computer-readable storage medium.
  • the computer program product may be distributed over a data network.
  • a modulated data signal which contains instructions readable by a computer system or computer network, for performing the method according to one or more of the embodiments disclosed herein.
  • one or more of the method steps or even all of the method steps of the method according to one or more of the embodiments disclosed herein may be performed by using a computer or computer network.
  • any of the method steps including provision and/or manipulation of data may be performed by using a computer or computer network.
  • these method steps may include any of the method steps, typically except for method steps requiring manual work, such as providing the samples and/or certain aspects of performing the actual measurements.
  • a computer or computer network comprising at least one processor, wherein the processor is adapted to perform the method according to one of the embodiments described in this description,
  • a computer loadable data structure that is adapted to perform the method according to one of the embodiments described in this description while the data structure is being executed on a computer,
  • a computer program, wherein the computer program is adapted to perform the method according to one of the embodiments described in this description while the program is being executed on a computer,
  • a computer program comprising program means for performing the method according to one of the embodiments described in this description while the computer program is being executed on a computer or on a computer network,
  • a computer program comprising program means according to the preceding embodiment, wherein the program means are stored on a storage medium readable to a computer,
  • a storage medium, wherein a data structure is stored on the storage medium and wherein the data structure is adapted to perform the method according to one of the embodiments described in this description after having been loaded into a main and/or working storage of a computer or of a computer network.
  • the method and the device according to the present invention may provide a large number of advantages over known methods and devices of similar kind.
  • the method and the device according to the present invention may reduce and/or eliminate detrimental effects due to ambient light on authentication.
  • a smartphone may be capable of determining its position and orientation via various sensors, in particular GPS sensor, accelerometers, gyroscopes and the like.
  • time and/or location on the surface of the earth may be known or determined. Taking this information into account, it may be possible to determine the position of the sun and thereby the angles at which sunlight can impinge on the display of the device.
  • an expected stray light and/or diffraction pattern may be predicted and can be used for compensation.
  • the position of the camera relative to this very bright object can be used to compensate for diffraction and/or stray light effects.
  • an artificial light source e.g. in case where the GPS information cannot be used, it may be possible to estimate an angle of the light source by taking the center position of the light source and the lens specifications into account.
  • the method for authenticating a user of a device may comprise, as an example: receiving at least one request to access at least one resource; in response to receiving the request to access the resource, triggering to illuminate the object by light emitted from an illumination source, and triggering to generate at least one image of the object while the object is being illuminated by the light; determining if spectral radiance associated with the image of the object is within at least one predefined range at least within tolerances, wherein, if the spectral radiance deviates from the predefined range, spectral radiance is manipulated in at least a part of the image associated with the deviation, wherein the determining specifically comprises: determining if the image comprises stray light and/or ambient light by determining if a spectral radiance associated with the image is within the predefined range, and, in response to determining that the image comprises stray light and/or ambient light, eliminating at least a part of the stray light and/or ambient light in at least a part of the image by manipulating the spectral radiance in the part of the image associated with the deviation; determining if the object associated with the image corresponds to a user and/or a living organism, wherein, in case the spectral radiance deviates from the predefined range, the manipulated image is used, or otherwise the generated image is used; and allowing the user to access the resource based on the determining.
  • the terms “at least one”, “one or more” or similar expressions indicating that a feature or element may be present once or more than once typically are used only once when introducing the respective feature or element. In most cases, when referring to the respective feature or element, the expressions “at least one” or “one or more” are not repeated, notwithstanding the fact that the respective feature or element may be present once or more than once.
  • the terms “preferably”, “more preferably”, “particularly”, “more particularly”, “specifically”, “more specifically” or similar terms are used in conjunction with optional features, without restricting alternative possibilities.
  • features introduced by these terms are optional features and are not intended to restrict the scope of the claims in any way.
  • the invention may, as the skilled person will recognize, be performed by using alternative features.
  • features introduced by "in an embodiment of the invention” or similar expressions are intended to be optional features, without any restriction regarding alternative embodiments of the invention, without any restrictions regarding the scope of the invention and without any restriction regarding the possibility of combining the features introduced in such way with other optional or non-optional features of the invention.
  • Embodiment 3 The method according to any one of the preceding embodiments, wherein the method comprises authenticating an authorized user, wherein the method comprises at least one authorization process, wherein the authorization is performed before step i.
  • Embodiment 4 The method according to any one of the preceding embodiments, wherein the determining if the object corresponds to a user and/or a living organism comprises at least one 2D face authentication and/or liveness detection.
  • Embodiment 5 The method according to the preceding embodiment, wherein the 2D face authentication comprises generating a representation of the image of the object and determining if the representation of the image corresponds to a representation of a template image associated with a user.
  • Embodiment 6 The method according to any one of the two preceding embodiments, wherein the liveness detection comprises extracting material data and/or extracting blood perfusion data, wherein extracting material data comprises providing the image to a model, wherein the model is configured for receiving the image as input and for determining material data of the object by using the image, wherein extracting blood perfusion data comprises determining a speckle contrast of the image and determining a blood perfusion measure based on the determined speckle contrast, wherein a speckle contrast represents a measure for a mean contrast of an intensity distribution within an area of a speckle pattern.
  • Embodiment 7 The method according to any one of the three preceding embodiments, wherein the method comprises determining if the object corresponds to a user and/or a living human from the liveness data and allowing the user to access the resource in response to determining that the user corresponds to a user and/or living human.
  • Embodiment 8 The method according to any one of the preceding embodiments, wherein the generating of the image comprises using at least one camera, wherein the camera comprises at least one bandpass filter, wherein the bandpass filter is associated with a predefined wavelength range.
  • Embodiment 9 The method according to the preceding embodiment, wherein the predefined wavelength range is from 780 nm to 2200 nm, preferably from 840 nm to 1700 nm, more preferably from 920 nm to 960 nm.
  • Embodiment 10 The method according to any one of the preceding embodiments, wherein the determining if spectral radiance associated with the image of the object is within at least one predefined range comprises determining if an intensity of a frequency associated with a Fourier transform of the image is within a predefined range.
  • Embodiment 11 The method according to any one of the preceding embodiments, wherein the determining if spectral radiance associated with the image of the object is within at least one predefined range comprises determining a mean spectral radiance value associated with at least one part of the image and comparing the mean spectral radiance value to the predefined range.
  • Embodiment 12 The method according to any one of the preceding embodiments, wherein the determining if spectral radiance associated with the image of the object is within at least one predefined range comprises determining a number of pixels associated with a spectral radiance deviating from the predefined range.
  • Embodiment 14 The method according to any one of the preceding embodiments, wherein the manipulating comprises increasing the contrast of at least a part of the image associated with the user by using at least one spectral radiance scaling factor.
  • Embodiment 19 The method according to any one of the preceding embodiments, wherein the device is selected from the group consisting of: a television device; a game console; a personal computer; a mobile device, particularly a cell phone, and/or a smart phone, and/or a tablet computer, and/or a laptop, and/or a tablet, and/or a virtual reality device, and/or a wearable, such as a smart watch; or another type of portable computer.
  • Embodiment 20 The method according to any one of the preceding method embodiments, wherein the method is computer-implemented.
  • Embodiment 23 Use of a device according to any one of the preceding embodiments referring to a device for authenticating a user for one or more of: in-car payment; vehicle access; starting a vehicle; access control, such as for at least one resource of a mobile device such as a mobile phone; in-cabin sensing; building access; at least one payment process; unlocking of at least one electronic device.
  • Embodiment 24 A computer program comprising instructions which, when the program is executed by the device according to any one of the preceding embodiments referring to a device, cause the device to perform the method according to any one of the preceding embodiments referring to a method.
  • Embodiment 25 A computer-readable storage medium comprising instructions which, when the instructions are executed by the device according to any one of the preceding embodiments referring to a device cause the device to perform the method according to any one of the preceding embodiments referring to a method.
  • Embodiment 26 A non-transient computer-readable medium including instructions that, when executed by one or more processors, cause the one or more processors to perform the method according to any one of the preceding embodiments referring to a method.
  • Figure 1 shows an embodiment of a device for authenticating a user of the device to perform at least one operation on the device that requires authentication in a schematic view
  • the device 110 comprises at least one communication interface 114 configured for receiving at least one request to access at least one resource.
  • the light pattern may comprise at least one regular and/or constant and/or periodic pattern, such as a triangular pattern, a rectangular pattern, a hexagonal pattern or a pattern comprising further convex tilings.
  • the light pattern is a hexagonal pattern, preferably a hexagonal light pattern, preferably a 2/5 hexagonal light pattern. Using a periodic 2/5 hexagonal pattern can allow distinguishing between artefacts and usable signal; a coordinate sketch of such a lattice follows after this passage.
  • the pattern illumination source 118 may comprise at least one emitter, in particular a plurality of emitters.
  • the emitter may be selected from the group consisting of: at least one laser source; at least one vertical cavity surface emitting laser (VCSEL); at least one light emitting diode; at least one edge emitter.
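For illustration, the spot centres of a regular hexagonal lattice, such as the hexagonal light pattern described above, can be generated as follows. This is a minimal sketch assuming NumPy; it does not model the 2/5 duty variant, the emitter array or any projector optics:

    import numpy as np

    def hex_lattice(rows, cols, pitch=1.0):
        # spot centres of a regular hexagonal lattice: every second row is
        # shifted by half a pitch, and rows are spaced pitch * sqrt(3) / 2 apart
        pts = []
        for r in range(rows):
            x0 = 0.5 * pitch if r % 2 else 0.0
            for c in range(cols):
                pts.append((x0 + c * pitch, r * pitch * np.sqrt(3) / 2.0))
        return np.array(pts)

    print(hex_lattice(4, 5)[:3])  # first three spot coordinates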
  • the illumination source 116 may additionally comprise flood illumination source 120 configured for emitting flood light.
  • the flood light may comprise substantially continuous spatial illumination, in particular diffuse and/or uniform illumination.
  • the flood illumination source 120 may comprise at least one emitter, in particular a plurality of emitters.
  • the flood illumination source 120 may comprise at least one LED or at least one VCSEL, preferably a plurality of VCSELs. The plurality of VCSELs may overlap to form a uniform illumination area.
  • the emitting of the flood light and the illumination of the light pattern may be performed subsequently or at least partially overlapping in time.
  • the flood light and the light pattern may be emitted at the same time.
  • one of the flood light or the light pattern may be emitted with a lower intensity compared to the other one.
  • the device 110 further comprises at least one camera 122 configured for generating at least one image of the object while the object is being illuminated by the light, specifically by at least one of the flood light and the light pattern.
  • the camera 122 may comprise at least one camera chip, such as at least one CCD chip and/or at least one CMOS chip configured for recording images.
  • the camera 122 may comprise a CMOS camera, specifically a camera comprising a CMOS chip being sensitive in the infrared spectral range.
  • the camera 122 may be an internal and/or external camera of the device 110.
  • the camera 122 may comprise an internal camera of the device 110, specifically a front camera, such as a selfie camera.
  • other and/or further cameras, such as external cameras accessed via a hardware and/or a software interface may also be feasible.
  • the device 110 may further comprise a display 124.
  • the display 124 may be or may comprise at least one organic light-emitting diode (OLED) display and/or at least one quantum-dot light emitting diode (QLED) display.
  • the display 124 may be at least partially transparent in at least one continuous area covering the camera 122.
  • the camera 122, the illumination source 116, specifically the pattern illumination source 118 and the flood illumination source 120 may be arranged behind the display 124.
  • the light may traverse the display 124 while being illuminated from the illumination source 116.
  • the display 124 may be at least partially transparent.
  • the display 124 may be at least partially transparent in the at least one continuous area covering the pattern illumination source 118 and/or the flood illumination source 120 and/or the camera 122.
  • the display 124 may comprise a punch hole in the continuous area covering the pattern illumination source 118, the flood illumination source 120 and/or the camera 122.
  • the display 124 may have a transmission > 10 %, preferably > 15 %, more preferably > 20 %.
  • an intensity of a light beam after being projected through the display 124 may correspond to > 10 % of the intensity associated with the light beam when being emitted.
  • the device 110 further comprises at least one processing unit 126 configured for determining if spectral radiance associated with the image of the object is within at least one predefined range at least within tolerances.
  • the processing unit 126 is configured for manipulating spectral radiance, if the spectral radiance deviates from the predefined range, in at least a part of the image associated with the deviation.
  • the processing unit 126 is further configured for determining if the object associated with the image corresponds to a user and/or a living organism, wherein, in case the spectral radiance deviates from the predefined range, the manipulated image is used, or, in case the spectral radiance is within the predefined range, the generated image is used.
  • the processing unit 126 may comprise at least one processor 128.
  • the device 110 further comprises at least one authentication unit 130 configured for allowing to access the resource based on determining that the object corresponds to a user and/or a living organism.
  • the device 110 may further comprise at least one authorization unit 132 configured for allowing the user to perform at least one operation on the device 110, e.g. unlocking the device 110, in case of successful authentication of the user, or for denying the user permission to perform at least one operation on the device 110 in case of non-successful authentication.
  • the authorization unit 132 and the authentication unit 130 may be embodied integral, e.g. by using the same processor 128.
  • the device 110 may specifically be configured for performing a method for authenticating a user according to the present invention, such as the exemplary embodiment shown in Figure 2 and/or according to any other embodiment disclosed herein.
  • Figure 2 shows a flowchart of an exemplary embodiment of a method for authenticating a user of a device 110.
  • a device 110 according to the exemplary embodiment of Figure 1 may be used.
  • the spectral radiance associated with the image of the object may be outside the predefined range.
  • the presence of the ambient light may be determined according to predefined frequencies in the image having an intensity above the predefined range. As an example, if an occurrence of predefined frequencies in the image is above the predefined range, the spectral radiance associated with the image of the object is determined to be outside the predefined range, e.g. indicating presence of ambient light due to sun light in the image.
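A minimal sketch of such a frequency-domain check, assuming NumPy and using a hypothetical frequency offset and intensity limit (neither value is taken from the disclosure), might read:

    import numpy as np

    def frequency_intensity_ok(img, freq_uv, limit):
        # centred magnitude spectrum; strong ambient light such as sunlight
        # can raise the intensity at predefined spatial frequencies
        spec = np.abs(np.fft.fftshift(np.fft.fft2(img)))
        cy, cx = spec.shape[0] // 2, spec.shape[1] // 2
        v, u = freq_uv  # frequency offset relative to the spectrum centre
        return float(spec[cy + v, cx + u]) <= limit

    img = np.random.rand(240, 320)
    print(frequency_intensity_ok(img, (10, 10), limit=5e3))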
  • the determining if spectral radiance associated with the image of the object is within at least one predefined range may comprise determining a mean spectral radiance value associated with at least one part of the image and comparing the mean spectral radiance value to the predefined range.
  • the mean spectral radiance value may comprise an arithmetic mean of spectral radiance values associated with the part of the image, e.g. an arithmetic mean of spectral radiance values associated with each pixel in the part of the image.
  • a region of interest in the image may be determined, e.g. a region comprising the object in the image, and a mean spectral radiance value of the region of interest may be compared to a predefined threshold value.
  • the determining if spectral radiance associated with the image of the object is within at least one predefined range may comprise determining a number of pixels associated with a spectral radiance deviating from the predefined range. For example, a number of pixels associated with a spectral radiance deviating from the predefined range may be determined and, if the number of pixels is above a predefined threshold value, the image may be evaluated to comprise ambient light and/or stray light.
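The mean-value check and the pixel-count check described above can be sketched together as follows; this is a minimal NumPy sketch, and the range bounds and the pixel threshold are hypothetical example values rather than values from the disclosure:

    import numpy as np

    def radiance_within_range(img, lo, hi, max_deviating_pixels):
        # check 1: mean spectral radiance of the image part within [lo, hi]
        mean_ok = lo <= float(img.mean()) <= hi
        # check 2: number of pixels whose value deviates from the range
        n_deviating = int(np.count_nonzero((img < lo) | (img > hi)))
        return mean_ok and n_deviating <= max_deviating_pixels

    roi = np.random.rand(128, 128)  # stand-in region of interest
    print(radiance_within_range(roi, lo=0.05, hi=0.95, max_deviating_pixels=200))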
  • the determining if spectral radiance associated with the image of the object is within at least one predefined range may comprise detecting a diffraction pattern and comparing the diffraction pattern to at least one reference diffraction pattern.
  • the reference diffraction pattern may comprise a diffraction pattern associated with the light and electronics of the device 110 comprising the illumination source 116.
  • if the detected diffraction pattern deviates from the reference diffraction pattern, the image may be evaluated to comprise ambient light and/or stray light.
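One possible way to compare a detected diffraction pattern against the reference is a normalised cross-correlation. The disclosure does not prescribe a specific comparison metric, so the following NumPy sketch, with a hypothetical score threshold, is only an assumption:

    import numpy as np

    def matches_reference(pattern, reference, threshold=0.9):
        # normalised cross-correlation between the detected diffraction pattern
        # and the stored reference; a low score hints at ambient or stray light
        p = (pattern - pattern.mean()) / (pattern.std() + 1e-9)
        r = (reference - reference.mean()) / (reference.std() + 1e-9)
        return float((p * r).mean()) >= threshold

    reference = np.random.rand(64, 64)
    detected = reference + 0.01 * np.random.rand(64, 64)
    print(matches_reference(detected, reference))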
  • the method comprises, if the spectral radiance deviates from the predefined range, manipulating spectral radiance in at least a part of the image associated with the deviation.
  • the manipulating may comprise increasing the contrast of at least a part of the image associated with the user by using at least one spectral radiance scaling factor.
  • the manipulating may comprise increasing the contrast of the part of the image showing the user’s face by using the spectral radiance scaling factor.
  • the spectral radiance scaling factor may specifically comprise a constant factor configured for scaling spectral radiance values. The increase in spectral radiance may specifically be useful in case the sun and/or a reflection of the sun is present in the image.
  • a first part of the image may show the sun and a second part of the image may show at least a part of the user.
  • the contrast in the second part may be scaled such that the highest spectral radiance value in the second part is equal to the highest spectral radiance value in the first part by multiplying the highest spectral radiance value in the second part by the spectral radiance scaling factor.
  • the remaining spectral radiance values in the second part may likewise be multiplied by the spectral radiance scaling factor.
  • the scaling may result in an increased contrast, enhancing the authentication of the user based on the image, as illustrated in the sketch below.
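A minimal sketch of this scaling, assuming NumPy arrays for the two image parts and treating the split into a sun region and a user region as given, might read:

    import numpy as np

    def scale_user_part(user_part, sun_part):
        # constant factor chosen so that the highest spectral radiance value in
        # the user part equals the highest value in the part showing the sun;
        # every value in the user part is multiplied by the same factor
        factor = float(sun_part.max()) / max(float(user_part.max()), 1e-9)
        return user_part * factor, factor

    sun_part = np.full((32, 32), 0.98)        # stand-in: first part, containing the sun
    user_part = 0.2 * np.random.rand(64, 64)  # stand-in: second part, showing the user
    scaled, factor = scale_user_part(user_part, sun_part)
    print(factor, float(scaled.max()))        # scaled.max() now equals sun_part.max()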
  • the manipulating may comprise identifying stray light and/or ambient light features in the image by comparing the image to at least one stray light and/or ambient light image.
  • the manipulating may further comprise subtracting the stray light and/or ambient light image from the image.
  • the subtracting of the stray light and/or ambient light from the image may result in an elimination of stray light and/or ambient light features.
  • the one or more stray light and/or ambient light features may specifically be identified by comparing the image with the stray light and/or ambient light image.
  • the stray light and/or ambient light image may comprise one or more illumination features resulting from the projection of the light emitted by the illumination source 116.
  • a resulting image may be obtained comprising the user features only.
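A minimal sketch of the subtraction step, assuming the stray light and/or ambient light image is available as a NumPy array of the same shape as the captured image, might read:

    import numpy as np

    def subtract_stray(img, stray_img):
        # subtract the stored stray/ambient light image and clip at zero,
        # ideally leaving only the features caused by the illuminated user
        return np.clip(img - stray_img, 0.0, None)

    img = np.random.rand(240, 320)
    stray_img = np.full_like(img, 0.1)  # stand-in stray/ambient light image
    print(float(subtract_stray(img, stray_img).min()))  # >= 0.0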
  • the manipulating may comprise identifying stray light and/or ambient light features in the image.
  • the identifying may comprise using a stray light map being indicative of stray light within the image.
  • the stray light map may be generated and/or may be retrieved, e.g. from a local storage of the device 110 and/or via a communication interface 114 of the device 110.
  • the identifying may comprise identifying an ambient light source within the image and verifying the presence of the ambient light source using information obtained by the at least one sensor 133 of the device 110.
  • the information may be one or more of position information, orientation information, or time.
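A minimal sketch combining a stray light map with a sensor-based plausibility check, where both the boolean map and the flag derived from the sensor 133 information are hypothetical stand-ins, might read:

    import numpy as np

    def suppress_mapped_stray(img, stray_map, ambient_plausible):
        # stray_map: boolean map marking pixels known to carry stray light;
        # ambient_plausible: flag derived from position/orientation/time data
        # confirming that an ambient light source is physically plausible
        if not ambient_plausible:
            return img
        out = img.copy()
        out[stray_map] = 0.0  # zero out the identified stray light pixels
        return out

    img = np.random.rand(120, 160)
    stray_map = np.zeros(img.shape, dtype=bool)
    stray_map[:10, :10] = True  # stand-in map of a stray light region
    print(float(suppress_mapped_stray(img, stray_map, True).sum()) < float(img.sum()))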

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Security & Cryptography (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Computer Hardware Design (AREA)
  • General Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Biomedical Technology (AREA)
  • Computing Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Software Systems (AREA)
  • Image Input (AREA)

Abstract

Method and device for authenticating a user of a device (110), the method comprising at least the steps of: i. receiving at least one request to access at least one resource, ii. in response to receiving the request to access the resource, triggering the illumination of at least one object with light emitted by at least one illumination source (116), and iii. triggering the generation of at least one image of the object while the object is being illuminated by the light, iv. determining if spectral radiance associated with the image of the object is within at least one predefined range at least within tolerances, wherein, if the spectral radiance deviates from the predefined range, spectral radiance is manipulated in at least a part of the image associated with the deviation, v. triggering the determination of whether the object associated with the image corresponds to a user and/or a living organism, wherein, in case the spectral radiance deviates from the predefined range, the manipulated image is used, or, in case the spectral radiance is within the predefined range, the generated image is used, vi. allowing access to the resource based on determining that the object corresponds to a user and/or a living organism.
PCT/EP2025/054666 2024-02-23 2025-02-21 Method for authenticating a user of a device Pending WO2025176821A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP24159270 2024-02-23
EP24159270.8 2024-02-23

Publications (1)

Publication Number Publication Date
WO2025176821A1 true WO2025176821A1 (fr) 2025-08-28

Family

ID=90057552

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2025/054666 2024-02-23 2025-02-21 Method for authenticating a user of a device

Country Status (1)

Country Link
WO (1) WO2025176821A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018091638A1 (fr) 2016-11-17 2018-05-24 Trinamix Gmbh Détecteur pour détecter optiquement au moins un objet
US20200082157A1 (en) * 2018-09-11 2020-03-12 Apple Inc. Periocular facial recognition switching
US20200184059A1 (en) * 2017-09-07 2020-06-11 Beijing Sensetime Technology Development Co., Ltd. Face unlocking method and apparatus, and storage medium
WO2020187719A1 (fr) 2019-03-15 2020-09-24 Trinamix Gmbh Détecteur permettant d'identifier au moins une propriété de matériau
US20230104411A1 (en) * 2021-10-06 2023-04-06 Axis Ab Method and system for stray light compensation
WO2023156469A1 (fr) 2022-02-15 2023-08-24 Trinamix Gmbh Système et procédé permettant de déterminer un matériau d'un objet
WO2023156315A1 (fr) 2022-02-15 2023-08-24 Trinamix Gmbh Authentification de visage comprenant des données de matériau extraites d'une image

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018091638A1 (fr) 2016-11-17 2018-05-24 Trinamix Gmbh Détecteur pour détecter optiquement au moins un objet
WO2018091640A2 (fr) 2016-11-17 2018-05-24 Trinamix Gmbh Détecteur pouvant détecter optiquement au moins un objet
WO2018091649A1 (fr) 2016-11-17 2018-05-24 Trinamix Gmbh Détecteur destiné à la détection optique d'au moins un objet
US20200184059A1 (en) * 2017-09-07 2020-06-11 Beijing Sensetime Technology Development Co., Ltd. Face unlocking method and apparatus, and storage medium
US20200082157A1 (en) * 2018-09-11 2020-03-12 Apple Inc. Periocular facial recognition switching
WO2020187719A1 (fr) 2019-03-15 2020-09-24 Trinamix Gmbh Détecteur permettant d'identifier au moins une propriété de matériau
US20230104411A1 (en) * 2021-10-06 2023-04-06 Axis Ab Method and system for stray light compensation
WO2023156469A1 (fr) 2022-02-15 2023-08-24 Trinamix Gmbh Système et procédé permettant de déterminer un matériau d'un objet
WO2023156315A1 (fr) 2022-02-15 2023-08-24 Trinamix Gmbh Authentification de visage comprenant des données de matériau extraites d'une image

Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
"High Dynamic Range Imaging - 02 Light and Color", 31 December 2006, ISBN: 978-0-12-585263-0, article REINHARD ERIK ET AL: "High Dynamic Range Imaging - 02 Light and Color", pages: 19 - 83, XP093179621, DOI: https://doi.org/10.1016/B978-012585263-0/50003-8 *
C. SZEGEDY ET AL.: "Going deeper with convolutions", CORR, ABS/1409.4842, 2014
FLORIAN SCHROFFDMITRY KALENICHENKOJAMES PHILBIN: "FaceNet: A Unified Embedding for Face Recognition and Clustering", ARXIV:1503.03832
G. B. HUANGM. RAMESHT. BERGE. LEARNED-MILLER: "Technical Report", vol. 07, October 2007, UNIVERSITY OF MASSACHUSETTS, article "Labeled faces in the wild: A database for studying face recognition in unconstrained environments", pages: 49
KOLB C ET AL: "A REALISTIC CAMERA MODEL FOR COMPUTER GRAPHICS", SIGGRAPH '95: PROCEEDINGS OF THE 22ND ANNUAL CONFERENCE ON COMPUTER GRAPHICS AND INTERACTIVE TECHNIQUES; [COMPUTER GRAPHICS PROCEEDINGS (SIGGRAPH)], ACM, NEW YORK, NY, USA, 6 August 1995 (1995-08-06), pages 317 - 324, XP000546243, ISBN: 978-0-89791-701-8 *
L. WOLFT. HASSNERI. MAOZ: "Face recognition in unconstrained videos with matched background similarity", IEEE CONF. ON CVPR, 2011
M. D. ZEILERR. FERGUS: "Visualizing and understanding convolutional networks", CORR, ABS/1311.2901, 2013

Similar Documents

Publication Publication Date Title
US20250094553A1 (en) Face authentication including material data extracted from image
US12288421B2 (en) Optical skin detection for face unlock
CN112232163A (zh) 指纹采集方法及装置、指纹比对方法及装置、设备
CN112232159B (zh) 指纹识别的方法、装置、终端及存储介质
US20250078567A1 (en) Face authentication including occlusion detection based on material data extracted from an image
WO2025040591A1 (fr) Rugosité cutanée en tant qu'élément de sécurité pour déverrouillage facial
WO2024231531A1 (fr) Projecteur à delo
WO2025176821A1 (fr) Procédé d'authentification d'un utilisateur d'un dispositif
US20250095335A1 (en) Image manipulation for material information determination
EP4530666A1 (fr) Projecteur 2in1 à vcsel polarisés et diviseur de faisceau
WO2025162970A1 (fr) Système d'imagerie
WO2025040650A1 (fr) Synchronisation de mesure de référence d'authentification de visage
CN118696352A (zh) 用于识别显示设备的系统
WO2025012337A1 (fr) Procédé d'authentification d'un utilisateur d'un dispositif
WO2025172524A1 (fr) Analyse de profil de faisceau en combinaison avec des capteurs de temps de vol (tof)
WO2024200502A1 (fr) Élément de masquage
WO2025046067A1 (fr) Éléments optiques sur des vcsel d'inondation pour projecteurs 2in1
WO2025046063A1 (fr) Tomographie optique diffuse combinée à laser vcsel unique et lampe-projecteur à faisceau large
EP4666254A1 (fr) Authentification de diode électroluminescente organique (oled) derrière une oled
CN121195290A (zh) 结合oled的投影仪
WO2024170254A1 (fr) Système d'authentification pour véhicules
WO2023156478A1 (fr) Procédé de fonctionnement d'un dispositif d'affichage et dispositif d'affichage doté d'un processus d'authentification sécurisé
WO2025252691A1 (fr) Détermination d'une propriété de surface
WO2024170598A1 (fr) Authentification de diode électroluminescente organique (oled) derrière une oled
WO2025003364A1 (fr) Capteur rvb-ir à obturateur roulant

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 25707051

Country of ref document: EP

Kind code of ref document: A1