WO2025165256A1 - System and method for passive liveness detection
- Publication number
- WO2025165256A1 (PCT/RU2024/000026)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- transition points
- pattern
- biometric data
- sequence
- determining
- Prior art date
- Legal status (assumed; not a legal conclusion)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/40—Spoof detection, e.g. liveness detection
- G06V40/45—Detection of the body part being alive
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/145—Illumination specially adapted for pattern recognition, e.g. using gratings
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/64—Three-dimensional objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
Definitions
- the present invention generally relates to verification of the validity of biometric data. More specifically, the present invention relates to a computer-implemented passive liveness detection system with controlled illumination to assess the liveness of captured biometric data, for example an image of a user, without secondary requests to the user.
- Such approaches usually have relatively low accuracy for liveness detection and require secondary requests from the user or supplemental methods of verification.
- Low accuracy in liveness detection leaves computing apparatuses that perform user verification highly vulnerable and easily spoofed. This results in unpermitted access to the computing apparatus by wrongdoers, thereby causing substantial harm to the user and to a service provider, such as a banking institution, merchant, health care provider and the like.
- a system and method for passive liveness detection are provided.
- the system includes a device connected to the camera, a lighting source, at least one processor, and a non-transitory machine-readable medium having instructions stored therein which, when executed by the processor, cause the processor to perform operations.
- a method for passive liveness detection provides for detecting the biometric data, restricting the biometric data to a specific position, capturing the biometric data by the camera while projecting a random sequence illumination pattern from the light source thereby creating a captured image, determining transition points based on frames from the captured image, predicting a transition points pattern from the projected random sequence illumination pattern and the transition points, comparing the transition points pattern to a real random pattern sequence, and detecting whether an attack is launched based on the comparison of the transition points pattern to the real random pattern sequence.
- FIG. 1 depicts a diagram of an exemplary computer-implemented system to assess liveness of biometric data of a user in accordance with embodiments of this invention.
- FIG. 3 depicts examples of a random sequence illumination pattern in accordance with embodiments of this invention.
- FIG. 4 depicts a diagram of a method to determine average probabilities of the transition points in accordance with embodiments of this invention.
- FIG. 5 depicts a diagram of a method to assess liveness and detect an attack in accordance with embodiments of this invention.
- a computer-implemented passive liveness detection system uses a controlled illumination to assess liveness of a captured biometric data, preferably, without secondary requests from the user.
- the biometric data can be an image of the user obtained from a video or photograph.
- the present invention automatically assesses liveness of a captured biometric data (e.g., a user’s face) when a backlight randomized color scheme (“flickering”) is emitted on to the user’s face.
- the backlight scheme can be sequenced and/or timed.
- the liveness is determined from an image sequence (e.g., video or frame) of the captured biometric data based on the assessment of resulting parameters, such as three-dimensional (3D) structure, texture and albedo.
- the backlight randomized color scheme allows unique random verification instances for each liveness verification, effectively eliminating the possibility of “spoofing” that is required for presentation and composing attacks.
- assessment of the background realism can be performed by processing dynamics of the background color when the projected light changes.
- Intermediate or final results of the processes can be combined with a number of additional liveness assessments, features or processing results, which can include, but are not limited to: assessment of specular reflection and micromovements of the iris, biometric estimations (rPPG, respiratory assessment, involuntary micromovements, and the like), two-dimensional (2D) liveness on separate frames (i.e., liveness detection based on one RGB frame without any additional data), assessment of the background liveness by presence of local artifacts and global movements corresponding to presentation attacks, and combination and/or processing results of the above-mentioned methods.
- biometric data is exemplified in this disclosure via the user’s face. It is, however, understood that embodiments recited in this disclosure can include other biometric data, such as skin color or tone, blink dynamics, speech patterns, and chemical analysis from, for example, a breathalyzer or DNA sample. Furthermore, embodiments of this invention can facilitate securing access to a host site, the user’s personal financial or other information, or other generally protected data. As such, the embodiments are rooted in and/or tied to computer technology in order to overcome a problem specifically arising in the realm of computers, specifically authenticating access by a user.
- Embodiments of the present invention can facilitate validating a user, using a single-factor authentication method or as part of a multi-factor authentication method. Accordingly, embodiments of the present invention can facilitate restricting access or usage of the system by a specific user. For example, the user may want to allow only the user to log into a host system, such as a banking system, a remote access system, or any other such computer program products, and only from a user-designated location, such as the user’s home-office.
- a host system such as a banking system, a remote access system, or any other such computer program products
- embodiments of the present invention can be employed in conjunction with an independent validation of the user’s identity. That is, the biometric data confirms the actual identity of the user.
- the biometric data confirms the actual identity of the user.
- an original identity document for example passport, driving license, and/or debit card.
- Cross-checks to other databases such as driving license issuers, national passport or identity card issuers, credit scoring databases, and alerts of identity theft can be also used to verify the actual identity of the user.
- FIG. 1 illustrates an exemplary embodiment of the computer-implemented passive liveness detection system 100 with a controlled illumination to assess liveness of a captured biometric data.
- a computer system can include a device 120, such as a personal computer, mobile device (e.g., a mobile phone or tablet), workstation, embedded system or any other computer system for accessing information stored on a host system (not shown).
- the device 120 can include a processor and memory for executing and storing instructions, a software with one or more applications and an operating system, and a hardware with a processor, memory and/or graphical user interface display.
- the device 120 may also have multiple processors and multiple shared or separate memory components.
- the system 100 includes a front-facing camera 140.
- the camera 140 can be embedded into the device 120 (e.g., a mobile device camera) and disposed on the same side as a display screen 125. If a desktop computer is used as the device 120, the camera 140 is connected to it by a wired or wireless connection.
- the camera 140 may be a webcam, or a similar type of camera.
- devices 120 and camera 140 implementations exist, and the examples presented herein are intended to be illustrative only.
- the device 120 is configured to capture an image 130 via the camera 140 for accessing information stored on the host system.
- the image 130 is a digital image, stored and transmitted in one or more predetermined file formats, such as Portable Network Graphics (PNG), Joint Photographic Experts Group (JPEG), Tagged Image File Format (TIFF) or any other such file format.
- the image 130 can be a photo, sequence of images, or a video stream of the user.
- the device 120 further can include at least one illumination source 160.
- the illumination source 160 can be a front-facing illumination source from a mobile device, a PC display light or the like.
- the illumination source 160 has a frame rate of at least 20 frames per second (FPS) and a minimum video resolution of 1280×720 progressive scan (720p).
- the display screen 125 can also be used as the light source.
- the display screen 125 can be configured to change mode of illumination by changing contents displayed on the display screen 125.
- the illumination source 160 can project a specific time-base light sequence (“flickering”). That is, the illumination source can project a specific sequence of backlight color scheme onto, for example, the user’s face at the same time the image 130 is captured. This allows randomly generating a backlight pattern that creates unique random images 130 each time the image 130 is illuminated.
- the system 100 includes an implemented computer program 150 that can operate on the device 120 or via cloud computing services accessible via network connection. That is, the device 120 can be connected over a network to one or more servers 155.
- the implemented computer program 150 includes an authentication module 170.
- the implemented computer program 150 is configured to first detect the user’s face position, image data format, data compliance and connectivity compliance.
- the image data compliance can detect and then normalize the image data by filtering or requiring correction of inappropriate conditions (e.g., too bright, too dark conditions, contrast issues).
- the user can be provided with visual, voice and text aids for achieving a proper face positioning in front of the camera 140 and maintaining a specific face position 220, i.e., the user looks directly into the camera without turning away and does not make any movements.
- the user can receive a text message if the computer program 150 detects a foreign object or another person in the camera frame, or a request can be sent to adjust the brightness of the camera 140.
- a self-control zone can be displayed in the center of the screen of the device 120: an ellipse containing the central part of the frame, with a “progress bar” on its border.
- the specific face position 220 can be determined by restricting bounding box coordinates.
- in one embodiment, the center of the bounding box must lie between 40% and 60% of the resolution both vertically and horizontally, and the box must span at least 30% of the vertical resolution and 40% of the horizontal resolution while occupying a maximum of 80% of the vertical and horizontal resolution.
- in another embodiment, the center of the bounding box must lie between 30% and 70% of the resolution both vertically and horizontally, and the box must span at least 60% of the vertical resolution and 70% of the horizontal resolution while occupying a maximum of 100% of the vertical and horizontal resolution.
- Those restrictions can also include landmark-based rules and time-based rules (e.g. restricting motion of bounding box coordinates and/or landmarks).
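As an illustration, the bounding-box restrictions above can be sketched as a simple check. The function name and default values are assumptions (they follow the first, stricter set of restrictions), not code from the patent:

```python
def face_position_ok(box, frame_w, frame_h,
                     center_lo=0.40, center_hi=0.60,
                     min_w=0.40, min_h=0.30, max_frac=0.80):
    """Check the bounding-box restrictions described above.

    box is (x, y, w, h) in pixels. Defaults follow the stricter
    variant: center within 40-60% of the resolution, size at least
    30% vertical / 40% horizontal, at most 80% of both.
    """
    x, y, w, h = box
    cx = (x + w / 2) / frame_w
    cy = (y + h / 2) / frame_h
    if not (center_lo <= cx <= center_hi and center_lo <= cy <= center_hi):
        return False
    if w / frame_w < min_w or h / frame_h < min_h:
        return False
    if w / frame_w > max_frac or h / frame_h > max_frac:
        return False
    return True

# A centered 600x500 px face box in a 1280x720 frame passes:
assert face_position_ok((340, 110, 600, 500), 1280, 720)
# A box pushed into a corner fails:
assert not face_position_ok((0, 0, 600, 500), 1280, 720)
```

The same shape of check extends naturally to the landmark-based and time-based rules mentioned above.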
- the authentication module 170 generates a random sequence illumination pattern 230, which is communicated to the device 120 to create the image 130 using the random sequence illumination pattern 230.
- the random sequence illumination pattern 230 is essentially a specific sequence of backlight color scheme.
- the camera 140 has autoexposure and autofocus disabled, because a lack of backlight brightness (it must exceed the brightness of the ambient light) can cause a false alarm. It is also preferred that the image 130 is captured at the highest possible brightness of the camera 140 and that illumination is performed in such a way that the entire image is always illuminated by one optical illumination channel.
- the random sequence illumination pattern 230 can be described as dividing the screen into halves (horizontal or vertical) filled with stripes of different colors. By “rotating” these halves, the pattern allows projecting up to 10 different images per second. According to an embodiment of the present invention, four different images per second are used without discomfort to the user. Correspondingly, vertical and horizontal stripes alternate, which makes it possible both to establish the random pattern (preventing video/video-stream substitution during an attack) and to use the corresponding images to evaluate the corresponding components of the normals to the face surface.
- FIG. 3 illustrates examples of the random sequence illumination pattern 230.
- a first color is the color of the left half for a vertical instance, or the color of the upper half for a horizontal instance.
- a second color is, respectively, the color of the right or lower half of the same instance.
- a transition point 250, shown in FIG. 2 and used for further analysis by the authentication module 170, is a point at which the orientation of the illumination pattern has changed.
- FIG. 3 shows a non-randomized illumination pattern (a).
- the patterns (b) and (c) illustrate the examples of the randomized patterns according to the embodiment of the present invention.
- the duration of the image 130 is inversely proportional to the frequency of image change; for example, if the pattern changes 4 times per second, the duration of the image 130 is 250 milliseconds (ms).
- Image 130 is a composition of several images. More images in the composition make the algorithm more stable and resistant to attack, but less convenient for the user. The duration of illumination preferably does not exceed 5 seconds; according to an embodiment of the present invention, 2 seconds are used, from which it follows that image 130 is a composition of eight images.
- a single instance of creating the image 130 can take about two seconds and generates seven transition points 250.
- Different randomized variations of the random sequence illumination pattern 230 can be used to generate transition points 250.
- One of the variations is to keep the color scheme constant, that is, two fixed colors are used, for example, #00ffff and #ff00ff.
- the number of images is preferably eight.
- the orientation of the next image should be different from the previous image.
- the order of the colors is chosen randomly.
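The generation rules above (eight images, each image's orientation differing from the previous one, the order of the two fixed colors chosen randomly) can be sketched as follows. The function name and the tuple representation are illustrative assumptions, not the patent's implementation:

```python
import random

COLORS = ("#00ffff", "#ff00ff")  # the two fixed colors from the text

def generate_pattern(n_images=8, seed=None):
    """Generate a random illumination sequence. Each image is a
    (orientation, first_color, second_color) tuple; orientation
    strictly alternates (so each image differs from the previous
    one) and the color order is chosen randomly per image."""
    rng = random.Random(seed)
    first = rng.choice(("vertical", "horizontal"))
    seq = []
    for i in range(n_images):
        if i % 2 == 0:
            orient = first
        else:
            orient = "horizontal" if first == "vertical" else "vertical"
        c1, c2 = rng.sample(COLORS, 2)  # random left/top vs right/bottom order
        seq.append((orient, c1, c2))
    return seq

pattern = generate_pattern(seed=42)
assert len(pattern) == 8
# adjacent images always differ in orientation -> seven transition points
assert all(a[0] != b[0] for a, b in zip(pattern, pattern[1:]))
```

With eight images, this yields the seven transition points per capture mentioned above.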
- the frames of the image 130 are communicated to the authentication module 170, e.g., via the Internet (Ethernet, LAN/WLAN, Wi-Fi, mobile web or any other connection method) or via similar methods of connecting two different devices, e.g., Bluetooth® or an internal data bus.
- the authentication module 170 is configured to extract transition points 250 from the captured images 130.
- the transition points 250 are determined by evaluating image frames based on the random sequence illumination pattern 230. That is, in each frame of the patterned images the user’s face is detected and segmented, and average pixel values are calculated in these regions (per channel). A reciprocal change in the averaged pixel values suggests a change in the direction of illumination. Accordingly, using a lower threshold and non-maximum suppression on this criterion, the transition points 250 and the corresponding surrounding frames are determined.
- the left and right frames (3-5 frames) around the detected transition point 250 (with a gap of one in case of an incomplete screen transition, meaning that the transition was caught on two adjacent frames) are considered frame blocks used by a method 400 (shown in FIG. 4).
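A minimal sketch of the transition-point criterion described above, assuming the per-frame face-region channel means have already been computed. The vector-norm criterion and the threshold value are illustrative assumptions:

```python
import numpy as np

def detect_transitions(channel_means, threshold=0.1):
    """Detect transition points from per-frame averaged pixel values.

    channel_means: array of shape (n_frames, n_channels) holding the
    averaged face-region pixel values per channel. A transition is a
    frame-to-frame change whose magnitude exceeds `threshold` and is
    a local maximum of the criterion (non-maximum suppression).
    The threshold value here is illustrative, not from the patent.
    """
    means = np.asarray(channel_means, dtype=float)
    # criterion: magnitude of the per-channel change between frames
    diff = np.linalg.norm(np.diff(means, axis=0), axis=1)
    points = []
    for i, d in enumerate(diff):
        if d < threshold:
            continue  # below the lower threshold
        left = diff[i - 1] if i > 0 else -np.inf
        right = diff[i + 1] if i + 1 < len(diff) else -np.inf
        if d >= left and d >= right:  # non-maximum suppression
            points.append(i)  # transition between frame i and i+1
    return points

# Synthetic example: illumination flips between frames 3->4 and 7->8
means = np.array([[0.2, 0.8]] * 4 + [[0.8, 0.2]] * 4 + [[0.2, 0.8]] * 2)
assert detect_transitions(means) == [3, 7]
```

The surrounding 3-5 frames on each side of every detected index would then form the frame blocks consumed by method 400.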
- once the transition points 250 are determined, along with the transition point patterns (pairs of images: the image before a transition point and the image after it), they are compared to a real random pattern sequence 255.
- the real random pattern sequence 255 is the pattern that was used for illumination of the user while creating the image 130. If the comparison shows that the transition point patterns 252 and the real random pattern sequence 255 are dissimilar, this indicates that an attack is detected, and an alert is sent to the host. Conversely, if the comparison yields synchronized patterns, then the authentication module 170 performs additional analysis to determine liveness based on the probabilities for all transition points 250, as shown in FIG. 4.
- FIG. 4 illustrates the method 400 for evaluating the liveness of the image 130 by the authentication module 170.
- the transition points 250 are determined as also shown in FIG. 2.
- face detection and segmentation are performed on each frame of the image 130, as well as segmentation of the left and right halves of the face by key points.
- the average pixel values of these regions are calculated per channel.
- R_left represents the average intensity of the red channel of the frame, obtained from the left half of the face.
- R_right represents the average intensity of the red channel of the frame, obtained from the right half of the face.
- the same definitions are also used for the remaining colors: green (G) and blue (B).
- in step 420, frame blocks and colors are rearranged to provide a consistent look for different combinations of lighting patterns.
- the order can be determined by the feature of the color, namely, whether it is changeable or not.
- the image preprocessing is performed by centering and normalization by using the average pixel value and sample variance, respectively.
- in step 430, for each of the two preprocessed images, disparity maps are calculated: pixels are normalized by the value of the common channel, or z-normalized over all pixels of the face. Next, each pixel of the map is considered and the whole map is normalized.
- Channel1_ij denotes the (i, j)-th pixel value for one of the changing channels (colors).
- Channel2_ij denotes the second channel accordingly.
- the resulting maps represent an estimation of the albedo of the face and one of the components of the normals to the face surface.
- the albedo and the x-th component of the face surface normals are estimated from the image with a vertical pattern.
- the y-th component is estimated.
- the obtained paired disparity maps are used to fully evaluate the three-dimensional (3D) shape of the face in step 440.
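A rough sketch of one reading of the disparity-map computation, assuming the "common channel" is the color channel present in both backlight colors (blue, for #00ffff and #ff00ff). The function, the epsilon guard, and the combination of ratio- and z-normalization are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def disparity_map(frame, changing_ch, common_ch, eps=1e-6):
    """Sketch of a per-pixel disparity map: each changing-channel
    value is normalized by the common channel, then the whole map
    is z-normalized. frame has shape (H, W, 3), float values."""
    ratio = frame[..., changing_ch] / (frame[..., common_ch] + eps)
    return (ratio - ratio.mean()) / (ratio.std() + eps)

# With colors #00ffff and #ff00ff, blue is the always-on (common)
# channel; red and green are the changing channels.
rng = np.random.default_rng(0)
frame = rng.uniform(0.1, 1.0, size=(4, 4, 3))
d = disparity_map(frame, changing_ch=0, common_ch=2)
assert d.shape == (4, 4)
assert abs(d.mean()) < 1e-9 and abs(d.std() - 1.0) < 1e-4
```

Applied to the vertical-pattern image this yields the albedo estimate and the x-th normal component, and to the horizontal-pattern image the y-th component, matching the text above.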
- in step 450, the resulting maps (the relative difference map and the two disparity maps) are used to build histograms of pixel value distributions for the extracted maps obtained from the block. Additionally, histograms of average values for rows and columns are calculated (i.e., projections on the x-th and y-th components of the obtained maps).
- in step 460, the image embedding obtained from each block is the combination of all histograms of the block, which are additionally smoothed by a Gaussian function or an averaging window.
- the corresponding embeddings are passed to the random forest classifier, and the verdict is the label with the maximum probability; the training dataset consists of videos passed through the above process.
- the embeddings of each detected transition are labeled real, display, paper and the like for each type of attack.
- the embeddings can also be used for backlight image classification: they are fed into the classifier to determine the likelihood of the pair of images that were backlighted in the block. Finally, there is an additional check that the sequence of backlight images has a sufficiently high likelihood; if the likelihood is low, the video is labeled as an attack.
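Steps 450-460 can be sketched as follows, using the averaging-window variant of the smoothing mentioned above. The function name, bin count, and window size are assumptions:

```python
import numpy as np

def block_embedding(maps, bins=32, win=5):
    """Build a block embedding as described: for each extracted map,
    a histogram of pixel values plus row-mean and column-mean
    profiles (projections on the y-th and x-th components), all
    concatenated and smoothed with an averaging window (a Gaussian
    kernel would also fit the text). `maps` is a list of 2D arrays."""
    parts = []
    for m in maps:
        hist, _ = np.histogram(m, bins=bins, density=True)
        parts.append(hist)
        parts.append(m.mean(axis=1))  # row means (projection on y)
        parts.append(m.mean(axis=0))  # column means (projection on x)
    emb = np.concatenate(parts)
    kernel = np.ones(win) / win  # averaging-window smoothing
    return np.convolve(emb, kernel, mode="same")

rng = np.random.default_rng(1)
maps = [rng.normal(size=(16, 16)) for _ in range(3)]  # 3 maps per block
emb = block_embedding(maps)
# 3 maps x (32-bin histogram + 16 row means + 16 column means)
assert emb.shape == (3 * (32 + 16 + 16),)
```

The resulting vector would then be fed to a classifier, e.g. a random forest, trained on embeddings labeled real, display, paper, and the like.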
- in the final step 470, the probabilities for all transition points 250 are averaged.
- the system 100 can be configured to support and supplement findings based on the method 400, by additional methods of liveness assessment.
- these methods can include determining an assessment of specular reflection and micromovements of the iris of the user, providing biometric estimations, such as rPPG, respiratory assessment and involuntary micro-movements.
- determining two-dimensional (2D) liveness on separate frames can also supplement the method described in this disclosure, for example, liveness detection based on one RGB frame without additional data, assessment of the background liveness by the presence of local artifacts and global movements corresponding to presentation attacks, and other methods for liveness detection.
- FIG. 5 is a flow diagram illustrating a computer-implemented method embodied in the system 100 shown in FIGS. 1 and 2 for assessing liveness of a captured biometric data (e.g., face of the user) using the system 100.
- the method 500 assumes access to the device 120 and the computer program 150.
- the user is prompted by the computer program 150 to restrict his or her face within the communicated parameters thereby achieving the specific face positioning 220.
- the image 130 is captured by the camera 140.
- the random sequence illumination pattern 230 is projected via illumination source 160 (e.g., mobile device LED light).
- the authentication module 170 of the computer program 150 receives the frames of the image 130 and evaluates them to determine the transition points 250. Once the transition points 250 are determined, in step 540 the transition points 250 and the image changes predicted from the projected random sequence illumination pattern 230 are compared to the real random pattern sequence 255. If the transition points and patterns are not similar to the real random pattern sequence 255, an attack is detected and the system 100 and/or the host are alerted.
- in step 550, the transition points 250 are used to assess liveness via the method 400, based on the assessment of the 3D structure, local texture and albedo of the face. Each transition point 250 is labeled according to the assessment and the type of attack: real, display, paper, and the like.
- the authentication module 170 detects, based on the assessment in step 540, whether the image 130 is an attack (e.g., paper, mask, photo and so on) or authentic (live). If an attack is detected, the system 100 alerts the user or the host in step 560. If possible, the type of the attack is also identified.
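The overall decision flow of steps 540 through 470 can be summarized in a short sketch. The probability threshold and the string labels are illustrative assumptions, not values from the patent:

```python
def assess_liveness(predicted_pattern, real_pattern, point_probs, threshold=0.5):
    """Sketch of the decision flow of method 500: first compare the
    transition-point pattern predicted from the captured frames to
    the real random pattern sequence; on mismatch, report an attack
    immediately (video/video-stream substitution). Otherwise average
    the per-transition-point liveness probabilities (step 470) and
    compare to a decision threshold (an assumed value)."""
    if predicted_pattern != real_pattern:
        return "attack: pattern mismatch"
    mean_p = sum(point_probs) / len(point_probs)
    return "live" if mean_p >= threshold else "attack"

real = ["v", "h", "v", "h", "v", "h", "v", "h"]
assert assess_liveness(["v", "h"] * 4, real, [0.9] * 7) == "live"
assert assess_liveness(["h", "v"] * 4, real, [0.9] * 7) == "attack: pattern mismatch"
assert assess_liveness(["v", "h"] * 4, real, [0.2] * 7) == "attack"
```

In the full system the per-point probabilities would come from the classifier of method 400, and a detected attack would trigger the alert of step 560.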
- the present invention can be a system, a method, and/or a computer program product.
- the computer program product can include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.
- the computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device.
- the computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
- a non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing.
- a computer readable storage medium is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
- Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network.
- the network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers.
- a network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
- Computer readable program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
- the computer readable program instructions may execute entirely on the user’s computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
- the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
- electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present invention.
- These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
- These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
- the computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
- each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s).
- the functions noted in the block may occur out of the order noted in the figures.
- two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
- each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration can be implemented by special purpose hardware- based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
Abstract
A system and method for assessing the liveness of a user's biometric data are provided. The system has a device connected to the camera, an illumination source, at least one processor, and a non-transitory machine-readable medium storing instructions that, when executed by the processor, cause the processors to perform operations. The operations comprise: detecting the biometric data; restricting the biometric data to a specific position; capturing the biometric data with the camera while projecting a random-sequence illumination pattern from the light source, thereby creating a captured image; determining transition points on the basis of frames from the captured image; predicting a transition-point pattern from the projected random-sequence illumination pattern and the transition points; comparing the transition-point pattern with an actual random pattern sequence; and detecting whether an attack is being launched on the basis of the comparison of the transition-point pattern with the actual random pattern sequence.
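The operations in the abstract amount to a challenge–response check: project a random illumination sequence, find the frames where scene brightness toggles, and compare those observed transitions against the transitions the projected sequence implies. The following is a minimal sketch of that comparison, not the patented implementation; all function names, the brightness threshold, and the mismatch tolerance are illustrative assumptions.

```python
# Illustrative sketch of random-sequence illumination liveness checking.
# All names and thresholds are hypothetical, not taken from the patent.
import random


def project_pattern(length, seed=None):
    """Generate a random on/off illumination sequence (1 = flash on)."""
    rng = random.Random(seed)
    return [rng.randint(0, 1) for _ in range(length)]


def transition_points(frame_brightness, threshold=10.0):
    """Indices of frames whose mean brightness jumped, i.e. the light toggled."""
    return [
        i
        for i in range(1, len(frame_brightness))
        if abs(frame_brightness[i] - frame_brightness[i - 1]) >= threshold
    ]


def expected_transitions(pattern):
    """Transition points implied by the projected sequence itself."""
    return [i for i in range(1, len(pattern)) if pattern[i] != pattern[i - 1]]


def is_attack(pattern, frame_brightness, tolerance=1):
    """Flag a spoof when observed transitions diverge from the projection."""
    observed = set(transition_points(frame_brightness))
    expected = set(expected_transitions(pattern))
    mismatches = len(observed ^ expected)  # symmetric difference
    return mismatches > tolerance
```

A live face reflects the flashes, so its per-frame brightness tracks the projected pattern; a replayed video or printed photo does not, and the symmetric difference between observed and expected transition points grows accordingly.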
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/RU2024/000026 WO2025165256A1 (fr) | 2024-02-02 | 2024-02-02 | System and method for passive liveness detection |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/RU2024/000026 WO2025165256A1 (fr) | 2024-02-02 | 2024-02-02 | System and method for passive liveness detection |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025165256A1 (fr) | 2025-08-07 |
Family
ID=96590834
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/RU2024/000026 Pending WO2025165256A1 (fr) | System and method for passive liveness detection | 2024-02-02 | 2024-02-02 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025165256A1 (fr) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| RU2741768C2 (ru) * | 2016-06-30 | 2021-01-28 | Koninklijke Philips N.V. | Method and device for face detection/recognition systems |
| US20210182584A1 (en) * | 2019-12-17 | 2021-06-17 | Daon Holdings Limited | Methods and systems for displaying a visual aid and enhancing user liveness detection |
| US20210216800A1 (en) * | 2020-01-09 | 2021-07-15 | AuthenX Inc. | Liveness detection apparatus, system and method |
| RU2770752C1 (ru) * | 2018-11-16 | 2022-04-21 | Bigo Technology Pte. Ltd. | Method and apparatus for training a face recognition model, and device for determining a face key point |
| US20220375259A1 (en) * | 2013-05-31 | 2022-11-24 | IDMission LLC | Artificial intelligence for passive liveness detection |
| US20230206700A1 (en) * | 2021-12-29 | 2023-06-29 | Elm | Biometric facial recognition and liveness detector using ai computer vision |
2024
- 2024-02-02 WO PCT/RU2024/000026 patent/WO2025165256A1/fr active Pending
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6487105B2 (ja) | System and method for authorizing access to an access-controlled environment | |
| KR102324706B1 (ko) | Face recognition unlocking method and apparatus, device, and medium | |
| US10924476B2 (en) | Security gesture authentication | |
| US11023757B2 (en) | Method and apparatus with liveness verification | |
| De Marsico et al. | Firme: Face and iris recognition for mobile engagement | |
| US11093770B2 (en) | System and method for liveness detection | |
| US12210605B2 (en) | Spoof detection using illumination sequence randomization | |
| WO2020034733A1 (fr) | Identity authentication method and apparatus, electronic device, and storage medium | |
| US11373449B1 (en) | Systems and methods for passive-subject liveness verification in digital media | |
| TW202026948A (zh) | Liveness detection method, apparatus, and storage medium | |
| CN113205057B (zh) | Face liveness detection method, apparatus, device, and storage medium | |
| CN106663157A (zh) | User authentication method, device for performing the method, and recording medium storing the method | |
| CN112868028A (zh) | Spoof detection using iris images | |
| CN108833359A (zh) | Identity verification method, apparatus, device, storage medium, and program | |
| CN111144277B (zh) | Face verification method and system with liveness detection | |
| WO2019200872A1 (fr) | Authentication method and apparatus, electronic device, computer program, and storage medium | |
| KR20220037995A (ko) | Visual encoding registration and processing platform | |
| CN111756951B (zh) | Biometric authentication system, method, and computer-readable storage medium | |
| CN112395580A (zh) | Authentication method, apparatus, system, storage medium, and computer device | |
| US12277804B2 (en) | Spoof detection using catadioptric spatiotemporal corneal reflection dynamics | |
| WO2025165256A1 (fr) | System and method for passive liveness detection | |
| Bashier et al. | Graphical password: Pass-images Edge detection | |
| KR102579610B1 (ko) | ATM abnormal behavior detection apparatus and method of driving the apparatus | |
| US20250104479A1 (en) | Injection and Other Attacks | |
| Sanh et al. | Effective know-your-customer method for secure and trustworthy non-fungible tokens in media assets |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24922616; Country of ref document: EP; Kind code of ref document: A1 |