WO2021072673A1 - Method and apparatus for user identity recognition, and electronic device - Google Patents
Method and apparatus for user identity recognition, and electronic device
- Publication number
- WO2021072673A1 (PCT/CN2019/111442, CN2019111442W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- face image
- stored
- identity information
- facial
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- G06V40/171—Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/193—Preprocessing; Feature extraction
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/197—Matching; Classification
Definitions
- the embodiments of the present application relate to the field of electronic information technology, and in particular, to a method, device, and electronic equipment for user identification.
- face recognition is applied in various fields to realize the identification of users.
- face recognition identifies the user based on the user's facial features, and those facial features are prone to change due to ageing, facial reshaping, and similar causes. If the user's facial features change, face recognition can easily fail, and the user's identity then cannot be accurately recognized. Therefore, how to overcome the influence of changes in facial features so as to accurately identify the user's identity is a technical problem that urgently needs to be solved.
- the embodiments of the present application aim to provide a user identification method, device, and electronic equipment, which can improve the accuracy of user identification.
- a technical solution adopted in the embodiments of the present application is to provide a user identification method, including:
- the method further includes:
- An identity database is established in advance, and the identity database includes pre-stored identity information and pre-stored face images corresponding to the pre-stored identity information; then,
- the identifying the identity information of the user according to the facial feature specifically includes:
- if a pre-stored face image matching the facial features is recognized, it is determined that the identity recognition is successful, and the pre-stored identity information corresponding to that pre-stored face image is determined as the identity information of the user;
- the method further includes:
- the face image is stored as a backup face image corresponding to the identity information of the user.
- when storing the face image as a backup face image corresponding to the identity information of the user, the method further includes:
- the pre-stored face image is set as priority matching
- the standby facial image is set as a priority matching.
- the pre-stored face image is deleted.
- the extracting the eyeball feature of the user from the face image specifically includes:
- the identity database also includes a pre-stored iris feature code corresponding to the pre-stored identity information
- the extracting the eyeball features of the user from the eye image specifically includes:
- the identifying the identity information of the user according to the eyeball feature specifically includes:
- if a pre-stored iris feature code that matches the iris feature code is recognized, it is determined that the identity recognition is successful, and the pre-stored identity information corresponding to that pre-stored iris feature code is determined as the identity information of the user;
- if no pre-stored iris feature code matching the iris feature code is recognized, it is determined that the identity recognition fails.
- a user identification device including:
- the acquisition module is used to acquire the user's face image
- the first extraction module is configured to extract the facial features of the user from the face image
- the first recognition module is configured to recognize the identity information of the user according to the facial feature
- the second extraction module is configured to extract the eyeball features of the user from the face image when the identity recognition fails;
- the second identification module is used to identify the user's identity information according to the eyeball characteristics.
- the device further includes:
- the establishment module is used to establish an identity database in advance, the identity database including pre-stored identity information and pre-stored face images corresponding to the pre-stored identity information; then,
- the first identification module is specifically configured to:
- if a pre-stored face image matching the facial features is recognized, it is determined that the identity recognition is successful, and the pre-stored identity information corresponding to that pre-stored face image is determined as the identity information of the user;
- the device further includes:
- a determining module configured to determine whether the face image is occluded when the recognition fails according to the facial feature and the identity information of the user is successfully recognized according to the eyeball feature;
- the face image is stored as a backup face image corresponding to the identity information of the user.
- when storing the face image as a backup face image corresponding to the identity information of the user, the determining module is further configured to:
- the pre-stored face image is set as priority matching
- the standby facial image is set as a priority matching.
- the device further includes:
- the deleting module is configured to delete the pre-stored face image if the matching degree of the pre-stored face image is less than the matching degree of the standby face image within the preset number of matching times.
- the second extraction module specifically includes:
- An interception module for intercepting the eye image of the user in the face image
- the third extraction module is used to extract the eyeball features of the user from the eye image.
- the identity database also includes a pre-stored iris feature code corresponding to the pre-stored identity information
- the third extraction module is specifically used for:
- the second identification module is specifically configured to:
- if a pre-stored iris feature code that matches the iris feature code is recognized, it is determined that the identity recognition is successful, and the pre-stored identity information corresponding to that pre-stored iris feature code is determined as the identity information of the user;
- if no pre-stored iris feature code matching the iris feature code is recognized, it is determined that the identity recognition fails.
- an electronic device including:
- At least one processor and
- the device can be used to perform the methods described above.
- another technical solution adopted by the embodiments of the present application is to provide a computer program product containing program code which, when the computer program product runs on an electronic device, causes the electronic device to execute the method described above.
- the embodiments of the present application provide a user identity recognition method, device, and electronic equipment.
- in the user identity recognition method, the user's face image is obtained, the user's facial features are extracted from the acquired face image, and the user's identity information is identified based on the extracted facial features. If the identity recognition fails, the user's eyeball features are extracted from the acquired face image, and the user's identity information is identified based on the extracted eyeball features. That is, the user identity recognition method can recognize the user's identity information through the user's eyeball features after failing to recognize it based on the facial features.
- in this way, the eyeball features can be used to accurately determine the user's identity, improving the accuracy of user identification.
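As a purely illustrative summary of the two-stage flow described above, the following sketch shows the control flow only; the two matcher callables are hypothetical placeholders for the facial-feature and eyeball-feature matching steps, not an implementation prescribed by this application.

```python
from typing import Callable, Optional

def identify_user(
    face_image: object,
    match_by_face: Callable[[object], Optional[str]],
    match_by_eyeball: Callable[[object], Optional[str]],
) -> Optional[str]:
    """Two-stage identification: facial features first, eyeball features as fallback.

    Both matchers are assumed helpers that return the matched identity
    information, or None when recognition fails.
    """
    identity = match_by_face(face_image)
    if identity is not None:
        return identity  # recognized by facial features
    # Facial recognition failed (e.g. ageing, facial reshaping, occlusion):
    # fall back to eyeball features extracted from the same face image.
    return match_by_eyeball(face_image)
```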
- FIG. 1 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
- FIG. 2 is a schematic flowchart of a user identity identification method provided by an embodiment of the present application
- FIG. 3 is a schematic structural diagram of a user identification device provided by an embodiment of the present application.
- FIG. 4 is a schematic structural diagram of a user identification device provided by another embodiment of the present application.
- FIG. 5 is a schematic structural diagram of a user identification device provided by another embodiment of the present application.
- FIG. 6 is a schematic structural diagram of a user identification device provided by still another embodiment of the present application.
- Figure 7 is a schematic structural diagram of a second extraction module
- FIG. 8 is a schematic diagram of the hardware structure of an electronic device provided by an embodiment of the present application.
- This application provides a user identity recognition method and device, which are applied to electronic equipment, so that when the electronic device fails to recognize the user's identity information based on facial features, it can extract the user's eyeball features from the acquired face image and identify the user's identity information based on those eyeball features. This avoids situations in which the user's identity cannot be accurately determined because the user's face has changed due to ageing, facial plastic surgery, and so on, and improves the accuracy of user identity recognition.
- the electronic device is a device with an image acquisition function
- the electronic device may be a camera module, a robot provided with a camera module, or an intelligent terminal provided with a camera module.
- FIG. 1 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
- the electronic device includes: an image acquisition unit 100, a display unit 200, and a control unit 300.
- the image capture unit 100 is used to capture video images of the surrounding environment. Based on this, when the user performs identity recognition on the electronic device, the image capture unit 100 can capture the user's face image.
- the image acquisition unit 100 is in communication connection with the control unit 300, and the control unit 300 can acquire the user's face image from the image acquisition unit 100.
- the display unit 200 is used to interact with the user. Specifically, the display unit 200 can display text reminders to the user.
- the display unit 200 is in communication connection with the control unit 300, and the control unit 300 can control the display unit 200 to display text reminders to the user.
- the control unit 300 is used to execute the user identity recognition method, so as to extract the user's eyeball features from the acquired face image when the user's identity information fails to be recognized based on the facial features, and to identify the user's identity information based on the eyeball features, thereby improving the accuracy of user identification.
- the electronic device may further include: a loudspeaker device, the loudspeaker device can send out a sound reminder to the user.
- the loudspeaker device is in communication connection with the control unit 300, and the control unit 300 can control the loudspeaker device to issue a sound reminder to the user.
- FIG. 2 is a schematic flowchart of a user identification method provided by an embodiment of the present application.
- the user identification method is applied to the above-mentioned electronic device and executed by the above-mentioned control unit 300 to improve the accuracy of user identification.
- the user identification method includes:
- when the user performs identity recognition on the electronic device, the electronic device can collect the face image of the user through the image acquisition unit.
- the face image includes the user's clear and complete front face.
- the electronic device reminds the user to make adjustments when it is determined that the user's face is not fully presented, the user's face is not clear, or the user's face is not facing the camera.
- when it is determined that the user's face is not completely presented, the user is reminded to move farther from the lens; when it is determined that the user's face is not clear, the user is reminded to perform manual focusing; when it is determined that the user's face is not facing the lens, the user is reminded to turn the face to the left or right.
- the electronic device can remind the user to adjust the face through text display.
- the user can also be reminded to make facial adjustments by means of voice reminders.
- S300 Identify the user's identity information according to facial features.
- the facial features of the user are extracted from the collected facial image.
- the user's facial features can be extracted and recognized through geometric feature methods, eigenface methods, local feature analysis (LFA) methods, neural network methods, etc.
- the user's facial features are extracted from the face image by the geometric feature method.
- the facial features include but are not limited to: face shape features, eyebrow features, eye features, nose features, mouth features, etc.
- the user's identity information is identified by matching the extracted facial features.
- an identity database is established in advance, and the identity database includes pre-stored identity information and pre-stored face images corresponding to the pre-stored identity information. For example, if the identity database includes the pre-stored identity information A of user A and the pre-stored identity information B of user B, the identity database also includes the pre-stored face image A corresponding to the pre-stored identity information A, and the pre-stored face image A corresponding to the pre-stored identity information B. Pre-stored face image B, the pre-stored face image A is the face image of user A, and the pre-stored face image B is the face image of user B.
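As an illustration only, such an identity database could be organized as a mapping from pre-stored identity information to a record holding the associated pre-stored data; the record layout and field names below are assumptions for the sketches that follow, not terminology fixed by this application.

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class IdentityRecord:
    """One illustrative entry of the identity database."""
    identity_info: str                          # pre-stored identity information, e.g. "A"
    face_image: bytes                           # pre-stored face image
    iris_code: Optional[str] = None             # pre-stored iris feature code (optional)
    backup_face_image: Optional[bytes] = None   # backup face image (optional)
    prefer_backup: bool = False                 # whether the backup image is the priority match

# The database itself: pre-stored identity information -> record.
identity_db: Dict[str, IdentityRecord] = {
    "A": IdentityRecord("A", face_image=b"<pre-stored face image A>"),
    "B": IdentityRecord("B", face_image=b"<pre-stored face image B>"),
}
```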
- the pre-stored facial image matching the facial features is identified in the identity database.
- if a pre-stored face image matching the facial features is recognized, it is determined that the identity recognition is successful, and the pre-stored identity information corresponding to that pre-stored face image is determined as the user's identity information;
- if the degree of matching between the facial features and a pre-stored face image is greater than or equal to the preset matching threshold, it is determined that a pre-stored face image matching the facial features is recognized; otherwise, it is determined that no pre-stored face image matching the facial features is recognized.
- the value range of the preset matching threshold may be 90% to 95%.
- the preset matching threshold is 90%
- when user A performs identity recognition, user A's face image A is collected, facial feature A is extracted from face image A, and then the pre-stored face image matching facial feature A is identified in the identity database.
- if the matching degree between pre-stored face image A and facial feature A is 96%, it is determined that a pre-stored face image matching facial feature A is recognized.
- in that case, the pre-stored identity information A corresponding to pre-stored face image A is determined as the identity information of user A; if instead the matching degrees between facial feature A and both pre-stored face image A and pre-stored face image B are less than 90%, it is determined that no pre-stored face image matching facial feature A is recognized.
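A minimal sketch of the threshold-based matching just described, assuming a hypothetical matching_degree function that returns a similarity score between 0 and 1 for a set of facial features and a pre-stored face image:

```python
from typing import Callable, Dict, Optional

MATCH_THRESHOLD = 0.90  # preset matching threshold; the description above suggests 90%-95%

def match_face(
    facial_features: object,
    pre_stored_faces: Dict[str, object],            # identity information -> pre-stored face image
    matching_degree: Callable[[object, object], float],
) -> Optional[str]:
    """Return the identity information whose pre-stored face image matches the
    facial features best at or above the threshold, or None when recognition fails."""
    best_id, best_score = None, 0.0
    for identity_info, stored_image in pre_stored_faces.items():
        score = matching_degree(facial_features, stored_image)
        if score >= MATCH_THRESHOLD and score > best_score:
            best_id, best_score = identity_info, score
    return best_id  # e.g. user A at 0.96 -> "A"; all scores below 0.90 -> None
```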
- S500 Identify the user's identity information according to the eyeball characteristics.
- in some cases, part of the user's facial features may be blocked.
- when that happens, the user's identity information cannot be accurately determined based on the user's facial features, but the user's eyes may still be unobstructed.
- in that case, the user's identity information can still be determined by extracting the user's eyeball features.
- the user's eyeball features are extracted from the facial image.
- extracting the user's eyeball feature from the face image specifically includes: intercepting the user's eye image from the face image, and then extracting the user's eyeball feature from the eye image.
- This method of extracting the user's eyeball features by intercepting eye images can highlight the details of the eyeball features, and then can extract more detailed eyeball features.
- the user's eye image can be captured by the eyeball positioning method.
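Eye-region interception can be sketched, for illustration only, with OpenCV's stock Haar eye detector; this is one possible eyeball positioning method under the assumption that opencv-python is available, not the specific method used in this application.

```python
import cv2

def crop_eye_regions(face_image_bgr):
    """Return cropped eye-region images detected in a BGR face image (sketch only)."""
    gray = cv2.cvtColor(face_image_bgr, cv2.COLOR_BGR2GRAY)
    eye_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")
    eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # Crop each detected eye rectangle out of the original image.
    return [face_image_bgr[y:y + h, x:x + w] for (x, y, w, h) in eyes]
```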
- the eyeball feature is a biological feature related to the user's eyeball.
- the eyeball feature does not change with the user's age, facial reshaping and other reasons, and has good uniqueness.
- the eyeball feature can be an iris feature or a retina feature.
- the eyeball features include iris features. Based on this, extracting the user's eyeball features from the eye image specifically includes:
- the iris feature points are extracted from the eye image by a preset algorithm, and the extracted iris feature points are coded to obtain the user's iris feature code.
- if the iris feature points are the same, the iris feature codes obtained after coding are also the same; therefore, the iris feature codes can be used to determine whether the user's iris features are the same.
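The preset iris-coding algorithm is not specified here. Purely to illustrate the property that identical feature points yield an identical code, the sketch below extracts ORB keypoint descriptors from the eye image and hashes them deterministically; a real iris encoder (for example, a Gabor-filter-based iris code) would differ substantially.

```python
import hashlib

import cv2

def iris_feature_code(eye_image_bgr) -> str:
    """Illustrative only: a deterministic code derived from eye-image feature points.

    Identical feature points produce an identical code, mirroring the property
    described above; this is not a production iris-encoding algorithm.
    """
    gray = cv2.cvtColor(eye_image_bgr, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(nfeatures=256)
    _, descriptors = orb.detectAndCompute(gray, None)
    if descriptors is None:
        return ""  # no feature points found
    return hashlib.sha256(descriptors.tobytes()).hexdigest()
```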
- the pre-established identity database also includes a pre-stored iris feature code corresponding to the pre-stored identity information, and the pre-stored iris feature code is determined according to the pre-stored face image.
- the identity database includes user A's pre-stored identity information A and pre-stored face image A corresponding to pre-stored identity information A, including user B's pre-stored identity information B and pre-stored face image B corresponding to pre-stored identity information B, then
- the identity database also includes a pre-stored iris feature code A corresponding to the pre-stored identity information A, and a pre-stored iris feature code B corresponding to the pre-stored identity information B.
- the pre-stored iris feature code A is determined based on the pre-stored face image A, and the pre-stored iris feature code B is determined based on the pre-stored face image B.
- the iris feature points are extracted from the pre-stored face image by a preset algorithm, and the extracted iris feature points are coded to obtain the pre-stored iris feature code.
- the iris feature points in the pre-stored face image and in the eye image can be extracted in the same way, and the iris feature points extracted from the pre-stored face image and from the eye image can be coded in the same way, so that the resulting pre-stored iris feature code and iris feature code are comparable.
- the identity information of the user is then identified according to the iris feature code, which specifically includes:
- if a pre-stored iris feature code that matches the iris feature code is recognized, it is determined that the identity recognition is successful, and the pre-stored identity information corresponding to that pre-stored iris feature code is determined as the user's identity information;
- if no pre-stored iris feature code matching the iris feature code is recognized, it is determined that the identity recognition fails.
- if the iris feature code is completely consistent with a pre-stored iris feature code, it is determined that a pre-stored iris feature code matching the iris feature code is recognized; otherwise, it is determined that no pre-stored iris feature code matching the iris feature code is recognized.
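Because a match requires complete consistency between the live code and a pre-stored code, the lookup reduces to an exact comparison, as in this sketch (the dictionary layout is assumed, consistent with the illustrative database above):

```python
from typing import Dict, Optional

def match_iris(iris_code: str, pre_stored_codes: Dict[str, str]) -> Optional[str]:
    """pre_stored_codes maps pre-stored identity information to the pre-stored
    iris feature code derived from the corresponding pre-stored face image."""
    for identity_info, stored_code in pre_stored_codes.items():
        if stored_code == iris_code:  # completely consistent codes
            return identity_info      # identity recognition succeeds
    return None                       # no matching pre-stored code: recognition fails
```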
- for example, when user A performs identity recognition, user A's face image A is collected and facial feature A is extracted from face image A. If the user's identity information fails to be identified based on facial feature A, iris feature point A is extracted from face image A by the preset algorithm and coded to obtain the user's iris feature code A, and then the pre-stored iris feature code matching iris feature code A is identified in the identity database.
- if pre-stored iris feature code A is completely consistent with iris feature code A, it is determined that a pre-stored iris feature code matching iris feature code A is recognized, and the pre-stored identity information A corresponding to pre-stored iris feature code A is determined as user A's identity information.
- when the user's identity information is successfully recognized based on the eyeball features, it means that recognition based on the user's facial features has failed. At this time, in order to increase the chance of subsequently identifying the user's identity information successfully based on facial features, it is determined whether the face image is occluded. If there is no occlusion, the pre-stored face image corresponding to the user's identity information is deleted, and the face image is stored as the pre-stored face image corresponding to the user's identity information.
- because the eyeball features do not change with the user's age, facial reshaping, or other such causes, and have good uniqueness, the accuracy of identifying the user's identity information based on eyeball features is higher than that of identifying it based on facial features. On this basis, when identification based on the user's facial features fails but identification based on the user's eyeball features succeeds, it means that there is a large difference between the pre-stored face image corresponding to the user's identity information in the database and the face image captured on site, and that the difference is caused by the user's ageing, facial reshaping, or the user's face being blocked.
- in this way, the face image captured after the user has aged or undergone facial reshaping is stored as the pre-stored face image corresponding to the user's identity information and can subsequently be used to recognize the user's identity information, overcoming the impact of changes in the user's facial features on identity recognition and increasing the probability of successfully identifying the user's identity information through facial features.
- the user putting on makeup may also cause a large change in the user's facial features.
- in that case, the difference between the pre-stored face image corresponding to the user's identity information in the database and the face image captured on site is caused by a temporary change such as makeup.
- if the pre-stored face image corresponding to the user's identity information were deleted and the captured face image stored as the pre-stored face image, identity recognition based on facial features would still fail once the user's face no longer has makeup.
- the face image is stored as a backup face image corresponding to the user’s identity information.
- in this way, during subsequent identity recognition, either the pre-stored face image or the backup face image matching the facial features can be identified in the identity database.
- a classifier trained by a machine learning algorithm can determine whether the face image is occluded.
- the training sample data includes sample data with occlusion and sample data without occlusion.
- the sample data with occlusion includes occluded facial features, and the sample data without occlusion includes non-occluded facial features.
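One way (among many) to realize this occlusion check is a small binary classifier trained on the two kinds of sample data; the sketch below uses scikit-learn's SVM over pre-computed facial feature vectors and is an assumption, not the classifier mandated by this application.

```python
import numpy as np
from sklearn.svm import SVC

def train_occlusion_classifier(occluded_features: np.ndarray,
                               unoccluded_features: np.ndarray) -> SVC:
    """Train a binary occluded / not-occluded classifier on facial feature vectors."""
    X = np.vstack([occluded_features, unoccluded_features])
    y = np.concatenate([np.ones(len(occluded_features)),     # 1 = occluded
                        np.zeros(len(unoccluded_features))])  # 0 = not occluded
    return SVC(kernel="rbf", probability=True).fit(X, y)

def is_occluded(classifier: SVC, facial_features: np.ndarray) -> bool:
    """Predict whether a single facial feature vector comes from an occluded face."""
    return bool(classifier.predict(facial_features.reshape(1, -1))[0] == 1)
```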
- the corresponding relationship between the user's identity information and the face image is established in the identity database, that is, the face image is stored as a pre-stored face image corresponding to the user's identity information.
- the face image is stored as the pre-stored face image of the pre-stored identity information A.
- in this embodiment, when the user's identity information is successfully recognized based on the facial features, it is determined whether the user's identity information corresponds to a backup face image. If so, the matching degree between the face image and the pre-stored face image corresponding to the user's identity information, and the matching degree between the face image and the backup face image, are calculated separately. If the matching degree of the pre-stored face image is greater than that of the backup face image, the pre-stored face image is set as the priority match; if the matching degree of the pre-stored face image is less than that of the backup face image, the backup face image is set as the priority match.
- for example, matching degree 1 between pre-stored face image 1 and the face image is calculated, and matching degree 2 between backup face image 2 and the face image is calculated. If matching degree 1 is greater than matching degree 2, pre-stored face image 1 is set as the priority match; if matching degree 1 is less than matching degree 2, backup face image 2 is set as the priority match.
- the pre-stored face image is set as priority matching, that is, the pre-stored face image is preferentially used for facial feature matching;
- the spare face image is set as the priority matching, that is, the spare face image is preferentially used for facial feature matching.
- if, within the preset number of matching times, the matching degree of the pre-stored face image is always less than the matching degree of the backup face image, it is determined that the difference between the pre-stored face image corresponding to the user's identity information in the database and the face image captured by the user on site is caused by a long-term change such as the user's ageing or facial reshaping. Therefore, the pre-stored face image can be deleted and facial feature matching can be performed through the backup face image.
- the preset number of matching times represents the number of times that the identity information of the same user is successfully recognized through facial features.
- the preset number of matching times is 5, which means that the identity information of user A is successfully recognized 5 times according to the facial features.
- if the matching degree of the pre-stored face image calculated in each of those recognitions is less than the matching degree of the backup face image, the pre-stored face image is deleted, as sketched below.
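The priority-matching and deletion rule just described amounts to simple bookkeeping after each successful facial-feature recognition; the sketch below is one possible reading, with the streak counter and return convention being assumptions rather than requirements of this application.

```python
from typing import Callable, Tuple

PRESET_MATCH_COUNT = 5  # preset number of successful facial-feature recognitions

def choose_priority(face_image: object,
                    pre_stored_image: object,
                    backup_image: object,
                    matching_degree: Callable[[object, object], float],
                    backup_win_streak: int) -> Tuple[str, int, bool]:
    """Return (priority image, updated streak, delete_pre_stored) after one
    successful facial-feature recognition (illustrative sketch only)."""
    pre_score = matching_degree(face_image, pre_stored_image)
    backup_score = matching_degree(face_image, backup_image)
    if pre_score >= backup_score:
        return "pre_stored", 0, False  # pre-stored image remains the priority match
    streak = backup_win_streak + 1     # backup image matched better this time
    # Once the backup image has matched better for the preset number of successful
    # recognitions, the change is treated as long-term and the pre-stored image can
    # be deleted, with facial-feature matching then performed via the backup image.
    return "backup", streak, streak >= PRESET_MATCH_COUNT
```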
- in the embodiments of the present application, when the user's identity fails to be recognized based on facial features, the user's identity information is identified through the user's eyeball features. This avoids situations in which the user's identity cannot be accurately determined because of changes in facial features caused by ageing, facial plastic surgery, and so on, and improves the accuracy of user identity recognition.
- FIG. 3 is a schematic structural diagram of a user identification device provided by an embodiment of the present application.
- the user identification device is applied to the above-mentioned electronic equipment, and the functions of each module are executed by the above-mentioned control unit 300, so as to improve the accuracy of user identification.
- the term "module" used in the embodiments of the present application refers to a combination of software and/or hardware that can implement a predetermined function.
- the devices described in the following embodiments can be implemented by software, although implementation by hardware, or by a combination of software and hardware, is also conceivable.
- the user identity recognition device includes:
- the obtaining module 10 is used to obtain a user's face image
- the first extraction module 20 is configured to extract the facial features of the user from the face image
- the first recognition module 30 is configured to recognize the identity information of the user according to the facial feature
- the second extraction module 40 is configured to extract the eyeball features of the user from the face image when the identity recognition fails;
- the second identification module 50 is configured to identify the user's identity information according to the eyeball characteristics.
- the device further includes:
- the establishment module 60 is configured to pre-establish an identity database, the identity database including pre-stored identity information and pre-stored face images corresponding to the pre-stored identity information; then,
- the first identification module 30 is specifically configured to:
- if a pre-stored face image matching the facial features is recognized, it is determined that the identity recognition is successful, and the pre-stored identity information corresponding to that pre-stored face image is determined as the identity information of the user;
- the device further includes:
- the determining module 70 is configured to determine whether the face image is occluded when the recognition fails according to the facial features and the identity information of the user is successfully recognized according to the eyeball features;
- the face image is stored as a backup face image corresponding to the identity information of the user.
- when storing the face image as a backup face image corresponding to the identity information of the user, the determining module 70 is further configured to:
- the pre-stored face image is set as priority matching
- the standby facial image is set as a priority matching.
- the device further includes:
- the deleting module 80 is configured to delete the pre-stored face image if the matching degree of the pre-stored face image is less than the matching degree of the standby face image within the preset number of matching times.
- the second extraction module 40 specifically includes:
- the interception module 41 is configured to intercept the eye image of the user in the face image
- the third extraction module 42 is configured to extract the eyeball features of the user from the eye image.
- the eyeball feature includes an iris feature
- the identity database further includes a pre-stored iris feature code corresponding to the pre-stored identity information
- the third extraction module 42 is specifically configured to:
- the second identification module 50 is specifically configured to:
- if a pre-stored iris feature code that matches the iris feature code is recognized, it is determined that the identity recognition is successful, and the pre-stored identity information corresponding to that pre-stored iris feature code is determined as the identity information of the user;
- if no pre-stored iris feature code matching the iris feature code is recognized, it is determined that the identity recognition fails.
- for the content of the device embodiment, reference may be made to the method embodiment, provided that the content does not conflict, and it will not be repeated here.
- the determining module 70 and the deleting module 80 may be processing chips of the control unit 300.
- in the embodiments of the present application, when the user's identity fails to be recognized based on facial features, the user's identity information is identified through the user's eyeball features. This avoids situations in which the user's identity cannot be accurately determined because of changes in facial features caused by ageing, facial plastic surgery, and so on, and improves the accuracy of user identity recognition.
- FIG. 8 is a schematic diagram of the hardware structure of an electronic device provided by an embodiment of the present application, including:
- one processor 310 and one memory 320 are taken as an example in FIG. 8.
- the processor 310 and the memory 320 may be connected through a bus or in other ways. In FIG. 8, the connection through a bus is taken as an example.
- the memory 320 can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions corresponding to the user identification method in the above-mentioned embodiments of the present application and the modules corresponding to the user identification device (for example, the acquisition module 10, the first extraction module 20, the first identification module 30, the second extraction module 40, the interception module 41, the third extraction module 42, the second identification module 50, the establishment module 60, the determination module 70, the deletion module 80, and so on).
- the processor 310 executes various functional applications and data processing of a user identification method by running non-volatile software programs, instructions, and modules stored in the memory 320, that is, implements a user identification method in the foregoing method embodiment.
- the memory 320 may include a program storage area and a data storage area.
- the program storage area may store an operating system and an application program required by at least one function; the data storage area may store data created based on the use of a user identification device, etc. .
- the storage data area also stores preset data, including preset algorithms, preset matching thresholds, pre-stored identity information, pre-stored facial images, pre-stored iris feature codes, preset matching times, and the like.
- the memory 320 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage devices.
- the memory 320 may optionally include memories remotely provided with respect to the processor 310, and these remote memories may be connected to the processor 310 through a network. Examples of the aforementioned networks include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
- the program instructions and one or more modules are stored in the memory 320, and when executed by the one or more processors 310, they execute each step of the user identification method in any of the foregoing method embodiments, or realize the functions of the modules of the user identity recognition device in any of the foregoing device embodiments.
- the above-mentioned product can execute the methods provided in the above-mentioned embodiments of the present application, and has functional modules and beneficial effects corresponding to those methods.
- for technical details that are not described in detail in this embodiment, please refer to the methods provided in the above embodiments of the present application.
- the embodiments of the present application also provide a non-volatile computer-readable storage medium that stores computer-executable instructions. When the computer-executable instructions are executed by one or more processors (for example, the processor 310 in FIG. 8), they may cause a computer to execute each step of the user identity recognition method in any of the foregoing method embodiments, or to realize the functions of the modules of the user identity recognition device in any of the foregoing device embodiments.
- the embodiments of the present application also provide a computer program product containing program code. When the computer program product is run on an electronic device, the electronic device can execute each step of the user identification method in any of the above-mentioned method embodiments, or implement the functions of the modules of the user identity recognition device in any of the foregoing device embodiments.
- the device embodiments described above are merely illustrative.
- the modules described as separate components may or may not be physically separated, and the components displayed as modules may or may not be physical units; that is, they may be located in one place, or they may be distributed over multiple network units. Some or all of the modules can be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
- each embodiment can be implemented by software plus a general hardware platform, and of course, it can also be implemented by hardware.
- a person of ordinary skill in the art can understand that all or part of the processes in the methods of the foregoing embodiments can be implemented by computer programs instructing relevant hardware.
- the program can be stored in a computer-readable storage medium, and when the program is executed, it may include the processes of the foregoing method embodiments.
- the storage medium may be a magnetic disk, an optical disc, a read-only memory (Read-Only Memory, ROM), or a random access memory (Random Access Memory, RAM), etc.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Ophthalmology & Optometry (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Collating Specific Patterns (AREA)
Abstract
Embodiments of the present invention relate to the technical field of electronic information. A user identity recognition method and apparatus, and an electronic device are disclosed. The user identity recognition method comprises: obtaining a face image of a user; extracting facial features of the user from the face image; identifying identity information of the user according to the facial features; if the identity recognition fails, extracting eyeball features of the user from the face image; and identifying the identity information of the user according to the eyeball features. By means of this method, the embodiments of the present invention can improve the accuracy of user identity recognition.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2019/111442 WO2021072673A1 (fr) | 2019-10-16 | 2019-10-16 | Procédé et appareil d'identification d'identité d'utilisateur, et dispositif électronique |
| CN201980002119.3A CN111095268A (zh) | 2019-10-16 | 2019-10-16 | 一种用户身份识别方法、装置及电子设备 |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2019/111442 WO2021072673A1 (fr) | 2019-10-16 | 2019-10-16 | Procédé et appareil d'identification d'identité d'utilisateur, et dispositif électronique |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2021072673A1 true WO2021072673A1 (fr) | 2021-04-22 |
Family
ID=70400268
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2019/111442 Ceased WO2021072673A1 (fr) | 2019-10-16 | 2019-10-16 | Procédé et appareil d'identification d'identité d'utilisateur, et dispositif électronique |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN111095268A (fr) |
| WO (1) | WO2021072673A1 (fr) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20230094993A1 (en) * | 2021-09-17 | 2023-03-30 | Samsung Electronics Co., Ltd. | Electronic apparatus and controlling method thereof |
| CN117058787A (zh) * | 2023-08-16 | 2023-11-14 | 鹿客科技(北京)股份有限公司 | 门锁控制方法、装置、电子设备和计算机可读介质 |
| CN117690179A (zh) * | 2024-01-31 | 2024-03-12 | 全民认证科技(杭州)有限公司 | 一种基于人脸信息的身份核验方法与系统 |
Families Citing this family (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111985298B (zh) * | 2020-06-28 | 2023-07-25 | 百度在线网络技术(北京)有限公司 | 人脸识别样本收集方法和装置 |
| CN111768543A (zh) * | 2020-06-29 | 2020-10-13 | 杭州翔毅科技有限公司 | 基于人脸识别的通行管理方法、设备、存储介质及装置 |
| CN113657231B (zh) * | 2021-08-09 | 2024-05-07 | 广州中科智云科技有限公司 | 一种基于多旋翼无人机的图像识别方法及装置 |
| CN113902448B (zh) * | 2021-10-19 | 2025-03-28 | 北京雪扬科技有限公司 | 一种基于人脸识别的智能手表支付方法 |
| CN113985095A (zh) * | 2021-10-22 | 2022-01-28 | 国网上海市电力公司 | 一种适用于计量柜非法入侵的数字化稽查方法及装置 |
| CN114067408A (zh) * | 2021-11-22 | 2022-02-18 | 杭州世拓创意智能科技有限公司 | 用于银行自助设备的人脸识别身份认证方法及系统 |
- 2019-10-16 WO PCT/CN2019/111442 patent/WO2021072673A1/fr not_active Ceased
- 2019-10-16 CN CN201980002119.3A patent/CN111095268A/zh active Pending
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101814130A (zh) * | 2009-02-19 | 2010-08-25 | 中国科学院自动化研究所 | 用摄像机阵列的虹膜识别装置和多模态生物特征识别方法 |
| US20130236069A1 (en) * | 2012-03-07 | 2013-09-12 | Altek Corporation | Face Recognition System and Face Recognition Method Thereof |
| CN106373240A (zh) * | 2016-09-14 | 2017-02-01 | 浙江维融电子科技股份有限公司 | 一种无人银行的智能监控系统及其监控方法 |
| CN106384285A (zh) * | 2016-09-14 | 2017-02-08 | 浙江维融电子科技股份有限公司 | 一种智能无人银行系统 |
| CN106447853A (zh) * | 2016-09-14 | 2017-02-22 | 浙江维融电子科技股份有限公司 | 一种具备多级识别功能的无人银行系统 |
| CN108694353A (zh) * | 2017-04-10 | 2018-10-23 | 上海聚虹光电科技有限公司 | 一种人脸识别和虹膜识别的多模态身份识别方法 |
| CN109979046A (zh) * | 2019-01-09 | 2019-07-05 | 平安科技(深圳)有限公司 | 安保方法、装置、计算机装置及计算机可读存储介质 |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20230094993A1 (en) * | 2021-09-17 | 2023-03-30 | Samsung Electronics Co., Ltd. | Electronic apparatus and controlling method thereof |
| US12284511B2 (en) * | 2021-09-17 | 2025-04-22 | Samsung Electronics Co., Ltd. | Electronic apparatus and controlling method thereof |
| CN117058787A (zh) * | 2023-08-16 | 2023-11-14 | 鹿客科技(北京)股份有限公司 | 门锁控制方法、装置、电子设备和计算机可读介质 |
| CN117058787B (zh) * | 2023-08-16 | 2024-04-12 | 鹿客科技(北京)股份有限公司 | 门锁控制方法、装置、电子设备和计算机可读介质 |
| WO2025036388A1 (fr) * | 2023-08-16 | 2025-02-20 | 鹿客科技(北京)股份有限公司 | Procédé et appareil de commande de serrure de porte, dispositif électronique et support lisible par ordinateur |
| CN117690179A (zh) * | 2024-01-31 | 2024-03-12 | 全民认证科技(杭州)有限公司 | 一种基于人脸信息的身份核验方法与系统 |
| CN117690179B (zh) * | 2024-01-31 | 2024-04-26 | 全民认证科技(杭州)有限公司 | 一种基于人脸信息的身份核验方法与系统 |
Also Published As
| Publication number | Publication date |
|---|---|
| CN111095268A (zh) | 2020-05-01 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2021072673A1 (fr) | Procédé et appareil d'identification d'identité d'utilisateur, et dispositif électronique | |
| US10922529B2 (en) | Human face authentication method and apparatus, and storage medium | |
| TWI687879B (zh) | 伺服器、客戶端、用戶核身方法及系統 | |
| JP2020509441A5 (fr) | ||
| US20200125843A1 (en) | Detailed eye shape model for robust biometric applications | |
| WO2019085403A1 (fr) | Procédé de comparaison intelligente de reconnaissance faciale, dispositif électronique et support d'informations lisible par ordinateur | |
| CN112115866A (zh) | 人脸识别方法、装置、电子设备及计算机可读存储介质 | |
| CN110533001B (zh) | 基于人脸识别的大数据人脸识别方法 | |
| WO2019062080A1 (fr) | Procédé de reconnaissance d'identité, dispositif électronique et support d'informations lisible par ordinateur | |
| US12148189B2 (en) | Iris recognition system, iris recognition method, and storage medium | |
| WO2020168468A1 (fr) | Procédé et dispositif de recherche d'aide sur la base d'une reconnaissance d'expression, appareil électronique et support d'informations | |
| WO2022174699A1 (fr) | Procédé et appareil de mise à jour d'image, dispositif électronique et support lisible par ordinateur | |
| KR20180109109A (ko) | 홍채 기반 인증 방법 및 이를 지원하는 전자 장치 | |
| KR20160147515A (ko) | 사용자 인증 방법 및 이를 지원하는 전자장치 | |
| JP2016009453A (ja) | 顔認証装置および顔認証方法 | |
| CN107844742B (zh) | 人脸图像眼镜去除方法、装置及存储介质 | |
| CN108171138B (zh) | 一种生物特征信息获取方法和装置 | |
| KR20190048340A (ko) | 전자 장치 및 이를 이용한 눈의 충혈도 판단 방법 | |
| CN116110100A (zh) | 一种人脸识别方法、装置、计算机设备及存储介质 | |
| CN111353368A (zh) | 云台摄像机、人脸特征处理方法及装置、电子设备 | |
| CN108875468A (zh) | 活体检测方法、活体检测系统以及存储介质 | |
| US20210264184A1 (en) | Information processing apparatus, information processing method, and information processing program | |
| CN111353364A (zh) | 一种人脸动态识别方法及装置、电子设备 | |
| CN109544384A (zh) | 基于生物识别的津贴发放方法、装置、终端、存储介质 | |
| US20160217565A1 (en) | Health and Fitness Monitoring via Long-Term Temporal Analysis of Biometric Data |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19949135; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 30.08.2022) |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 19949135; Country of ref document: EP; Kind code of ref document: A1 |