
WO2015181729A1 - Method of determining liveness for eye biometric authentication - Google Patents

Method of determining liveness for eye biometric authentication

Info

Publication number
WO2015181729A1
Authority
WO
WIPO (PCT)
Prior art keywords
subject
face
movement
liveness
detected
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/IB2015/053941
Other languages
French (fr)
Inventor
Jonathan Anton CLAASSENS
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Johannesburg
Original Assignee
University of Johannesburg
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Johannesburg filed Critical University of Johannesburg
Publication of WO2015181729A1 publication Critical patent/WO2015181729A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/40Spoof detection, e.g. liveness detection
    • G06V40/45Detection of the body part being alive


Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Collating Specific Patterns (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention relates to a method of determining liveness for eye biometric authentication and more specifically, but not exclusively, to a method of determining liveness for eye biometric authentication using the conjunctiva of the eye. The method of determining liveness of a subject (3) for eye biometric authentication includes the step of capturing two or more images (2). The images (2) should include part of the subject's face and may include specific facial features such as skin, eyes, eyebrows, ears, nose, cheeks, mouth, and chin. The method (1) includes the step of selecting portions (4) of the image and the step of instructing (5) the subject (3) to move or deform at least part of his face. The method (1) includes the further step of detecting relative movement (9) of the portions (4) between two or more images (2).

Description

METHOD OF DETERMINING LIVENESS FOR EYE BIOMETRIC
AUTHENTICATION
FIELD OF THE INVENTION
The invention relates to a method of determining liveness for eye biometric authentication and more specifically, but not exclusively, to a method of determining liveness for eye biometric authentication using the conjunctiva of the eye.
BACKGROUND TO THE INVENTION
Identification and authentication using biometrics, and specifically eye and conjunctival biometrics, are known in the art. It is further known that liveness testing, which is used to determine whether or not a user being identified or authenticated is a live person, is required to enhance the security of biometric authentication or identification.
United States patent number 8,437,513 entitled "Spoof detection for biometric authentication" describes technologies relating to biometric authentication based on images of the eye. In general, one aspect of the subject matter described in the specification can be embodied in methods that include obtaining images of a subject including a view of an eye. The methods may further include determining a behavioral metric based on detected movement of the eye as the eye appears in a plurality of the images, determining a spatial metric based on a distance from a sensor to a landmark that appears in a plurality of the images each having a different respective focus distance, and determining a reflectance metric based on detected changes in surface glare or specular reflection patterns on a surface of the eye. The methods may further include determining a score based on the behavioral, spatial, and reflectance metrics and rejecting or accepting the one or more images based on the score.
OBJECT OF THE INVENTION
It is accordingly an object of the invention to provide a method of determining liveness for eye biometric authentication of the type described above and/or provide a useful alternative to the prior art.
SUMMARY OF THE INVENTION
In accordance with the invention there is provided a method of determining liveness of a subject for eye biometric authentication of the subject comprising the steps of:
- obtaining two or more images of, at least part of, the subject;
- the images including part of the subject's face;
- selecting portions of the image on the subject's face;
- instructing the subject to move/deform parts of the subject's face;
- detecting relative movement of the selected portions between two or more images; and
- determining whether the detected movement corresponds with liveness of the subject.
The step of determining whether the detected movements correspond to liveness may be preceded by a number of iterations of the preceding steps.
The step of selecting portions of the face may be preceded by the step of detecting the face of the subject in the image.
The step of selecting portions of the face may involve specific selection of regions associated with specific muscle groups of the face with known movement freedoms. Alternatively, the step of selecting portions of the face may be based on a uniform distribution of skin regions.
The obtained images may include at least part of the subject's face, eyebrows, nose, cheek, jaw, mouth, skin, and/or chin.
The step of detecting the face of the subject may be preceded by detecting the eye and selecting a portion around the eye.
The step of instructing the subject to move/deform part of the subject's face may be:
- instructing the subject to blink;
- instructing the subject to smile;
- instructing the subject to frown;
- instructing the subject to squint;
- instructing the subject to turn the subject's head; and
- instructing the subject to mimic an expression shown on a display.
Instructing the subject may be done visually, by displaying a message on a display, or audibly, by prompting the subject with beeps or spoken instructions.
The step of detecting movement of selected portions may be done by detecting optical flow of the portions between two or more images. The step of detecting movement of the selected portions may be conducted using the Lucas-Kanade tracking method or any optical flow approach in which the relative movement of pixel patches is measured. This may include modelling not only the movement of a pixel patch but also its deformation. Deformation may be assumed to be affine in nature.
The step of determining whether detected movement corresponds with liveness of the subject may be done by comparing the detected movement with predetermined data representing the instructed movement and determining whether the detected movement corresponds to the instructed movement.
The detected movement and instructed movement may be represented as an array of vectors representing the movement between two or more images. The detected array may be compared to the instructed array to determine a degree of similarity between the detected movement and the instructed movement.
The step of enrolment may involve the user providing a series of preferred expressions for use in liveness testing. He/she may then demonstrate each expression and, through the method of facial deformation modelling described above, an exemplar array of vector measurements of the relative motions of face portions may be recorded for each preferred expression. These calibration exemplars may then be demonstrated schematically to the user as instruction in the liveness test.
The step of determining whether the detected movement corresponds with liveness of the subject may involve a test which establishes whether the agreement is too precise, thereby detecting possible spoofing based on replay of recorded data.
In accordance with a second aspect of the invention there is provided a method of determining liveness of a subject for eye biometric authentication on a computing device with associated processor and memories comprising:
- instructing a subject to position at least part of the subject's face within view of an image capture device of the computing device;
- displaying a color on at least part of a display of the computing device;
- obtaining an image of the subject including part of the face of the subject;
- detecting a reflection of the display on the face and determining whether or not the reflection corresponds to the color displayed on the display of the computing device; and
- determining whether the reflection corresponds with liveness of the subject.
The reflection may be detected in an eye of the subject, wherein the shape and color of the reflection in the eye is detected.
The method may include the step of receiving input from at least one connected device and determining whether or not the input corresponds to liveness.
The connected device may be a microphone connected to the computing device, wherein the input is sound and corresponds to liveness if the subject's breathing is detected in the sound.
The connected device may be an accelerometer, wherein the input is acceleration of the computing device, which is used to determine liveness. The connected device may be the image capture device, wherein the input is a degree of focus of different portions of the subject's face.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will now be further described, by way of example only, with reference to the accompanying drawings wherein:
Figure 1 is a schematic representation of a series of images used in a method of determining liveness; and
Figure 2 is a schematic representation of a subject using a tablet computer implementing a method for determining liveness.
DETAILED DESCRIPTION OF THE DRAWINGS
With reference to the drawings, in which like features are indicated by like numerals, a method of determining liveness for eye biometric authentication, or a plurality of steps thereof, is generally designated by reference numeral 1.
The method of determining liveness of a subject 3 for eye biometric authentication includes the step of capturing two or more images 2 (shown as 2a, 2b, 2c, and 2d in Figure 1). The images 2 should include part of the subject's face and may include specific facial features such as skin, eyes, eyebrows, ears, nose, cheeks, mouth, and chin. The method 1 includes the step of selecting portions 4 of the image, specifically portions of the image representing part of the subject's face. This step will typically be preceded by determining the position of the subject's face in the image 2. The portions 4 are selected by uniformly sampling portions on the face of the subject 3. For illustrative purposes in this example, the portions 4 were selected at specific regions on the subject's 3 face to more effectively describe the method. In practice, a much larger number of portions would be selected and would typically be uniformly distributed across the subject's face. Alternatively, a non-uniform distribution, where more portions are selected in muscle groups known to be active in the face, may be used.
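The uniform sampling of portions 4 within a detected face region can be sketched as follows. This is an illustrative sketch, not part of the patent: the function name, the bounding-box representation, and the grid size are all assumptions.

```python
def select_portions(face_box, rows=4, cols=4):
    """Return a uniform grid of patch centres inside a face bounding box.

    face_box is (x, y, width, height).  Centres are spaced evenly, with a
    half-cell margin so that patches centred on them stay inside the box.
    A non-uniform alternative would place more centres over active
    muscle groups.
    """
    x, y, w, h = face_box
    return [
        (x + (c + 0.5) * w / cols, y + (r + 0.5) * h / rows)
        for r in range(rows)
        for c in range(cols)
    ]
```

For a 4x4 grid this yields 16 patch centres; in practice a much denser grid would be used, as the description notes.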
The method 1 further includes the step of instructing 5 the subject 3 to move or deform at least part of his face. The instruction 5 may be visual, as described below, or audible, such as voice instructions or beeps, depending on the device on which the biometric authentication is performed. For example, if a tablet computer 6 with a front-facing camera 7 is used, instructions may be presented to the subject on the screen in writing. Alternatively, the tablet 6 screen may display, for example, a face with an expression 8 for the subject to imitate. This instructed movement is used to guide a subject 3 through successive expressions or facial movements at random to ensure that the subject being biometrically authenticated is alive and responds to the instructions 5 within predetermined accuracy and precision. By using randomly selected actions for instructed movement, the chance of forging input to the biometric authentication by using prerecorded footage is diminished. The instruction 5 may include instructing 5 the subject to blink, smile, frown, squint, turn his head, or to imitate an expression 8 shown on a display.
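The random selection of an instructed action can be sketched minimally as below. The action names and function signature are illustrative assumptions; the point is only that the sequence is unpredictable, so prerecorded footage cannot anticipate it.

```python
import random


def next_instruction(rng=random):
    """Pick the next liveness instruction at random (names illustrative).

    Passing an explicit random.Random instance makes the draw
    reproducible for testing; in deployment the default module-level
    generator (or a cryptographic one) would be used.
    """
    actions = ["blink", "smile", "frown", "squint", "turn head"]
    return rng.choice(actions)
```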
The method 1 includes the further step of detecting relative movement 9 of the portions 4 between two or more images 2. In practice, more than two successive images 2 will be used to determine relative movement of the portions through successive images. In this example, two images 2b and 2c will be used for illustrative purposes. The image 2b shows the subject 3 expressionless whilst image 2c shows the subject 3 after the instruction has been given to the subject 3 to imitate or present a surprised expression. Relative movement 9 of portions 4 between successive images (2b and 2c) is detected and tracked using a conventional optical flow estimation algorithm such as the Lucas-Kanade method. The movements 9 are typically recorded in the form of an array or list of vectors in the image plane.
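The idea of measuring a patch's displacement between two frames can be sketched with exhaustive block matching, a simpler stand-in for the Lucas-Kanade method named in the description (which instead solves a least-squares system on image gradients). Everything here is an illustrative assumption: frames are plain lists of grayscale rows, and the patch and search sizes are arbitrary.

```python
def patch_shift(prev, curr, cx, cy, size=3, search=2):
    """Estimate the (dx, dy) translation of a small patch between two
    grayscale frames by exhaustive block matching: minimise the sum of
    squared differences over a small search window around (cx, cy).
    """
    half = size // 2

    def ssd(dx, dy):
        total = 0
        for oy in range(-half, half + 1):
            for ox in range(-half, half + 1):
                a = prev[cy + oy][cx + ox]
                b = curr[cy + oy + dy][cx + ox + dx]
                total += (a - b) ** 2
        return total

    # Pick the displacement with the lowest matching cost.
    return min(
        ((dx, dy)
         for dy in range(-search, search + 1)
         for dx in range(-search, search + 1)),
        key=lambda d: ssd(*d),
    )
```

Applied to every selected portion, this produces exactly the array of per-patch displacement vectors the description refers to.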
The detected movements 9 are compared to known movements for the particular facial movement or expression to determine whether or not the recorded movement 9 corresponds to the instructed movement. Instructed movements may be recorded per subject at the time of enrollment to provide for individual facial responses for the subject, or may be compiled from collective responses which include common movements between various subjects. If the detected movements, within a predetermined degree of accuracy and precision, correspond to the instructed movement, the action is accepted as a live action. An eye biometric authentication system may, depending on the level of security required, require that a number of successive live actions are performed and accepted before determining that the detected movement corresponds with liveness of the subject. If accepted, the accepted movements 9 may be stored against the profile of the subject 3. The accepted movements may also be used to adjust the instructed movements, to better inform future comparisons with the instructed movements. In determining liveness, exact matches of previously accepted movements may be regarded as an attempted spoof, as it is improbable that a user will move their face in exactly the same way more than once.
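The comparison step above, including the rejection of a too-exact match as a probable replay, can be sketched as follows. The similarity measure (mean cosine similarity of corresponding vectors) and both thresholds are illustrative assumptions, not values taken from the patent.

```python
import math


def movement_matches(detected, exemplar, min_sim=0.8, replay_eps=1e-6):
    """Compare detected per-patch motion vectors against an exemplar.

    Accept when the mean cosine similarity of corresponding vectors
    exceeds min_sim, but reject a near-exact repeat of the exemplar as
    a likely replay of recorded data.
    """
    def cos(a, b):
        dot = a[0] * b[0] + a[1] * b[1]
        na, nb = math.hypot(*a), math.hypot(*b)
        return dot / (na * nb) if na and nb else 1.0

    sims = [cos(d, e) for d, e in zip(detected, exemplar)]
    mean = sum(sims) / len(sims)
    exact = all(abs(d[0] - e[0]) < replay_eps and abs(d[1] - e[1]) < replay_eps
                for d, e in zip(detected, exemplar))
    if exact:
        return False  # agreement too precise: probable replay
    return mean >= min_sim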
In a second embodiment of the invention, a method of determining liveness of a subject 3 for eye biometric authentication on a computing device 6 with associated processor and memories includes the step of instructing the subject to position at least part of his face within view of the built-in camera of the tablet. The camera captures a series of images whilst the instruction is given. A color is displayed on at least part of a display of the computing device and an image is captured whilst the color is displayed on the display.
A reflection of the display on the face of the subject may be detected from the captured image and, if the hue of the reflection matches the displayed color within a margin of error, it can be determined whether or not the reflection corresponds to the color displayed on the display of the computing device. The reflection may be detected on portions of the subject's face or skin. Alternatively, should the subject's skin be unsuitable for detecting reflections, a reflection in the eyes of the subject may be detected. The colors may be selected randomly to prevent spoofing with video footage of a predetermined color.
In addition to detecting the reflection, liveness may be confirmed by receiving input from at least one connected device and determining whether or not the input corresponds to liveness. The connected device may be a microphone connected to the computing device, wherein the input is sound and corresponds to liveness if the subject's breathing is detected in the sound. The connected device may be an accelerometer, wherein the input is acceleration of the computing device, which is used to determine liveness. A profile of how a subject, or subjects in general, move the computing device closer to their face may be compiled, and upon instructing a subject to position his face within view of the camera, the movement obtained from the corresponding acceleration may be compared to the profile. The connected device may also be the image capture device, wherein the input is a degree of focus of different portions of the subject's face. As a subject moves the computing device closer to his face, certain portions of the face which are closer to the device will be in focus whilst portions further away will be out of focus, or vice versa. This may be detected and a profile compiled which corresponds to a human face. An attempt to spoof the authorization with prerecorded video footage will have a uniform focus profile.
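The hue comparison for the displayed-color reflection can be sketched with the standard library's `colorsys` module. The tolerance value is an assumption for illustration; the patent specifies only a "margin of error". Note that hue is circular, so the difference must wrap around.

```python
import colorsys


def reflection_matches(displayed_rgb, reflected_rgb, max_hue_diff=0.08):
    """Check whether a reflection's hue matches the displayed color.

    RGB components are in [0, 1].  Hue lives on a circle in [0, 1),
    so the difference is taken the short way around.
    """
    h1 = colorsys.rgb_to_hsv(*displayed_rgb)[0]
    h2 = colorsys.rgb_to_hsv(*reflected_rgb)[0]
    diff = abs(h1 - h2)
    diff = min(diff, 1.0 - diff)  # hue wraps at 1.0
    return diff <= max_hue_diff
```

Comparing hue rather than raw RGB makes the test tolerant of the dimming and desaturation a reflection off skin or the cornea would introduce.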
It is envisaged that the invention will provide a method of determining liveness for eye biometric authentication which is robust and can be performed on consumer devices such as smartphones, tablets, or computers with webcams. It is further envisaged that the method will decrease the probability of forging input for eye biometric authentication.
The invention is not limited to the precise details as described herein. For example, instead of the subject using a tablet computer a smartphone or desktop computer equipped with a webcam may be used. Further, instead of a subject's entire face, portions of the face may be used.

Claims

1. A method of determining liveness of a subject for eye biometric authentication comprising the steps of:
- obtaining two or more images of, at least part of, the subject;
- the images including part of the subject's face;
- selecting portions of the image on the subject's face;
- instructing the subject to move/deform parts of the subject's face;
- detecting relative movement of the selected portions between two or more images; and
- determining whether the detected movement corresponds with liveness of the subject.
2. The method of claim 1 wherein the step of determining whether the detected movements correspond to liveness is preceded by a number of iterations of the preceding steps.
3. The method of claim 1 or 2 wherein the step of selecting portions of the face is preceded by the step of detecting the face of the subject in the image.
4. The method of claims 1 to 3 wherein the step of selecting portions of the face includes selection of regions associated with specific muscle groups of the face with known movement freedom.
5. The method of claims 1 to 4 wherein the step of selecting portions of the face is on a uniform distribution of regions.
6. The method of claims 1 to 5 wherein the obtained images include at least one part of the subject's face selected from the group comprising the subject's eyebrows, nose, cheek, jaw, mouth, skin, and chin.
7. The method of claims 3 to 6 wherein the step of detecting the face of the subject is preceded by detecting the eye and selecting a portion around the eye.
8. The method of claims 1 to 7 wherein the step of instructing the subject to deform part of the subject's face is an instruction selected from the group comprising:
- instructing the subject to blink;
- instructing the subject to smile;
- instructing the subject to frown;
- instructing the subject to squint;
- instructing the subject to turn the subject's head; and
- instructing the subject to mimic an expression shown on a display.
9. The method of claims 1 to 8 wherein instruction of the subject is done visually by displaying a message on a display.
10. The method of claims 1 to 8 wherein instruction of the subject is done audibly by prompting the subject with beeps.
11. The method of claims 1 to 8 wherein instruction of the subject is done audibly by prompting the subject with spoken instructions.
12. The method of claims 1 to 11 wherein the step of detecting movement of selected portions is done by detecting optical flow of the portions between two images.
13. The method of claims 1 to 12 wherein the step of detecting movement of the selected portions is conducted using the Lucas-Kanade tracking method.
14. The method of claims 1 to 12 wherein the step of detecting movement of the selected portions is conducted using an optical flow approach where pixel patches' relative movement is measured.
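The optical-flow tracking referred to in claims 13 and 14 can be illustrated with a minimal single-patch Lucas-Kanade step. This is an illustrative sketch only, not the claimed implementation: the patch size, the single-iteration least-squares solver, and the use of grayscale frames are all assumptions made for brevity.

```python
import numpy as np

def lucas_kanade_patch(prev, curr, y, x, half=4):
    """Estimate the (dx, dy) motion of the patch centred at (x, y)
    between two grayscale frames using one Lucas-Kanade step."""
    ys, xs = slice(y - half, y + half + 1), slice(x - half, x + half + 1)
    # Spatial gradients of the previous frame and the temporal difference.
    Iy, Ix = np.gradient(prev.astype(float))
    It = curr.astype(float) - prev.astype(float)
    ix = Ix[ys, xs].ravel()
    iy = Iy[ys, xs].ravel()
    it = It[ys, xs].ravel()
    # Solve the brightness-constancy system  Ix*dx + Iy*dy = -It
    # in the least-squares sense over the patch.
    A = np.stack([ix, iy], axis=1)
    b = -it
    flow, *_ = np.linalg.lstsq(A, b, rcond=None)
    return flow  # array [dx, dy]
```

In practice a pyramidal, multi-iteration tracker (such as the Lucas-Kanade implementation in a computer-vision library) would be applied to each selected face portion, yielding the array of motion vectors used in the later claims.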
15. The method of claims 1 to 14 wherein the step of determining whether detected movement corresponds with liveness of the subject is done by comparing the detected movement with predetermined data representing the instructed movement and determining whether the detected movement corresponds to the instructed movement.
16. The method of claims 1 to 15 wherein the detected movement and instructed movement are represented as an array of vectors representing the movement between two or more images.
17. The method of claim 16 wherein the detected array is compared to the instructed array to determine a degree of similarity between the detected movement and the instructed movement.
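The comparison of a detected vector array against an instructed (or exemplar) array in claims 15 to 17 could, for example, use a mean cosine similarity over corresponding vectors. The metric and the acceptance threshold below are assumptions for illustration; the claims do not fix a particular similarity measure.

```python
import numpy as np

def movement_similarity(detected, exemplar):
    """Mean cosine similarity between corresponding motion vectors of
    two arrays shaped (n_portions, 2). Result lies in [-1, 1]."""
    d = np.asarray(detected, dtype=float)
    e = np.asarray(exemplar, dtype=float)
    num = (d * e).sum(axis=1)
    den = np.linalg.norm(d, axis=1) * np.linalg.norm(e, axis=1)
    den = np.where(den == 0, 1.0, den)  # guard zero-length vectors
    return float(np.mean(num / den))

def movement_is_live(detected, exemplar, threshold=0.8):
    """Accept liveness if the detected motion sufficiently matches
    the instructed motion (threshold is a hypothetical value)."""
    return movement_similarity(detected, exemplar) >= threshold
```

A detected array that points in roughly the same directions as the exemplar scores near 1 and passes; motion unrelated to the instruction scores low and is rejected.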
18. The method of claims 1 to 17 wherein the subject enrolls by providing a series of preferred expressions for use in liveness testing.
19. The method of claim 18 wherein the subject demonstrates each preferred expression and an exemplar array of vector measurements of relative motions of face portions is compiled for these preferred expressions.
20. A method of determining liveness of a subject for eye biometric authentication on a computing device with associated processor and memories comprising:
- instructing a subject to position at least part of the subject's face within view of an image capture device of the computing device;
- displaying a color on at least part of a display of the computing device;
- obtaining an image of the subject including part of the face of the subject;
- detecting a reflection of the display on the face and determining whether or not the reflection corresponds to the color displayed on the display of the computing device; and
- determining whether the reflection corresponds with liveness of the subject.
21. The method of claim 20 wherein the reflection is detected in an eye of the subject.
22. The method of claim 21 wherein the shape and color of the reflection in the eye is detected.
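One simple way to realise the reflection test of claims 20 to 22 is to compare the colour direction of the detected reflection patch with the colour shown on the display. The cosine-of-colour-vectors criterion and the 0.9 threshold below are illustrative assumptions, not part of the claims; detecting the reflection's shape would require additional segmentation not shown here.

```python
import numpy as np

def reflection_matches(reflection_patch, displayed_rgb, min_corr=0.9):
    """Return True if the mean colour of the reflection patch
    (H x W x 3 array, values 0-255) points in roughly the same
    RGB direction as the colour shown on the display."""
    mean_rgb = reflection_patch.reshape(-1, 3).mean(axis=0)
    # Compare colour direction only, ignoring the overall brightness
    # of the reflection, which varies with ambient light.
    a = mean_rgb / max(np.linalg.norm(mean_rgb), 1e-9)
    b = np.asarray(displayed_rgb, dtype=float)
    b = b / max(np.linalg.norm(b), 1e-9)
    return float(a @ b) >= min_corr
```

A replayed video or printed photograph cannot reproduce the randomly chosen display colour in the corneal reflection, so a mismatch indicates a possible spoof.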
23. The method of claims 20 to 22 wherein the method includes the step of receiving input from at least one connected device and determining whether or not the input corresponds to liveness.
24. The method of claims 20 to 23 wherein the connected device is a microphone connected to the computing device, wherein the input is sound and corresponds to liveness if the subject's breathing is detected in the sound.
25. The method of claims 20 to 24 wherein the connected device is an accelerometer, wherein the input is acceleration of the computing device and is used to determine liveness.
26. The method of claims 20 to 25 wherein the connected device is the image capture device and the input is a degree of focus of different portions of the subject's face.
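The per-region focus cue of claim 26 can be sketched with a standard variance-of-Laplacian sharpness measure: on a real three-dimensional face, regions at different depths focus differently, whereas a flat printed photograph tends to be uniformly sharp or uniformly blurred. The discrete Laplacian below is a minimal stand-in for a library filter and is an assumption for illustration.

```python
import numpy as np

def focus_measure(gray):
    """Variance of a discrete 5-point Laplacian of a grayscale
    region: higher values indicate a sharper (in-focus) region."""
    g = gray.astype(float)
    lap = (np.roll(g, 1, axis=0) + np.roll(g, -1, axis=0)
           + np.roll(g, 1, axis=1) + np.roll(g, -1, axis=1) - 4.0 * g)
    return float(lap.var())
```

Comparing `focus_measure` across face portions (e.g. nose tip versus ears) then gives the depth-dependent focus differences the claim relies on.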
27. A method of determining liveness of a subject for eye biometric authentication substantially as described herein and illustrated in the accompanying drawings.
PCT/IB2015/053941 2014-05-30 2015-05-26 Method of determining liveness for eye biometric authentication Ceased WO2015181729A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
ZA2014/03979 2014-05-30
ZA201403979 2014-05-30

Publications (1)

Publication Number Publication Date
WO2015181729A1 true WO2015181729A1 (en) 2015-12-03

Family

ID=53434406

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2015/053941 Ceased WO2015181729A1 (en) 2014-05-30 2015-05-26 Method of determining liveness for eye biometric authentication

Country Status (1)

Country Link
WO (1) WO2015181729A1 (en)


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8437513B1 (en) 2012-08-10 2013-05-07 EyeVerify LLC Spoof detection for biometric authentication


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
"Recent Advances in Face Recognition", 1 June 2008, INTECH, ISBN: 978-9-53-761934-3, article GANG PAN ET AL: "Chapter 9 : Liveness Detection for Face Recognition", pages: 109 - 125, XP055166422, DOI: 10.5772/6397 *
EE-SIN NG ET AL: "Face verification using temporal affective cues", PATTERN RECOGNITION (ICPR), 2012 21ST INTERNATIONAL CONFERENCE ON, IEEE, 11 November 2012 (2012-11-11), pages 1249 - 1252, XP032329554, ISBN: 978-1-4673-2216-4 *
KLAUS KOLLREIDER ET AL: "Real-Time Face Detection and Motion Analysis With Application in Liveness Assessment", IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, IEEE, PISCATAWAY, NJ, US, vol. 2, no. 3, 1 September 2007 (2007-09-01), pages 548 - 558, XP011190354, ISSN: 1556-6013, DOI: 10.1109/TIFS.2007.902037 *
SCHUCKERS STEPHANIE A C: "SPOOFING AND ANTI-SPOOFING MEASURES", INFORMATION SECURITY TECHNICAL REPORT, ELSEVIER ADVANCED TECHNOLOGY, AMSTERDAM, NL, vol. 7, no. 4, 1 December 2002 (2002-12-01), pages 56 - 62, XP008071912, ISSN: 1363-4127, DOI: 10.1016/S1363-4127(02)00407-7 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108804884A (en) * 2017-05-02 2018-11-13 北京旷视科技有限公司 Identity authentication method, device and computer storage media
CN108804884B (en) * 2017-05-02 2020-08-07 北京旷视科技有限公司 Identity authentication method, device and computer storage medium
CN107316015A (en) * 2017-06-19 2017-11-03 南京邮电大学 A kind of facial expression recognition method of high accuracy based on depth space-time characteristic
CN107316015B (en) * 2017-06-19 2020-06-30 南京邮电大学 High-precision facial expression recognition method based on deep space-time characteristics
CN108875676A (en) * 2018-06-28 2018-11-23 北京旷视科技有限公司 Biopsy method, apparatus and system
CN108875676B (en) * 2018-06-28 2021-08-10 北京旷视科技有限公司 Living body detection method, device and system
US11195037B2 (en) 2018-06-28 2021-12-07 Beijing Kuangshi Technology Co., Ltd. Living body detection method and system, computer-readable storage medium
CN111105539A (en) * 2018-10-26 2020-05-05 珠海格力电器股份有限公司 Access control management method and device


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15730282

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15730282

Country of ref document: EP

Kind code of ref document: A1