
WO2021131029A1 - Filter generation device, estimation device, facial authentication system, filter generation method, and recording medium - Google Patents

Filter generation device, estimation device, facial authentication system, filter generation method, and recording medium

Info

Publication number
WO2021131029A1
WO2021131029A1 (application PCT/JP2019/051472)
Authority
WO
WIPO (PCT)
Prior art keywords
hostile
image
unit
filter
face recognition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2019/051472
Other languages
French (fr)
Japanese (ja)
Inventor
光佑 吉田
大生 近藤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Corp
Original Assignee
NEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corp
Priority to PCT/JP2019/051472
Publication of WO2021131029A1
Current legal status: Ceased

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis

Definitions

  • the present invention relates to a filter generation device, an estimation device, a face recognition system, a filter generation method, and a recording medium.
  • Patent Document 1 discloses a face recognition system that executes multitasking including a face recognition task and a biometric detection task.
  • An example of an object of the present invention is to provide a filter generation device, an estimation device, a face recognition system, a filter generation method, and a recording medium capable of solving the above problems.
  • The filter generation device includes a hostile perturbation acquisition means for acquiring a plurality of hostile perturbations, each being a difference between a face image and a hostile sample based on the face image, and a filter generation means for generating a filter based on the plurality of hostile perturbations.
  • The filter generation method includes a step of acquiring a plurality of hostile perturbations, each being a difference between a face image and a hostile sample based on the face image, and a step of generating a filter based on the plurality of hostile perturbations.
  • The recording medium records a program for causing a computer to execute a step of acquiring a plurality of hostile perturbations, each being a difference between a face image and a hostile sample based on the face image, and a step of generating a filter based on the plurality of hostile perturbations.
  • FIG. 5 is a flowchart showing an example of a processing procedure in which the estimation device according to the embodiment determines whether or not the face image to be authenticated is a hostile sample.
  • FIG. 1 is a schematic configuration diagram showing an example of a device configuration of a face recognition system according to an embodiment.
  • the face recognition system 1 includes a face recognition device 100, a filter generation device 200, and an estimation device 300.
  • Face recognition system 1 performs face recognition using an artificial intelligence model. Specifically, the face recognition system 1 inputs the face image of the authentication target person into the artificial intelligence model, acquires the feature vector of the face image, and authenticates the authentication target person using the obtained feature vector.
  • the artificial intelligence model referred to here is a model that has been trained by machine learning.
  • the face recognition system 1 estimates the possibility of a hostile sample (Adversarial Example) for the face image of the person to be authenticated.
  • The hostile sample (Adversarial Example) referred to here is an image obtained by processing an original image such that a human classifies it into the same class as the original image, whereas classification by artificial intelligence places it in a class different from that of the original image. Processing an original image to generate such an image is also referred to as hostile sampling.
  • The processing applied in a hostile sample is referred to as adding "noise" because the change is not significant enough for a human to classify the image into a different class.
  • As hostile samples, in particular, examples of intentionally deceiving neural network models trained by deep learning are known.
  • the artificial intelligence model used by the face recognition system 1 is not limited to deep learning, and can be various artificial intelligence models in which classification errors due to hostile samples can occur.
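  • The patent itself does not name a specific attack method. As a hedged illustration only, the sketch below uses the idea of the well-known fast gradient sign method (FGSM, Goodfellow et al.) against a toy linear classifier to show how small, humanly imperceptible noise can flip a model's class decision; every name and value here is an assumption for illustration.

```python
# Minimal FGSM-style sketch (not from the patent); the linear "model" is a stand-in.
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=64 * 64)           # weights of a toy linear classifier
x = rng.uniform(0.0, 1.0, 64 * 64)     # flattened face image with values in [0, 1]

def predict(img: np.ndarray) -> int:
    """Toy binary classifier: positive score -> class 1, otherwise class 0."""
    return int(w @ img > 0)

# For a linear model the gradient of the score with respect to the input is simply w.
epsilon = 0.03                          # noise kept small so a human sees no change
noise = -epsilon * np.sign(w) if predict(x) == 1 else epsilon * np.sign(w)
x_adv = np.clip(x + noise, 0.0, 1.0)    # hostile sample candidate

print(predict(x), predict(x_adv))       # the two labels may differ despite the tiny noise
```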
  • If a hostile sample is used, the face recognition system 1 may perform erroneous authentication. Therefore, by estimating the possibility of a hostile sample, the face recognition system 1 is expected to prevent or reduce the damage caused by erroneous authentication when a hostile sample is used.
  • the face recognition system 1 may notify the possibility of a hostile sample together with the authentication result.
  • the face recognition system 1 may determine whether or not the sample is hostile and notify the determination result.
  • the face recognition system 1 may calculate a score (index value) indicating the magnitude of the possibility of the hostile sample and notify the calculated score.
  • For example, when the face recognition system 1 performs login authentication for a certain system such as a computer system, it may determine whether or not the face image is a hostile sample in addition to performing face recognition. When the face recognition system 1 determines that the image is a hostile sample, login may not be permitted regardless of the result of the face recognition.
  • When the face recognition system 1 is provided with a gate opening / closing mechanism and permits entrance by opening the gate, it may determine whether or not the face image is a hostile sample in addition to performing face recognition. When the face recognition system 1 determines that the image is a hostile sample, entrance may not be permitted, by keeping the gate closed, regardless of the result of the face recognition.
  • One example of a hostile sample in face recognition is the use of eyeglasses. If the person to be authenticated wears glasses to which hostile noise has been applied, the image captured by the camera may become a hostile sample. In this case, the face recognition system 1 can prevent or reduce the damage caused by erroneous authentication as described above by estimating the possibility that the captured image is a hostile sample.
  • Another example is a case where the registered image for face recognition is a hostile sample.
  • For example, a target person who wants to manipulate the authentication result illegally may submit a photograph that is a hostile sample.
  • Alternatively, a target person who wants to manipulate the authentication result illegally may be photographed while wearing glasses to which hostile noise has been applied.
  • In such cases, the face recognition system 1 may estimate the possibility that the registered image is a hostile sample. For example, when registering an image, the face recognition system 1 may determine whether or not the image to be registered is a hostile sample, and if it determines that the image is a hostile sample, it may stop the registration and output an alarm message. Alternatively, the face recognition system 1 may determine whether or not a registered image is a hostile sample at the time of executing face recognition, or periodically by background processing or the like.
  • the face recognition device 100 performs the face recognition described above for the face recognition system 1.
  • the face recognition device 100 may be an existing one as long as it can output the result of class classification in face recognition and the feature vector of the face image calculated in the process of face recognition. That is, the face recognition system 1 may be configured by using the existing face recognition device as the face recognition device 100.
  • The result of the class classification in face recognition referred to here is the judgment result of whether the person to be authenticated is judged to be one of the pre-registered persons, or is judged to be different from every registered person.
  • the feature vector of the image referred to here is a vector indicating the feature amount of the image.
  • the machine learning model receives the input of the face image and outputs the feature vector of the face image.
  • the estimation device 300 estimates the possibility of the hostile sample described above for the face recognition system 1.
  • the filter generation device 200 generates a filter (image filter) for the estimation device 300 to use for estimating the possibility of a hostile sample.
  • the filter generated by the filter generation device 200 is also referred to as a hostile sample determination filter.
  • Each of the face recognition device 100, the filter generation device 200, and the estimation device 300 may be configured by using a computer such as a personal computer (PC) or a workstation (WS). Alternatively, one or more of the face recognition device 100, the filter generation device 200, and the estimation device 300 may be configured by using dedicated hardware.
  • the computer constituting the face recognition device 100 may simulate the function of the neural network.
  • the face recognition device 100 may include a neural network configured in hardware.
  • the face recognition device 100, the filter generation device 200, and the estimation device 300 may be configured as separate devices. Alternatively, any two or more of the face recognition device 100, the filter generation device 200, and the estimation device 300 may be configured as an integrated device. For example, the face recognition device 100 and the estimation device 300 may be configured as an integrated device using one computer. Then, as described above for the face recognition system 1, this device may perform processing based on the estimation result of the possibility of the hostile sample in addition to the result of the face recognition.
  • FIG. 2 is a schematic block diagram showing an example of the functional configuration of the face recognition device 100.
  • the face recognition device 100 includes a first communication unit 110, an authentication image acquisition unit 120, an authentication result processing unit 130, a first storage unit 180, and a first control unit 190.
  • the first storage unit 180 includes an authentication database unit 181.
  • the first control unit 190 includes a feature vector calculation unit 191 and a classification unit 192.
  • the first communication unit 110 communicates with another device.
  • the first communication unit 110 receives the face image generated by the filter generation device 200 as a candidate for the hostile sample and the original face image thereof.
  • the first communication unit 110 transmits the result of the classification in the face recognition to the filter generation device 200 for each of the hostile sample candidate and the original face image.
  • The results of the class classification in face recognition are used by the filter generation device 200 to determine whether or not a hostile sample candidate is a hostile sample and to collect hostile samples when generating the hostile sample determination filter.
  • The first communication unit 110 transmits the face image to be authenticated to the estimation device 300, and receives, from the estimation device 300, the face image obtained by the estimation device 300 applying the hostile sample determination filter to the face image to be authenticated.
  • the face image to be authenticated is also referred to as a first image.
  • a face image obtained by applying a hostile sample determination filter to the first image is also referred to as a second image.
  • the first communication unit 110 transmits the feature vector of the first image and the feature vector of the second image to the estimation device 300.
  • the feature vector of the first image is also referred to as a first feature vector.
  • the feature vector of the second image is also referred to as a second feature vector.
  • the first feature vector and the second feature vector transmitted by the first communication unit 110 to the estimation device 300 are used by the estimation device 300 to estimate the possibility that the face image to be authenticated is a hostile sample.
  • the authentication image acquisition unit 120 acquires a face image to be authenticated.
  • the authentication image acquisition unit 120 may be configured to include a camera and capture a face image of the person to be authenticated.
  • the authentication image acquisition unit 120 may be configured to include a camera or a scanner, and may read a face photograph of a person to be authenticated, for example, a face photograph of a passport or a face photograph of a driver's license.
  • the first communication unit 110 may be used as the authentication image acquisition unit 120 to receive the face image from another device, such as receiving the face image transmitted by the client device on the Internet.
  • The authentication result processing unit 130 performs processing according to the result of face recognition by the face recognition device 100.
  • The processing performed by the authentication result processing unit 130 can be various kinds of processing depending on the usage pattern of the face recognition system 1. For example, when the face recognition system 1 performs login authentication for a certain system and the face recognition succeeds, the authentication result processing unit 130 may perform login processing for the user identified by the face recognition as the person to be authenticated. Successful face recognition in this case means that the person to be authenticated is identified as one of the users registered in the authentication database.
  • When the face recognition fails, the authentication result processing unit 130 may control the display device of the system to display a message indicating that the authentication has failed. Failed authentication in this case means that the person to be authenticated is determined to be different from every user registered in the authentication database.
  • the authentication result processing unit 130 may control the opening / closing of the gate according to the result of face recognition.
  • When the face recognition succeeds, the authentication result processing unit 130 may open the gate.
  • Successful face recognition in this case means that the person to be authenticated is identified as one of the admission persons registered in the authentication database.
  • When the face recognition fails, the authentication result processing unit 130 may keep the gate closed. Failed authentication in this case means that the person to be authenticated is determined to be different from every admitted person registered in the authentication database.
  • the first storage unit 180 stores various data.
  • the function of the first storage unit 180 is executed by using the storage device included in the face recognition device 100.
  • the authentication database unit 181 stores the authentication database.
  • the authentication database referred to here is data to be compared with the data of the authentication target person in face authentication.
  • the authentication database contains data of candidates whose authentication target is identified by face recognition.
  • The data of each candidate includes a feature vector of the candidate's face image. In the class classification in face recognition, each candidate, as well as "no matching candidate", corresponds to an example of a class. Determining which candidate the person to be authenticated is, or determining that the person to be authenticated is different from every candidate, corresponds to an example of class classification.
  • the authentication database includes data for each user.
  • the data for each user in this case includes a feature vector of the user's face image, the user name of the user, and setting information when the user logs in.
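  • As a minimal illustration, the sketch below models one authentication-database entry with the fields listed above; the class and field names are assumptions, not part of the patent.

```python
# A minimal sketch of a per-user authentication-database entry (assumed names).
from dataclasses import dataclass, field
import numpy as np

@dataclass
class AuthDatabaseEntry:
    user_name: str
    feature_vector: np.ndarray                            # feature vector of the user's face image
    login_settings: dict = field(default_factory=dict)    # per-user settings applied at login

# Example database keyed by user name.
auth_database = {
    "alice": AuthDatabaseEntry("alice", np.zeros(128), {"locale": "ja_JP"}),
}
```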
  • the first control unit 190 controls each unit of the face recognition device 100 to execute various processes.
  • the function of the first control unit 190 is executed, for example, by the CPU (Central Processing Unit) included in the face authentication device 100 reading a program from the first storage unit 180 and executing the program.
  • The feature vector calculation unit 191 is configured to include the above-mentioned artificial intelligence model, and calculates the feature vector of a face image. Specifically, the feature vector calculation unit 191 calculates the feature vector (first feature vector) of the face image to be authenticated. In addition, the feature vector calculation unit 191 calculates the feature vectors of the hostile sample candidates generated and transmitted by the filter generation device 200, and the feature vector (second feature vector) of the second image generated and transmitted by the estimation device 300.
  • the classification unit 192 classifies the face in face recognition using the feature vector of the face image to be authenticated. For example, the classification unit 192 calculates a score that quantitatively indicates the degree of similarity between the feature vector of the face image of the candidate and the feature vector of the face image to be authenticated for each candidate shown in the authentication database. The score calculated by the classification unit 192 is also referred to as an authentication score.
  • the classification unit 192 may calculate the authentication score indicating that the smaller the score, the greater the degree of similarity of the feature vectors.
  • When all of the calculated authentication scores are less than the authentication threshold value, the classification unit 192 determines that the person to be authenticated is different from every candidate. On the other hand, when any one or more of the calculated authentication scores is equal to or higher than the authentication threshold value, the classification unit 192 determines that the person to be authenticated is the candidate for whom the largest authentication score was calculated.
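  • The sketch below illustrates the scoring and thresholding described above. It assumes cosine similarity as the authentication score (the patent leaves the concrete score open) and the larger-is-more-similar convention; the names and threshold value are assumptions.

```python
# A minimal sketch of authentication scoring and class classification.
import numpy as np

def authentication_score(candidate_vec: np.ndarray, target_vec: np.ndarray) -> float:
    """Quantify similarity between a registered candidate and the person to be authenticated."""
    return float(candidate_vec @ target_vec /
                 (np.linalg.norm(candidate_vec) * np.linalg.norm(target_vec)))

def classify(candidates: dict, target_vec: np.ndarray, threshold: float = 0.8):
    """Return the best-matching candidate name, or None if every score is below the threshold."""
    scores = {name: authentication_score(vec, target_vec) for name, vec in candidates.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None
```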
  • FIG. 3 is a schematic block diagram showing an example of the functional configuration of the filter generation device 200.
  • the filter generation device 200 includes a second communication unit 210, a second storage unit 280, and a second control unit 290.
  • The second control unit 290 includes a hostile perturbation acquisition unit 291, a filter generation image acquisition unit 292, a noise addition unit 293, a hostile sample selection unit 294, an acquisition processing unit 295, and a filter generation unit 296.
  • the second communication unit 210 communicates with other devices.
  • the second communication unit 210 transmits the face image generated by the filter generation device 200 as a candidate for the hostile sample and the original face image thereof to the face recognition device 100.
  • the second communication unit 210 receives the result of the classification in the face recognition from the face recognition device 100 for each of the hostile sample candidate and the original face image.
  • the second storage unit 280 stores various data.
  • the function of the second storage unit 280 is executed by using the storage device included in the filter generation device 200.
  • the second control unit 290 controls each unit of the filter generation device 200 to execute various processes.
  • the function of the second control unit 290 is executed, for example, by the CPU included in the filter generation device 200 reading a program from the second storage unit 280 and executing the program.
  • the hostile perturbation acquisition unit 291 acquires a plurality of adversarial perturbations due to the difference between the face image and the hostile sample based on the face image.
  • the hostile sample based on the face image referred to here is a face image as a hostile sample in which noise is added to the face image.
  • the hostile perturbation here is the difference obtained by subtracting the original facial image from the hostile sample. Therefore, a hostile perturbation is an image that corresponds to the noise added to the original facial image to generate a hostile sample.
  • the hostile perturbation acquisition unit 291 corresponds to an example of a hostile perturbation acquisition means.
  • the method by which the hostile perturbation acquisition unit 291 acquires a hostile perturbation is not limited to a specific method.
  • When the hostile perturbation acquisition unit 291 generates a hostile sample by adding noise to a face image, that noise may be used as the hostile perturbation.
  • Alternatively, the hostile perturbation acquisition unit 291 may acquire an existing hostile sample and its original face image, and calculate, as the hostile perturbation, the image obtained by taking the difference of the pixel values of each pair of corresponding pixels.
  • the corresponding pixels referred to here are pixels having the same relative position in the image. Therefore, the pixels that overlap when a plurality of images of the same size are overlapped with their edges aligned correspond to the corresponding pixels.
  • Equal image size means that both the number of vertical pixels and the number of horizontal pixels of a plurality of images are the same.
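  • A minimal sketch of obtaining a hostile perturbation as the pixel-wise difference between a hostile sample and its original face image of the same size; the function name is an assumption.

```python
# A minimal sketch: hostile perturbation = hostile sample minus original image, pixel by pixel.
import numpy as np

def hostile_perturbation(hostile_sample: np.ndarray, original: np.ndarray) -> np.ndarray:
    """Subtract the original face image from the hostile sample for each corresponding pixel."""
    if hostile_sample.shape != original.shape:
        raise ValueError("images must have the same number of vertical and horizontal pixels")
    return hostile_sample.astype(np.float32) - original.astype(np.float32)
```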
  • Alternatively, the hostile perturbation acquisition unit 291 may acquire an existing hostile sample and acquire the hostile perturbation by using a known hostile perturbation calculation method.
  • the hostile perturbation acquisition unit 291 may acquire an existing hostile perturbation.
  • the plurality of hostile perturbations acquired by the hostile perturbation acquisition unit 291 may all be calculated using the same facial image.
  • the hostile perturbation acquisition unit 291 may acquire the hostile perturbation calculated by using the face image for each of the plurality of face images.
  • the plurality of face images may be the face images of the same person.
  • facial images of a plurality of persons may be included.
  • the filter generation image acquisition unit 292 acquires a face image that is the source of the hostile sample.
  • the face image acquired by the filter generation image acquisition unit 292 is also referred to as a filter generation image.
  • the method by which the filter generation image acquisition unit 292 acquires the filter generation image is not limited to a specific method.
  • the filter generation device 200 may include a camera, and the filter generation image acquisition unit 292 may acquire the face image captured by the camera as the filter generation image.
  • the authentication image acquisition unit 120 of the face recognition device 100 may include a camera, and the filter generation image acquisition unit 292 may acquire the face image captured by the camera as a filter generation image.
  • the filter generation image acquisition unit 292 may acquire the existing face image, such as the second communication unit 210 receiving the existing face image from another device.
  • the noise addition unit 293 acquires candidates for hostile samples by adding noise to the filter generation image.
  • the noise addition unit 293 may randomly generate a noise image in which the amount of increase / decrease in the brightness value of each pixel is within a predetermined range, and add the generated noise to the filter generation image.
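  • A minimal sketch of this noise addition, assuming an 8-bit grayscale image and an arbitrarily chosen range for the per-pixel brightness shift.

```python
# A minimal sketch: random per-pixel brightness noise within a predetermined range.
import numpy as np

def add_random_noise(image: np.ndarray, max_shift: float = 8.0, rng=None):
    """Return (hostile sample candidate, noise image) for an 8-bit grayscale face image."""
    rng = rng or np.random.default_rng()
    noise = rng.uniform(-max_shift, max_shift, size=image.shape)
    candidate = np.clip(image.astype(np.float32) + noise, 0, 255)
    return candidate, noise
```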
  • the hostile sample selection unit 294 selects a hostile sample from the hostile sample candidates generated by the noise addition unit 293.
  • the method by which the hostile sample selection unit 294 selects a hostile sample is not limited to a specific method.
  • the face recognition device 100 may classify each of the hostile sample candidate and the original face image (filter generation image) in face recognition, and transmit the classification result to the filter generation device 200. Then, the hostile sample selection unit 294 may select, among the candidates for the hostile sample, a candidate classified into a class different from the filter generation image as the hostile sample.
  • Alternatively, the filter generation device 200 may include a display device and an input device, display the hostile sample candidates on the display device, and accept a user operation indicating the determination result via the input device. The hostile sample selection unit 294 may then select, from the hostile sample candidates, the images that were manually determined to be hostile samples.
  • The acquisition processing unit 295 acquires hostile perturbations. Specifically, the acquisition processing unit 295 acquires, as a hostile perturbation, the noise that the noise addition unit 293 added to the filter generation image in order to generate the hostile sample selected by the hostile sample selection unit 294.
  • the filter generation unit 296 generates a hostile sample determination filter based on a plurality of hostile perturbations. Therefore, the hostile perturbation acquisition unit 291 may acquire a plurality of hostile perturbations.
  • the filter generation unit 296 generates a filter for determining a hostile sample by synthesizing a plurality of hostile perturbations, for example, obtaining an average or a weighted sum of a plurality of hostile perturbations.
  • the filter generation unit 296 corresponds to an example of a filter generation means.
  • the average of a plurality of hostile perturbations is one image obtained by calculating the average of pixel values for each corresponding pixel for a plurality of hostile perturbations as a plurality of images having the same size.
  • The weighted sum of a plurality of hostile perturbations is one image obtained by treating the plurality of hostile perturbations as a plurality of images of the same size, setting a weighting coefficient for each hostile perturbation, and calculating the weighted sum of the pixel values of each corresponding pixel.
  • The filter generation unit 296 may multiply each of the plurality of hostile perturbations by a weighting coefficient that becomes larger as the degree of difference between the original image and the hostile sample becomes larger.
  • As the degree of difference between the original image and the hostile sample, for example, the distance between the feature vector of the original image and the feature vector of the hostile sample can be used.
  • As the distance, for example, the L1 norm, L2 norm, L∞ norm, or cosine distance can be used.
  • The weighting coefficients used by the filter generation unit 296 for the weighted sum are not limited to specific ones.
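  • The sketch below illustrates both synthesis options described above (plain average and weighted sum). Using the L2 distance between feature vectors as the weight and normalizing the weights are added assumptions, as are the function names.

```python
# A minimal sketch of synthesizing the hostile sample determination filter from perturbations.
import numpy as np

def average_filter(perturbations: list) -> np.ndarray:
    """Pixel-wise average of same-sized perturbation images."""
    return np.mean(np.stack(perturbations), axis=0)

def weighted_filter(perturbations: list, original_vecs: list, hostile_vecs: list) -> np.ndarray:
    """Pixel-wise weighted sum; a larger original-vs-hostile feature distance gives a larger weight."""
    weights = np.array([np.linalg.norm(o - h) for o, h in zip(original_vecs, hostile_vecs)])
    weights = weights / weights.sum()          # normalization is an added assumption
    return np.tensordot(weights, np.stack(perturbations), axes=1)
```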
  • FIG. 4 is a schematic block diagram showing an example of the functional configuration of the estimation device 300.
  • the estimation device 300 includes a third communication unit 310, a determination result processing unit 320, a third storage unit 380, and a third control unit 390.
  • the third control unit 390 includes a first image acquisition unit 391, a filter application unit 392, a score calculation unit 393, and a determination unit 394.
  • the third communication unit 310 communicates with other devices.
  • For example, the third communication unit 310 receives the face image to be authenticated (first image) from the face recognition device 100, and transmits to the face recognition device 100 the face image (second image) obtained by the estimation device 300 applying the hostile sample determination filter to the face image to be authenticated.
  • the third communication unit 310 receives the feature vector of the first image and the feature vector of the second image from the face recognition device 100.
  • When the estimation device 300 determines that the face image to be authenticated is a hostile sample, the determination result processing unit 320 performs processing according to the determination result.
  • the processing performed by the determination result processing unit 320 can be various processing depending on the usage pattern of the face authentication system 1.
  • the process performed by the determination result processing unit 320 is also referred to as a hostile sample response process.
  • For example, the determination result processing unit 320 may notify a person of the determination result, such as by displaying the determination result that the face image to be authenticated is a hostile sample.
  • Alternatively, the determination result processing unit 320 may notify a person of a score, such as by displaying a score indicating the possibility that the face image to be authenticated is a hostile sample, regardless of whether it has been determined that the image is a hostile sample.
  • For example, the third communication unit 310 may function as the determination result processing unit 320 and transmit, to the face recognition device 100, a notification that the face image to be authenticated is a hostile sample.
  • In this case, the authentication result processing unit 130 of the face recognition device 100 may perform the processing used when face recognition fails, regardless of the result of the face recognition.
  • the third storage unit 380 stores various data.
  • the function of the third storage unit 380 is executed by using the storage device included in the estimation device 300.
  • the third control unit 390 controls each unit of the estimation device 300 to execute various processes.
  • the function of the third control unit 390 is executed, for example, by the CPU included in the estimation device 300 reading a program from the third storage unit 380 and executing the program.
  • The first image acquisition unit 391 acquires the first image. Specifically, the face recognition device 100 transmits the face image to be authenticated (first image), acquired for face recognition, to the estimation device 300. In the estimation device 300, the first image acquisition unit 391 extracts the first image from the signal that the third communication unit 310 received from the face recognition device 100.
  • the filter application unit 392 applies a hostile sample determination filter to the first image acquired by the first image acquisition unit 391 to acquire the second image. For example, the filter application unit 392 adds the value of each pixel of the first image to the value of the corresponding pixel in the hostile sample determination filter as the application of the hostile sample determination filter to the first image.
  • the filter application unit 392 corresponds to an example of the filter application means.
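  • A minimal sketch of this filter application by pixel-wise addition; clipping the result back to the 8-bit range is an added assumption, not stated in the patent.

```python
# A minimal sketch: second image = first image + hostile sample determination filter, per pixel.
import numpy as np

def apply_filter(first_image: np.ndarray, determination_filter: np.ndarray) -> np.ndarray:
    """Return the second image used for hostile sample estimation."""
    second = first_image.astype(np.float32) + determination_filter
    return np.clip(second, 0, 255)
```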
  • the score calculation unit 393 calculates a score that quantitatively indicates the degree of difference between the first feature vector (feature vector of the first image) and the second feature vector (feature vector of the second image).
  • the score calculated by the score calculation unit 393 is also referred to as a hostile sample score.
  • the method in which the score calculation unit 393 calculates the hostile sample score and the method in which the classification unit 192 calculates the authentication score may be the same method or different methods.
  • the score calculation unit 393 corresponds to an example of the score calculation means.
  • the method by which the score calculation unit 393 calculates the hostile sample score is not limited to a specific method.
  • the score calculation unit 393 may calculate a hostile sample score indicating that the larger the score, the greater the degree of difference between the first feature vector and the second feature vector.
  • the score calculation unit 393 may calculate a hostile sample score indicating that the smaller the score, the greater the degree of difference between the first feature vector and the second feature vector.
  • The score calculation unit 393 may calculate any of the L1 norm, L2 norm, L∞ norm, or cosine distance between the first feature vector and the second feature vector as the hostile sample score.
  • the cosine distance is 1 minus the cosine similarity.
  • These hostile sample scores correspond to the example of the hostile sample scores showing that the larger the score, the greater the degree of difference between the first feature vector and the second feature vector.
  • the score calculation unit 393 may calculate either the cosine similarity between the first feature vector and the second feature vector or the inner product as a hostile sample score.
  • These hostile sample scores correspond to examples of hostile sample scores indicating that the smaller the score, the greater the degree of difference between the first feature vector and the second feature vector.
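  • The sketch below gathers the score choices listed above into one helper; which metric to use is left open by the patent, so these are interchangeable examples, and the function names are assumptions.

```python
# A minimal sketch of the hostile sample score between the first and second feature vectors.
import numpy as np

def cosine_similarity(v1: np.ndarray, v2: np.ndarray) -> float:
    return float(v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2)))

def hostile_sample_score(v1: np.ndarray, v2: np.ndarray, metric: str = "l2") -> float:
    """Quantify the difference between the first and second feature vectors."""
    if metric == "l1":
        return float(np.sum(np.abs(v1 - v2)))       # larger score = larger difference
    if metric == "l2":
        return float(np.linalg.norm(v1 - v2))        # larger score = larger difference
    if metric == "linf":
        return float(np.max(np.abs(v1 - v2)))        # larger score = larger difference
    if metric == "cosine_distance":
        return 1.0 - cosine_similarity(v1, v2)        # larger score = larger difference
    if metric == "cosine_similarity":
        return cosine_similarity(v1, v2)              # smaller score = larger difference
    raise ValueError(f"unknown metric: {metric}")
```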
  • the hostile sample score calculated by the score calculation unit 393 is used as a score indicating the degree of possibility that the face image to be authenticated is a hostile sample. It is presumed that the greater the degree of difference between the first feature vector and the second feature vector indicated by the hostile sample score, the higher the possibility that the face image to be authenticated is a hostile sample.
  • When a larger hostile sample score indicates a greater degree of difference between the first feature vector and the second feature vector, a larger hostile sample score indicates a higher possibility that the face image to be authenticated is a hostile sample.
  • Conversely, when a smaller hostile sample score indicates a greater degree of difference between the first feature vector and the second feature vector, a smaller hostile sample score indicates a higher possibility that the face image to be authenticated is a hostile sample.
  • Using the hostile sample score as a score indicating the magnitude of the possibility that the face image to be authenticated is a hostile sample is based on the finding that "the greater the degree of difference between the first image and the second image obtained by applying the hostile sample determination filter to the first image, the higher the possibility that the first image is a hostile sample"; positive results supporting this have been obtained.
  • The determination unit 394 determines whether or not the face image to be authenticated is a hostile sample based on the hostile sample score calculated by the score calculation unit 393. For example, the determination unit 394 compares the hostile sample score with a predetermined determination threshold value. When a larger hostile sample score indicates a higher possibility that the face image to be authenticated is a hostile sample, the determination unit 394 determines that the face image to be authenticated is a hostile sample when the hostile sample score is equal to or greater than the determination threshold value, and determines that it is not a hostile sample when the hostile sample score is less than the determination threshold value.
  • Conversely, when a smaller hostile sample score indicates a higher possibility that the face image to be authenticated is a hostile sample, the determination unit 394 determines that the face image to be authenticated is a hostile sample when the hostile sample score is equal to or less than the determination threshold value, and determines that it is not a hostile sample when the hostile sample score is greater than the determination threshold value.
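  • A minimal sketch of this threshold comparison; the threshold value and parameter names are assumptions and would be tuned to the chosen score metric.

```python
# A minimal sketch of the hostile sample determination by threshold comparison.
def is_hostile_sample(score: float, threshold: float, larger_means_hostile: bool = True) -> bool:
    """Decide whether the face image to be authenticated is judged to be a hostile sample."""
    return score >= threshold if larger_means_hostile else score <= threshold
```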
  • the determination result processing unit 320 described above performs processing according to the determination result of the determination unit 394.
  • the estimation device 300 may output a score indicating the degree of possibility that the face image to be authenticated is a hostile sample instead of the determination result of whether or not the face image to be authenticated is a hostile sample. In this case, the estimation device 300 does not have to include the determination unit 394.
  • the estimation device 300 may output the hostile sample score as it is as a score indicating the magnitude of the possibility that the face image to be authenticated is a hostile sample.
  • the estimation device 300 may process and output the hostile sample score.
  • the estimation device 300 may convert the hostile sample score into a score indicating the magnitude of the possibility that the face image to be authenticated is a hostile sample and output it.
  • The estimation device 300 may estimate the possibility that a registered image or an image to be registered is a hostile sample, in addition to or instead of the face image to be authenticated, and may perform processing according to the estimation result.
  • FIG. 5 is a flowchart showing an example of a processing procedure in which the filter generation device 200 generates a filter for hostile sample determination.
  • the filter generation device 200 performs the processing of FIG. 5, for example, when the generation of the hostile sample determination filter is instructed by a user operation.
  • the filter generation image acquisition unit 292 acquires the filter generation image (step S101).
  • the noise addition unit 293 adds noise to the filter generation image acquired by the filter generation image acquisition unit 292 to generate hostile sample candidates (step S102).
  • the hostile sample selection unit 294 acquires the result of the classification in the face recognition of the face recognition device 100 for each of the filter generation image and the hostile sample candidate (step S103).
  • the second communication unit 210 transmits a filter generation image and a candidate for a hostile sample to the face recognition device 100.
  • the face recognition device 100 classifies each of the filter generation image and the hostile sample candidate into classes, and transmits the classification result to the filter generation device 200.
  • The hostile sample selection unit 294 then extracts the class classification results of the filter generation image and the hostile sample candidate from the signal that the second communication unit 210 received from the face recognition device 100.
  • the hostile sample selection unit 294 determines whether or not the candidate for the hostile sample is a hostile sample (step S104). Specifically, when the filter generation image and the hostile sample candidate are classified into different classes, the hostile sample selection unit 294 determines that the hostile sample candidate is a hostile sample. On the other hand, when the filter generation image and the hostile sample candidate are classified into the same class, the hostile sample selection unit 294 determines that the hostile sample candidate is not a hostile sample.
  • If the hostile sample selection unit 294 determines that the hostile sample candidate is a hostile sample (step S104: YES), the acquisition processing unit 295 acquires the hostile perturbation and stores it in the second storage unit 280 (step S111). For example, the acquisition processing unit 295 acquires, as the hostile perturbation, the noise that the noise addition unit 293 added to the filter generation image in step S102.
  • After step S111, the processing proceeds to step S112, in which the acquisition processing unit 295 determines whether or not the number of acquired hostile perturbations has reached a predetermined number. If the hostile sample selection unit 294 determines in step S104 that the hostile sample candidate is not a hostile sample (step S104: NO), the processing likewise proceeds to step S112.
  • If the acquisition processing unit 295 determines in step S112 that the number of acquired hostile perturbations has not reached the predetermined number (step S112: NO), the processing returns to step S102. On the other hand, if the acquisition processing unit 295 determines that the predetermined number has been reached (step S112: YES), the filter generation unit 296 generates the hostile sample determination filter based on the acquired hostile perturbations (step S121). After step S121, the filter generation device 200 ends the process of FIG. 5.
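  • Putting the steps of FIG. 5 together, the following sketch shows one possible shape of the loop (steps S101 to S121). The classify_fn callback stands in for the face recognition device's class classification, averaging is used as the synthesis method, and all numeric values and names are assumptions.

```python
# A minimal sketch of the filter generation loop of FIG. 5 (steps S101 to S121).
import numpy as np

def generate_determination_filter(filter_generation_image: np.ndarray,
                                  classify_fn,
                                  required_perturbations: int = 100,
                                  max_shift: float = 8.0,
                                  max_attempts: int = 100_000) -> np.ndarray:
    """Collect noise images that change the class of the filter generation image, then average them."""
    rng = np.random.default_rng()
    original_class = classify_fn(filter_generation_image)          # class of the filter generation image
    perturbations = []
    for _ in range(max_attempts):                                   # guard so the sketch always terminates
        if len(perturbations) >= required_perturbations:            # step S112: predetermined number reached
            break
        noise = rng.uniform(-max_shift, max_shift, filter_generation_image.shape)   # step S102
        candidate = np.clip(filter_generation_image.astype(np.float32) + noise, 0, 255)
        if classify_fn(candidate) != original_class:                 # steps S103 to S104: hostile sample?
            perturbations.append(noise)                              # step S111: keep the noise as a perturbation
    if not perturbations:
        raise RuntimeError("no hostile samples were found within max_attempts")
    return np.mean(np.stack(perturbations), axis=0)                  # step S121: average into the filter
```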
  • FIG. 6 is a flowchart showing an example of a processing procedure in which the estimation device 300 determines whether or not the face image to be authenticated is a hostile sample. For example, when the face recognition device 100 starts face recognition, it transmits a signal instructing the start of the process of FIG. 6 to the estimation device 300. The estimation device 300 starts the process of FIG. 6 in response to an instruction from the face recognition device 100.
  • the first image acquisition unit 391 acquires the first image (face image to be authenticated) (step S211). Specifically, the face recognition device 100 transmits the first image to the estimation device 300. The first image acquisition unit 391 extracts the first image from the signal received from the face recognition device 100 by the third communication unit 310. The first image acquisition unit 391 may acquire the registered image or the image to be registered as the first image in addition to or instead of the face image to be authenticated. The filter application unit 392 applies a hostile sample determination filter to the first image acquired by the first image acquisition unit 391 to generate a second image (step S212).
  • the score calculation unit 393 acquires the second feature vector (step S213).
  • the third communication unit 310 transmits the second image to the face recognition device 100.
  • In the face recognition device 100, the feature vector calculation unit 191 calculates the feature vector (second feature vector) of the second image, and the first communication unit 110 transmits the obtained second feature vector to the estimation device 300.
  • the score calculation unit 393 extracts the second feature vector from the signal received from the face recognition device 100 by the third communication unit 310.
  • the score calculation unit 393 acquires the first feature vector (step S221).
  • Specifically, in the face recognition device 100, the feature vector calculation unit 191 calculates the feature vector (first feature vector) of the face image to be authenticated (first image), and the first communication unit 110 transmits the obtained first feature vector to the estimation device 300.
  • the score calculation unit 393 extracts the first feature vector from the signal received from the face recognition device 100 by the third communication unit 310.
  • The estimation device 300 may perform the processes of steps S211 to S213 and the process of step S221 in parallel, or may perform them sequentially.
  • Next, the score calculation unit 393 calculates a hostile sample score (step S231). As described above, the method by which the score calculation unit 393 calculates the hostile sample score is not limited to a specific method. Then, the determination unit 394 determines whether or not the face image to be authenticated is a hostile sample (step S232). For example, as described above, the determination unit 394 compares the hostile sample score with the determination threshold value to determine whether or not the face image to be authenticated is a hostile sample.
  • When it is determined in step S232 that the face image to be authenticated is a hostile sample (step S232: YES), the determination result processing unit 320 executes the hostile sample response processing (step S233). As described above, the processing performed by the determination result processing unit 320 can vary depending on the usage pattern of the face recognition system 1. After step S233, the estimation device 300 ends the process of FIG. 6. On the other hand, when it is determined in step S232 that the face image to be authenticated is not a hostile sample (step S232: NO), the estimation device 300 ends the process of FIG. 6.
  • the hostile perturbation acquisition unit 291 acquires a plurality of hostile perturbations.
  • the filter generation unit 296 generates a filter for determining a hostile sample based on a plurality of hostile perturbations acquired by the hostile perturbation acquisition unit 291. According to the filter generation device 200, a filter for determining a hostile sample can be obtained. Using the hostile sample determination filter, it is possible to estimate the possibility that the face image to be authenticated is a hostile sample.
  • the filter generation unit 296 calculates the hostile sample determination filter by averaging a plurality of hostile perturbations.
  • the filter generation unit 296 can calculate the hostile sample determination filter by a relatively simple process such as averaging the pixel values for each pixel of the hostile perturbation. According to the filter generation device 200, the processing load of the filter generation unit 296 can be lightened in this respect.
  • the filter application unit 392 acquires a second image to which a hostile sample determination filter is applied to the first image, which is an image used for face recognition.
  • The score calculation unit 393 calculates a hostile sample score that quantitatively indicates the degree of difference between the feature vector of the first image and the feature vector of the second image. According to the estimation device 300, it is therefore possible to estimate the possibility that the face image to be authenticated is a hostile sample. Specifically, the hostile sample score indicates the magnitude of the possibility that the face image to be authenticated is a hostile sample.
  • the score calculation unit 393 can compare the hostile sample score with the determination threshold value to determine whether or not the face image to be authenticated is a hostile sample.
  • FIG. 7 is a diagram showing an example of the configuration of the filter generation device according to the embodiment.
  • the filter generation device 500 shown in FIG. 7 includes a hostile perturbation acquisition unit 501 and a filter generation unit 502.
  • the hostile perturbation acquisition unit 501 acquires a plurality of hostile perturbations due to the difference between the face image and the hostile sample based on the face image.
  • the filter generation unit 502 generates a filter based on a plurality of hostile perturbations.
  • the hostile perturbation acquisition unit 501 corresponds to an example of a hostile perturbation acquisition means.
  • the filter generation unit 502 corresponds to an example of a filter generation means. According to the filter generation device 500, it is possible to estimate the possibility that the face image to be authenticated is a hostile sample by using this filter.
  • FIG. 8 is a diagram showing an example of a processing procedure in the filter generation method according to the embodiment.
  • the filter generation method shown in FIG. 8 includes a hostile perturbation acquisition step (step S501) and a filter generation step (step S502).
  • a hostile perturbation acquisition step (step S501) a plurality of hostile perturbations due to the difference between the face image and the hostile sample based on the face image are acquired.
  • a filter generation step (step S502) a filter is generated based on a plurality of hostile perturbations. According to the filter generation method shown in FIG. 8, it is possible to estimate the possibility that the face image to be authenticated is a hostile sample by using this filter.
  • FIG. 9 is a schematic block diagram showing an example of a computer configuration according to at least one embodiment.
  • the computer 700 includes a CPU 710, a main storage device 720, an auxiliary storage device 730, and an interface 740. Any one or more of the face recognition device 100, the filter generation device 200, the estimation device 300, and the filter generation device 500 may be mounted on the computer 700. In that case, the operation of each of the above-mentioned processing units is stored in the auxiliary storage device 730 in the form of a program.
  • the CPU 710 reads the program from the auxiliary storage device 730, expands it to the main storage device 720, and executes the above processing according to the program.
  • the CPU 710 secures a storage area corresponding to each of the above-mentioned storage units in the main storage device 720 according to the program. Communication between each device and other devices is executed by having the interface 740 have a communication function and performing communication according to the control of the CPU 710.
  • the operations of the first control unit 190 and each unit thereof are stored in the auxiliary storage device 730 in the form of a program.
  • the CPU 710 reads the program from the auxiliary storage device 730, expands it to the main storage device 720, and executes the above processing according to the program. Further, the CPU 710 secures the first storage unit 180 and the storage area corresponding to each unit in the main storage device 720 according to the program.
  • the communication performed by the first communication unit 110 is executed by the interface 740 including a communication device and performing communication according to the control of the CPU 710.
  • the process of acquiring the face image for authentication by the authentication image acquisition unit 120 is executed by, for example, the interface 740 including an image acquisition device such as a camera or a scanner and operating according to the control of the CPU 710.
  • the processing performed by the authentication result processing unit 130 is executed by, for example, the interface 740 including a device such as a display device or a communication device according to the processing mode of the authentication result processing unit 130 and operating according to the control of the CPU 710.
  • the operations of the second control unit 290 and each unit thereof are stored in the auxiliary storage device 730 in the form of a program.
  • the CPU 710 reads the program from the auxiliary storage device 730, expands it to the main storage device 720, and executes the above processing according to the program. Further, the CPU 710 secures a storage area corresponding to the second storage unit 280 in the main storage device 720 according to the program.
  • the communication performed by the second communication unit 210 is executed by the interface 740 including a communication device and performing communication according to the control of the CPU 710.
  • the operations of the third control unit 390 and each unit thereof are stored in the auxiliary storage device 730 in the form of a program.
  • the CPU 710 reads the program from the auxiliary storage device 730, expands it to the main storage device 720, and executes the above processing according to the program. Further, the CPU 710 secures a storage area corresponding to the third storage unit 380 in the main storage device 720 according to the program.
  • the communication performed by the third communication unit 310 is executed when the interface 740 includes a communication device and performs communication according to the control of the CPU 710.
  • the processing performed by the determination result processing unit 320 is executed by, for example, the interface 740 including a device such as a display device or a communication device according to the processing mode of the determination result processing unit 320 and operating according to the control of the CPU 710.
  • the operations of the hostile perturbation acquisition unit 501 and the filter generation unit 502 are stored in the auxiliary storage device 730 in the form of a program.
  • the CPU 710 reads the program from the auxiliary storage device 730, expands it to the main storage device 720, and executes the above processing according to the program. Further, the CPU 710 secures a storage area required for processing of the filter generation device 500 in the main storage device 720 according to the program.
  • the input / output process performed by the filter generation device 500 is executed when the interface 740 is provided with a device such as a communication device according to the mode of processing and operates according to the control of the CPU 710.
  • A program for realizing all or part of the functions of the face recognition device 100, the filter generation device 200, the estimation device 300, and the filter generation device 500 may be recorded on a computer-readable recording medium, and the processing of each unit may be performed by loading the program recorded on this recording medium into a computer system and executing it.
  • the term "computer system” as used herein includes hardware such as an OS (operating system) and peripheral devices.
  • "Computer readable recording medium” includes flexible disks, optomagnetic disks, portable media such as ROM (Read Only Memory) and CD-ROM (Compact Disc Read Only Memory), and hard disks built into computer systems.
  • The above-mentioned program may be a program for realizing a part of the above-mentioned functions, or may be a program that realizes the above-mentioned functions in combination with a program already recorded in the computer system.
  • the embodiment of the present invention may be applied to a filter generation device, an estimation device, a face recognition system, a filter generation method, and a recording medium.
  • 1 Face recognition system, 100 Face recognition device, 110 First communication unit, 120 Authentication image acquisition unit, 130 Authentication result processing unit, 180 First storage unit, 181 Authentication database unit, 190 First control unit, 191 Feature vector calculation unit, 192 Classification unit, 200, 500 Filter generation device, 210 Second communication unit, 280 Second storage unit, 290 Second control unit, 291, 501 Hostile perturbation acquisition unit, 292 Filter generation image acquisition unit, 293 Noise addition unit, 294 Hostile sample selection unit, 295 Acquisition processing unit, 296, 502 Filter generation unit, 300 Estimation device, 310 Third communication unit, 320 Determination result processing unit, 380 Third storage unit, 390 Third control unit, 391 First image acquisition unit, 392 Filter application unit, 393 Score calculation unit, 394 Determination unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Collating Specific Patterns (AREA)

Abstract

A filter generation device comprises: an adversarial-perturbation acquisition means that acquires a plurality of adversarial perturbations due to the difference between a face image and an adversarial sample based on the face image; and a filter generation means that generates a filter on the basis of the plurality of adversarial perturbations.

Description

Filter generation device, estimation device, face recognition system, filter generation method, and recording medium

 本発明は、フィルタ生成装置、推定装置、顔認証システム、フィルタ生成方法および記録媒体に関する。 The present invention relates to a filter generation device, an estimation device, a face recognition system, a filter generation method, and a recording medium.

 顔認証におけるなりすましの防止に関連して、特許文献1には、顔認識タスクと生体検知タスクとを含むマルチタスクを実行する顔認識システムが開示されている。 In relation to the prevention of spoofing in face recognition, Patent Document 1 discloses a face recognition system that executes multitasking including a face recognition task and a biometric detection task.

日本国特表2018-508801号公報Japan Special Table 2018-508801 Gazette

 人工知能を用いて顔認証を行う場合、認証対象の顔画像が敵対的サンプルである可能性を推定できることが好ましい。 When performing face recognition using artificial intelligence, it is preferable to be able to estimate the possibility that the face image to be authenticated is a hostile sample.

 本発明の目的の一例は、上記の問題を解決することができるフィルタ生成装置、推定装置、顔認証システム、フィルタ生成方法および記録媒体を提供することである。 An example of an object of the present invention is to provide a filter generation device, an estimation device, a face recognition system, a filter generation method, and a recording medium capable of solving the above problems.

 本発明の第1の態様によれば、フィルタ生成装置は、顔画像と、前記顔画像に基づく敵対的サンプルとの差による敵対的摂動を複数取得する敵対的摂動取得手段と、複数の前記敵対的摂動に基づいてフィルタを生成するフィルタ生成手段と、を備える。 According to the first aspect of the present invention, the filter generation device includes a hostile perturbation acquisition means for acquiring a plurality of hostile perturbations due to a difference between a face image and a hostile sample based on the face image, and a plurality of the hostiles. A filter generation means for generating a filter based on a target perturbation is provided.

 本発明の第2の態様によれば、フィルタ生成方法は、顔画像と、前記顔画像に基づく敵対的サンプルとの差による敵対的摂動を複数取得する工程と、複数の前記敵対的摂動に基づいてフィルタを生成する工程と、を含む。 According to the second aspect of the present invention, the filter generation method is based on a step of acquiring a plurality of hostile perturbations due to a difference between a face image and a hostile sample based on the face image, and a plurality of the hostile perturbations. The process of generating a filter and the like.

 本発明の第3の態様によれば、記録媒体は、コンピュータに、顔画像と、前記顔画像に基づく敵対的サンプルとの差による敵対的摂動を複数取得する工程と、複数の前記敵対的摂動に基づいてフィルタを生成する工程と、を実行させるためのプログラムを記録する記録媒体である。 According to a third aspect of the present invention, the recording medium includes a step of acquiring a plurality of hostile perturbations due to a difference between a face image and a hostile sample based on the face image, and a plurality of the hostile perturbations. It is a recording medium for recording a process for generating a filter based on the above and a program for executing the filter.

 上記した態様によれば、認証対象の顔画像が敵対的サンプルである可能性を推定できる。 According to the above aspect, it is possible to estimate the possibility that the face image to be authenticated is a hostile sample.

実施形態に係る顔認証システムの装置構成の例を示す概略構成図である。It is a schematic block diagram which shows the example of the apparatus configuration of the face recognition system which concerns on embodiment. 実施形態に係る顔認証装置の機能構成の例を示す概略ブロック図である。It is a schematic block diagram which shows the example of the functional structure of the face recognition apparatus which concerns on embodiment. 実施形態に係るフィルタ生成装置の機能構成の例を示す概略ブロック図である。It is a schematic block diagram which shows the example of the functional structure of the filter generation apparatus which concerns on embodiment. 実施形態に係る推定装置の機能構成の例を示す概略ブロック図である。It is a schematic block diagram which shows the example of the functional structure of the estimation apparatus which concerns on embodiment. 実施形態に係るフィルタ生成装置が、敵対的サンプル判定用フィルタを生成する処理手順の例を示すフローチャートである。It is a flowchart which shows the example of the processing procedure which the filter generation apparatus which concerns on embodiment generates a filter for hostile sample determination. 実施形態に係る推定装置が、認証対象の顔画像が敵対的サンプルか否かを判定する処理手順の例を示すフローチャートである。FIG. 5 is a flowchart showing an example of a processing procedure in which the estimation device according to the embodiment determines whether or not the face image to be authenticated is a hostile sample. 実施形態に係るフィルタ生成装置の構成の例を示す図である。It is a figure which shows the example of the structure of the filter generation apparatus which concerns on embodiment. 実施形態に係るフィルタ生成方法における処理の手順の例を示す図である。It is a figure which shows the example of the processing procedure in the filter generation method which concerns on embodiment. 少なくとも1つの実施形態に係るコンピュータの構成の例を示す概略ブロック図である。It is a schematic block diagram which shows the example of the structure of the computer which concerns on at least one Embodiment.

Hereinafter, embodiments of the present invention will be described, but the following embodiments do not limit the claimed invention. Moreover, not all combinations of the features described in the embodiments are essential to the means for solving the problem of the invention.

FIG. 1 is a schematic configuration diagram showing an example of the device configuration of the face recognition system according to the embodiment. In the configuration shown in FIG. 1, the face recognition system 1 includes a face recognition device 100, a filter generation device 200, and an estimation device 300.

The face recognition system 1 performs face recognition using an artificial intelligence model. Specifically, the face recognition system 1 inputs the face image of the person to be authenticated into the artificial intelligence model to obtain a feature vector of the face image, and authenticates the person using the obtained feature vector. The artificial intelligence model referred to here is a model trained by machine learning. The face recognition system 1 also estimates, for the face image of the person to be authenticated, the possibility that the image is a hostile sample (adversarial example).

A hostile sample referred to here is a processed image obtained by modifying an original image such that a human classifier assigns it to the same class as the original image, whereas classification by artificial intelligence assigns it to a class different from that of the original image. Processing an original image to generate such an image is also referred to as a hostile sample.

The processing applied to create a hostile sample is referred to as adding "noise", because in classification by a human it does not cause a significant change such as a change of class.
As examples of hostile samples, examples that intentionally deceive neural network models based on deep learning are particularly well known. However, the artificial intelligence model used by the face recognition system 1 is not limited to deep learning, and may be any of various artificial intelligence models in which classification errors due to hostile samples can occur.

If a hostile sample is used in face recognition by the face recognition system 1, the face recognition system 1 may authenticate erroneously. It is therefore expected that, by estimating the possibility of a hostile sample, the face recognition system 1 can prevent or reduce the damage caused by erroneous authentication when a hostile sample is used.

For example, when the face recognition system 1 notifies a person such as an administrator of the result of face recognition, the face recognition system 1 may report the possibility of a hostile sample together with the authentication result. In this case, the face recognition system 1 may determine whether or not the image is a hostile sample and report the determination result. Alternatively, the face recognition system 1 may calculate a score (index value) indicating the magnitude of the possibility that the image is a hostile sample and report the calculated score.

Alternatively, when the face recognition system 1 performs login authentication for a system such as a computer system, it may determine whether or not the image is a hostile sample in addition to performing face recognition. When the face recognition system 1 determines that the image is a hostile sample, it may refuse the login regardless of the result of the face recognition.

Also when the face recognition system 1 includes a gate opening/closing mechanism and grants admission by opening the gate, it may determine whether or not the image is a hostile sample in addition to performing face recognition. When the face recognition system 1 determines that the image is a hostile sample, it may deny admission by keeping the gate closed, regardless of the result of the face recognition.

An example of a hostile sample in face recognition is one that uses eyeglasses. When a person wears eyeglasses carrying hostile noise, the image captured by the camera may become a hostile sample.
In this case, by estimating the possibility that the captured image is a hostile sample, the face recognition system 1 can prevent or reduce the damage caused by erroneous authentication as described above.

It is also conceivable that a registered image for face recognition is a hostile sample. For example, in an operation in which the person submits a face photograph for registration, a person who wants to manipulate the authentication result improperly may submit a photograph that is a hostile sample. Alternatively, in an operation in which a face photograph for registration is taken, a person who wants to manipulate the authentication result improperly may be photographed while wearing eyeglasses carrying hostile noise.

In this case, the face recognition system 1 may estimate the possibility that a registered image is a hostile sample. For example, at the time of image registration, the face recognition system 1 may determine whether or not the image to be registered is a hostile sample and, when it determines that the image is a hostile sample, cancel the registration and output a warning message. Alternatively, the face recognition system 1 may determine whether or not a registered image is a hostile sample when face recognition is executed, or periodically, for example by background processing.

The face recognition device 100 performs the face recognition described above for the face recognition system 1.
The face recognition device 100 may be an existing device as long as it can output the result of the classification in face recognition and the feature vector of the face image calculated in the course of face recognition. That is, the face recognition system 1 may be configured using an existing face recognition device as the face recognition device 100.

The result of the classification in face recognition referred to here is the determination of which of one or more pre-registered persons the person to be authenticated is identified as, or the determination that the person differs from all of them.
The feature vector of an image referred to here is a vector indicating the feature quantities of that image. In the face recognition device 100, a machine learning model receives a face image as input and outputs the feature vector of that face image.

The estimation device 300 estimates the possibility of a hostile sample, as described above for the face recognition system 1.
The filter generation device 200 generates a filter (image filter) for the estimation device 300 to use in estimating the possibility of a hostile sample. The filter generated by the filter generation device 200 is also referred to as a hostile sample determination filter.

Each of the face recognition device 100, the filter generation device 200, and the estimation device 300 may be configured using a computer such as a personal computer (PC) or a workstation (WS). Alternatively, one or more of the face recognition device 100, the filter generation device 200, and the estimation device 300 may be configured using dedicated hardware.

For example, when the face recognition device 100 uses a neural network model as the artificial intelligence model for face recognition, the computer constituting the face recognition device 100 may execute the functions of the neural network in software. Alternatively, the face recognition device 100 may include a neural network implemented in hardware.

The face recognition device 100, the filter generation device 200, and the estimation device 300 may be configured as separate devices. Alternatively, any two or more of the face recognition device 100, the filter generation device 200, and the estimation device 300 may be configured as an integrated device.
For example, the face recognition device 100 and the estimation device 300 may be configured as an integrated device using a single computer. This device may then, as described above for the face recognition system 1, perform processing based on the result of estimating the possibility of a hostile sample in addition to the result of face recognition.

FIG. 2 is a schematic block diagram showing an example of the functional configuration of the face recognition device 100. In the configuration shown in FIG. 2, the face recognition device 100 includes a first communication unit 110, an authentication image acquisition unit 120, an authentication result processing unit 130, a first storage unit 180, and a first control unit 190. The first storage unit 180 includes an authentication database unit 181. The first control unit 190 includes a feature vector calculation unit 191 and a classification unit 192.

The first communication unit 110 communicates with other devices. In particular, the first communication unit 110 receives a face image that the filter generation device 200 generates as a hostile sample candidate, together with the original face image. The first communication unit 110 then transmits, for each of the hostile sample candidate and the original face image, the result of the classification in face recognition to the filter generation device 200.
The result of the classification in face recognition is used when the filter generation device 200 generates the hostile sample determination filter, to determine whether a hostile sample candidate is a hostile sample and thereby to collect hostile samples.

The first communication unit 110 also transmits the face image to be authenticated to the estimation device 300, and receives the face image that the estimation device 300 obtains by applying the hostile sample determination filter to the face image to be authenticated. The face image to be authenticated is also referred to as a first image. The face image obtained by applying the hostile sample determination filter to the first image is also referred to as a second image.
The first communication unit 110 then transmits the feature vector of the first image and the feature vector of the second image to the estimation device 300. The feature vector of the first image is also referred to as a first feature vector. The feature vector of the second image is also referred to as a second feature vector.
The first feature vector and the second feature vector transmitted by the first communication unit 110 to the estimation device 300 are used by the estimation device 300 to estimate the possibility that the face image to be authenticated is a hostile sample.

The authentication image acquisition unit 120 acquires the face image to be authenticated. The authentication image acquisition unit 120 may include a camera and capture the face image of the person to be authenticated. Alternatively, the authentication image acquisition unit 120 may include a camera or a scanner and read a face photograph of the person to be authenticated, such as the face photograph in a passport or a driver's license. Alternatively, the first communication unit 110 may be used as the authentication image acquisition unit 120 to receive a face image from another device, for example a face image transmitted by a client device on the Internet.

The authentication result processing unit 130 performs processing according to the result of face recognition by the face recognition device 100. The processing performed by the authentication result processing unit 130 can vary depending on how the face recognition system 1 is used.
For example, when the face recognition system 1 performs login authentication for a certain system and the face recognition succeeds, the authentication result processing unit 130 may perform login processing for the user with whom the person to be authenticated has been identified by the face recognition. Succeeding in face recognition in this case means that the person to be authenticated is identified as one of the users registered in the authentication database.

On the other hand, when the face recognition system 1 fails in face recognition, the authentication result processing unit 130 may control the display device of that system to display a message indicating that authentication has failed. Failing authentication in this case means that the person to be authenticated is determined to be different from every user registered in the authentication database.

Alternatively, when the face recognition system 1 includes a gate opening/closing mechanism and grants admission by opening the gate, the authentication result processing unit 130 may control the opening and closing of the gate according to the result of face recognition. In this case, when the face recognition system 1 succeeds in face recognition, the authentication result processing unit 130 may open the gate. Succeeding in face recognition in this case means that the person to be authenticated is identified as one of the persons permitted to enter who are registered in the authentication database.
On the other hand, when the face recognition system 1 fails in face recognition, the authentication result processing unit 130 may keep the gate closed. Failing authentication in this case means that the person to be authenticated is determined to be different from every person permitted to enter who is registered in the authentication database.

The first storage unit 180 stores various data. The functions of the first storage unit 180 are executed using a storage device included in the face recognition device 100.
The authentication database unit 181 stores the authentication database. The authentication database referred to here holds the data against which the data of the person to be authenticated is compared in face recognition. The authentication database contains data on the candidates with whom the person to be authenticated may be identified by face recognition. The data for each candidate includes the feature vector of that candidate's face image. In the classification in face recognition, each candidate, as well as "no matching candidate", corresponds to an example of a class. Determining which candidate the person to be authenticated is, or determining that the person differs from every candidate, corresponds to an example of the classification.

For example, when the face recognition device 100 performs login authentication by face recognition, the authentication database contains data for each user. The data for each user in this case includes the feature vector of that user's face image, the user's user name, and the setting information used when that user logs in.
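As a purely illustrative sketch of this data layout (not part of the disclosed configuration), the following Python fragment shows one possible in-memory representation of a per-user entry; the field names user_name, feature_vector, and login_settings, the 512-dimensional vector size, and the use of a plain list for the database are assumptions made for the example.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class CandidateRecord:
    """One entry of the authentication database (illustrative layout only)."""
    user_name: str                      # name of the registered user
    feature_vector: np.ndarray          # feature vector of the registered face image
    login_settings: dict = field(default_factory=dict)  # per-user login configuration

# The authentication database unit 181 could then be modeled as a simple list of records.
authentication_database = [
    CandidateRecord("alice", np.random.rand(512)),  # 512-dim vectors are an assumed size
    CandidateRecord("bob",   np.random.rand(512)),
]
```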

The first control unit 190 controls each unit of the face recognition device 100 to execute various processes. The functions of the first control unit 190 are executed, for example, by a CPU (Central Processing Unit) included in the face recognition device 100 reading a program from the first storage unit 180 and executing it.

The feature vector calculation unit 191 includes the artificial intelligence model described above and calculates the feature vector of a face image. Specifically, the feature vector calculation unit 191 calculates the feature vector of the face image to be authenticated (the first feature vector). The feature vector calculation unit 191 also calculates the feature vectors of the hostile sample candidates generated and transmitted by the filter generation device 200. The feature vector calculation unit 191 also calculates the feature vector of the second image (the second feature vector) generated and transmitted by the estimation device 300.

The classification unit 192 performs the classification in face recognition using the feature vector of the face image to be authenticated. For example, for each candidate shown in the authentication database, the classification unit 192 calculates a score that quantitatively indicates the similarity between the feature vector of that candidate's face image and the feature vector of the face image to be authenticated. The score calculated by the classification unit 192 is also referred to as an authentication score.

In the following, the case where a larger score calculated by the classification unit 192 indicates a greater degree of similarity between the feature vectors is described as an example. However, the classification unit 192 may instead calculate an authentication score in which a smaller score indicates a greater degree of similarity between the feature vectors.

When all the calculated authentication scores are less than a predetermined authentication threshold, the classification unit 192 determines that the person to be authenticated differs from every candidate. On the other hand, when one or more of the calculated authentication scores are equal to or greater than the authentication threshold, the classification unit 192 determines that the person to be authenticated is the candidate for whom the largest authentication score was calculated.
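A minimal sketch of this classification step is given below. The document does not fix a particular similarity measure or threshold value, so the use of cosine similarity and the threshold 0.6 are assumptions; classify reuses the illustrative CandidateRecord structure shown earlier.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Assumed authentication score: larger value means more similar feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def classify(query_vector: np.ndarray, database, threshold: float = 0.6):
    """Return the best-matching candidate, or None when every authentication score
    is below the authentication threshold (i.e. the person matches no candidate)."""
    scores = [(rec, cosine_similarity(query_vector, rec.feature_vector)) for rec in database]
    best_record, best_score = max(scores, key=lambda pair: pair[1])
    return best_record if best_score >= threshold else None
```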

FIG. 3 is a schematic block diagram showing an example of the functional configuration of the filter generation device 200. In the configuration shown in FIG. 3, the filter generation device 200 includes a second communication unit 210, a second storage unit 280, and a second control unit 290. The second control unit 290 includes a hostile perturbation acquisition unit 291, a filter generation image acquisition unit 292, a noise addition unit 293, a hostile sample selection unit 294, an acquisition processing unit 295, and a filter generation unit 296.

The second communication unit 210 communicates with other devices. In particular, the second communication unit 210 transmits a face image that the filter generation device 200 generates as a hostile sample candidate, together with the original face image, to the face recognition device 100. The second communication unit 210 then receives, for each of the hostile sample candidate and the original face image, the result of the classification in face recognition from the face recognition device 100.

The second storage unit 280 stores various data. The functions of the second storage unit 280 are executed using a storage device included in the filter generation device 200.
The second control unit 290 controls each unit of the filter generation device 200 to execute various processes. The functions of the second control unit 290 are executed, for example, by a CPU included in the filter generation device 200 reading a program from the second storage unit 280 and executing it.

The hostile perturbation acquisition unit 291 acquires a plurality of hostile perturbations (adversarial perturbations), each being the difference between a face image and a hostile sample based on that face image. A hostile sample based on a face image referred to here is a face image, serving as a hostile sample, obtained by adding noise to that face image. A hostile perturbation referred to here is the difference obtained by subtracting the original face image from the hostile sample. A hostile perturbation is therefore an image corresponding to the noise added to the original face image to generate the hostile sample.
The hostile perturbation acquisition unit 291 corresponds to an example of a hostile perturbation acquisition means.

The method by which the hostile perturbation acquisition unit 291 acquires hostile perturbations is not limited to a specific method.
When the hostile perturbation acquisition unit 291 adds noise to a face image to generate a hostile sample, that noise may be used as the hostile perturbation.

Alternatively, when the hostile perturbation acquisition unit 291 acquires an existing hostile sample and its original face image, it may calculate, as the hostile perturbation, a difference image obtained by subtracting the value of the corresponding pixel of the original face image from the value of each pixel of the hostile sample. Corresponding pixels referred to here are pixels whose relative positions within the images are the same. Accordingly, when a plurality of images of equal size are superimposed with their edges aligned, the pixels that overlap correspond to each other. Images being of equal size means that both the number of vertical pixels and the number of horizontal pixels are the same across the images.
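The pixel-wise subtraction described above can be written compactly with NumPy. The following sketch is illustrative only; it assumes both images are arrays of identical shape (for example grayscale or RGB arrays with 8-bit pixel values), and the function name is not part of the disclosure.

```python
import numpy as np

def hostile_perturbation(hostile_sample: np.ndarray, original: np.ndarray) -> np.ndarray:
    """Difference image: hostile sample minus original, computed per corresponding pixel."""
    if hostile_sample.shape != original.shape:
        raise ValueError("images must have the same size (same height, width and channels)")
    # Use a signed type so that negative differences are preserved.
    return hostile_sample.astype(np.int16) - original.astype(np.int16)
```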

Alternatively, when the hostile perturbation acquisition unit 291 acquires an existing hostile sample, it may acquire the hostile perturbation using a known hostile perturbation calculation method.
Alternatively, the hostile perturbation acquisition unit 291 may acquire existing hostile perturbations.

The plurality of hostile perturbations acquired by the hostile perturbation acquisition unit 291 may all be calculated using the same face image.
Alternatively, the hostile perturbation acquisition unit 291 may acquire, for each of a plurality of face images, a hostile perturbation calculated using that face image. In that case, the plurality of face images may be face images of the same person, or may include face images of a plurality of persons.
In the following, the case where the hostile perturbation acquisition unit 291 adds noise to a face image to generate a hostile sample is described as an example.

The filter generation image acquisition unit 292 acquires a face image that serves as the source of a hostile sample. The face image acquired by the filter generation image acquisition unit 292 is also referred to as a filter generation image.
The method by which the filter generation image acquisition unit 292 acquires the filter generation image is not limited to a specific method. For example, the filter generation device 200 may include a camera, and the filter generation image acquisition unit 292 may acquire a face image captured by that camera as the filter generation image. Alternatively, the authentication image acquisition unit 120 of the face recognition device 100 may include a camera, and the filter generation image acquisition unit 292 may acquire a face image captured by that camera as the filter generation image. Alternatively, the filter generation image acquisition unit 292 may acquire an existing face image, for example by the second communication unit 210 receiving an existing face image from another device.

The noise addition unit 293 acquires hostile sample candidates by adding noise to the filter generation image. For example, the noise addition unit 293 may randomly generate a noise image in which the amount of increase or decrease in the luminance value of each pixel falls within a predetermined range, and add the generated noise to the filter generation image.
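One way to realize this step is sketched below under stated assumptions: 8-bit (uint8) pixel values, a per-pixel bound max_step on the luminance change, and a NumPy random generator. The bound of 8 and the function name are illustrative choices, not values fixed by the document.

```python
import numpy as np

def make_candidate(filter_gen_image: np.ndarray, max_step: int = 8, rng=None):
    """Generate one hostile sample candidate by adding bounded random noise."""
    rng = rng or np.random.default_rng()
    noise = rng.integers(-max_step, max_step + 1, size=filter_gen_image.shape, dtype=np.int16)
    candidate = np.clip(filter_gen_image.astype(np.int16) + noise, 0, 255).astype(np.uint8)
    # Keep the noise: if this candidate turns out to be a hostile sample,
    # the noise is exactly the hostile perturbation to be collected.
    return candidate, noise
```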

The hostile sample selection unit 294 selects hostile samples from among the hostile sample candidates generated by the noise addition unit 293. The method by which the hostile sample selection unit 294 selects hostile samples is not limited to a specific method.
For example, the face recognition device 100 may perform the classification in face recognition for each of the hostile sample candidate and the original face image (the filter generation image) and transmit the classification results to the filter generation device 200. The hostile sample selection unit 294 may then select, as hostile samples, those candidates that are classified into a class different from that of the filter generation image.
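The selection rule just described reduces to keeping candidates whose predicted class differs from that of the original image. The following fragment is a sketch of that rule; the (candidate_image, predicted_class) pair structure is an assumption about how the classification results are handed over, not a format defined by the document.

```python
def select_hostile_samples(original_class: str, candidates_with_classes):
    """Keep only candidates that the face recognition model assigns to a different class."""
    return [image for image, predicted in candidates_with_classes
            if predicted != original_class]
```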

Alternatively, a person such as a user of the face recognition system 1 may visually inspect the hostile sample candidates and determine whether each is a hostile sample. In this case, the filter generation device 200 may include a display device and an input device, display the hostile sample candidates on the display device, and accept the user operation entering the determination result through the input device. The hostile sample selection unit 294 may then select, from among the hostile sample candidates, the images that have been manually determined to be hostile samples.

The acquisition processing unit 295 acquires hostile perturbations. Specifically, the acquisition processing unit 295 acquires, as a hostile perturbation, the noise that the noise addition unit 293 added to the filter generation image in order to generate the hostile sample selected by the hostile sample selection unit 294.

The filter generation unit 296 generates the hostile sample determination filter based on a plurality of hostile perturbations. To that end, the hostile perturbation acquisition unit 291 may acquire a plurality of hostile perturbations. The filter generation unit 296 generates the hostile sample determination filter by combining the plurality of hostile perturbations, for example by calculating the average or a weighted sum of the plurality of hostile perturbations.
The filter generation unit 296 corresponds to an example of a filter generation means.

The average of a plurality of hostile perturbations is a single image obtained by calculating, for each corresponding pixel, the average of the pixel values over the plurality of hostile perturbations treated as images of equal size. The weighted sum of a plurality of hostile perturbations is a single image obtained by setting a weight coefficient for each hostile perturbation and calculating, for each corresponding pixel, the weighted sum over the plurality of hostile perturbations treated as images of equal size.

For example, the filter generation unit 296 may perform the weighted summation by multiplying each hostile perturbation by a weight coefficient that is larger the greater the degree of difference between the original image and the corresponding hostile sample. As the degree of difference between the original image and the hostile sample, for example, the distance between the feature vector of the original image and the feature vector of the hostile sample can be used. As this distance, for example, the L1 norm, the L2 norm, the L∞ norm, or the cosine distance can be used.
However, the weights used by the filter generation unit 296 for the weighted summation are not limited to specific ones.
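The averaging and weighted-sum combinations can be sketched as follows. The normalization of the weights so that they sum to one is a choice made here to keep the result on a comparable scale and is not prescribed by the document; the weights themselves could be, for example, the feature-vector distances mentioned above.

```python
import numpy as np

def generate_filter(perturbations, weights=None) -> np.ndarray:
    """Combine several hostile perturbations into one hostile sample determination filter.

    perturbations: list of equally sized difference images (signed arrays).
    weights: optional per-perturbation coefficients; None gives the plain per-pixel average.
    """
    stack = np.stack([p.astype(np.float64) for p in perturbations])
    if weights is None:
        return stack.mean(axis=0)              # per-pixel average
    w = np.asarray(weights, dtype=np.float64)
    w = w / w.sum()                            # normalize (illustrative choice)
    return np.tensordot(w, stack, axes=1)      # per-pixel weighted sum
```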

FIG. 4 is a schematic block diagram showing an example of the functional configuration of the estimation device 300. In the configuration shown in FIG. 4, the estimation device 300 includes a third communication unit 310, a determination result processing unit 320, a third storage unit 380, and a third control unit 390. The third control unit 390 includes a first image acquisition unit 391, a filter application unit 392, a score calculation unit 393, and a determination unit 394.

The third communication unit 310 communicates with other devices. In particular, the third communication unit 310 receives the face image to be authenticated (the first image) from the face recognition device 100, and transmits to the face recognition device 100 the face image (the second image) obtained when the estimation device 300 applies the hostile sample determination filter to the face image to be authenticated. The third communication unit 310 then receives the feature vector of the first image and the feature vector of the second image from the face recognition device 100.

The determination result processing unit 320 performs processing according to the determination result when the estimation device 300 determines that the face image to be authenticated is a hostile sample. The processing performed by the determination result processing unit 320 can vary depending on how the face recognition system 1 is used. The processing performed by the determination result processing unit 320 is also referred to as hostile sample handling processing.

For example, when the face recognition system 1 notifies a person such as an administrator of the result of face recognition, the determination result processing unit 320 may notify that person of the determination result, for example by displaying the determination that the face image to be authenticated is a hostile sample. Alternatively, regardless of whether a determination is made, the determination result processing unit 320 may notify that person of the score, for example by displaying a score indicating the possibility that the face image to be authenticated is a hostile sample.

Alternatively, when the face recognition system 1 performs login authentication for a certain system, the third communication unit 310 may function as the determination result processing unit 320 and transmit to the face recognition device 100 a notification that the face image to be authenticated is a hostile sample. In the face recognition device 100 that has received this notification, the authentication result processing unit 130 may perform the processing for the case where face recognition fails, regardless of the result of the face recognition.

Alternatively, also when the face recognition system 1 includes a gate opening/closing mechanism and grants admission by opening the gate, the third communication unit 310 may function as the determination result processing unit 320 and transmit to the face recognition device 100 a notification that the face image to be authenticated is a hostile sample. In the face recognition device 100 that has received this notification, the authentication result processing unit 130 may perform the processing for the case where face recognition fails, regardless of the result of the face recognition.

The third storage unit 380 stores various data. The functions of the third storage unit 380 are executed using a storage device included in the estimation device 300.
The third control unit 390 controls each unit of the estimation device 300 to execute various processes. The functions of the third control unit 390 are executed, for example, by a CPU included in the estimation device 300 reading a program from the third storage unit 380 and executing it.

The first image acquisition unit 391 acquires the first image. Specifically, the face recognition device 100 transmits the face image to be authenticated (the first image), which it acquires when performing face recognition, to the estimation device 300. In the estimation device 300, the first image acquisition unit 391 extracts the first image from the signal that the third communication unit 310 receives from the face recognition device 100.

The filter application unit 392 acquires the second image by applying the hostile sample determination filter to the first image acquired by the first image acquisition unit 391. For example, as the application of the hostile sample determination filter to the first image, the filter application unit 392 adds, to the value of each pixel of the first image, the value of the corresponding pixel of the hostile sample determination filter.
The filter application unit 392 corresponds to an example of a filter application means.
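A short sketch of this per-pixel addition follows. Clipping back to the 0-255 range and uint8 pixel values are assumptions of the sketch; the document only specifies the pixel-wise addition itself.

```python
import numpy as np

def apply_filter(first_image: np.ndarray, determination_filter: np.ndarray) -> np.ndarray:
    """Second image = first image plus the hostile sample determination filter, per pixel."""
    second = first_image.astype(np.float64) + determination_filter
    return np.clip(second, 0, 255).astype(np.uint8)
```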

The score calculation unit 393 calculates a score that quantitatively indicates the degree of difference between the first feature vector (the feature vector of the first image) and the second feature vector (the feature vector of the second image). The score calculated by the score calculation unit 393 is also referred to as a hostile sample score. The method by which the score calculation unit 393 calculates the hostile sample score and the method by which the classification unit 192 calculates the authentication score may be the same method or different methods.
The score calculation unit 393 corresponds to an example of a score calculation means.

The method by which the score calculation unit 393 calculates the hostile sample score is not limited to a specific method. The score calculation unit 393 may calculate a hostile sample score in which a larger score indicates a greater degree of difference between the first feature vector and the second feature vector. Alternatively, the score calculation unit 393 may calculate a hostile sample score in which a smaller score indicates a greater degree of difference between the first feature vector and the second feature vector.

For example, the score calculation unit 393 may calculate, as the hostile sample score, any of the L1 norm, the L2 norm, the L∞ norm, or the cosine distance between the first feature vector and the second feature vector. The cosine distance is 1 minus the cosine similarity. These hostile sample scores correspond to examples in which a larger score indicates a greater degree of difference between the first feature vector and the second feature vector.

Alternatively, the score calculation unit 393 may calculate, as the hostile sample score, either the cosine similarity or the inner product of the first feature vector and the second feature vector. These hostile sample scores correspond to examples in which a smaller score indicates a greater degree of difference between the first feature vector and the second feature vector.
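The score variants listed above can be gathered in one helper, sketched below; the keyword names ("l1", "l2", "linf", "cos_dist", "cos_sim", "dot") are illustrative labels introduced only for this example.

```python
import numpy as np

def hostile_sample_score(v1: np.ndarray, v2: np.ndarray, kind: str = "l2") -> float:
    """Hostile sample score between the first and second feature vectors.

    "l1", "l2", "linf" and "cos_dist" grow as the vectors differ more;
    "cos_sim" and "dot" shrink as the vectors differ more.
    """
    diff = v1 - v2
    if kind == "l1":
        return float(np.linalg.norm(diff, ord=1))
    if kind == "l2":
        return float(np.linalg.norm(diff, ord=2))
    if kind == "linf":
        return float(np.linalg.norm(diff, ord=np.inf))
    cos_sim = float(np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2)))
    if kind == "cos_dist":
        return 1.0 - cos_sim          # cosine distance = 1 - cosine similarity
    if kind == "cos_sim":
        return cos_sim
    if kind == "dot":
        return float(np.dot(v1, v2))  # inner product
    raise ValueError(f"unknown score kind: {kind}")
```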

The hostile sample score calculated by the score calculation unit 393 is used as a score indicating the magnitude of the possibility that the face image to be authenticated is a hostile sample. The greater the degree of difference between the first feature vector and the second feature vector indicated by the hostile sample score, the higher the estimated possibility that the face image to be authenticated is a hostile sample.

When a larger hostile sample score indicates a greater degree of difference between the first feature vector and the second feature vector, a larger hostile sample score indicates a higher possibility that the face image to be authenticated is a hostile sample. Conversely, when a smaller hostile sample score indicates a greater degree of difference between the first feature vector and the second feature vector, a smaller hostile sample score indicates a higher possibility that the face image to be authenticated is a hostile sample.

Using the hostile sample score as a score indicating the magnitude of the possibility that the face image to be authenticated is a hostile sample is based on the observation that the greater the degree of difference between the first image and the second image obtained by applying the hostile sample determination filter to the first image, the higher the possibility that the first image is a hostile sample. Experiments have yielded results supporting this observation.

The determination unit 394 determines whether or not the face image to be authenticated is a hostile sample, based on the hostile sample score calculated by the score calculation unit 393. For example, the determination unit 394 compares the hostile sample score with a predetermined determination threshold.
When a larger hostile sample score indicates a higher possibility that the face image to be authenticated is a hostile sample, the determination unit 394 determines that the face image to be authenticated is a hostile sample when the hostile sample score is equal to or greater than the determination threshold. In this case, when the hostile sample score is less than the determination threshold, the determination unit 394 determines that the face image to be authenticated is not a hostile sample.

When a smaller hostile sample score indicates a higher possibility that the face image to be authenticated is a hostile sample, the determination unit 394 determines that the face image to be authenticated is a hostile sample when the hostile sample score is equal to or less than the determination threshold. In this case, when the hostile sample score is greater than the determination threshold, the determination unit 394 determines that the face image to be authenticated is not a hostile sample.
The determination result processing unit 320 described above performs processing according to the determination result of the determination unit 394.
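The two comparison directions can be captured in a single small helper, sketched below; the flag name and the idea of passing the orientation explicitly are assumptions made for the example, and the threshold value itself is left to the operator.

```python
def is_hostile_sample(score: float, threshold: float, larger_means_hostile: bool = True) -> bool:
    """Threshold comparison of the determination unit 394 (threshold value is an assumption)."""
    if larger_means_hostile:        # e.g. L1/L2/L-inf norm or cosine-distance scores
        return score >= threshold
    return score <= threshold       # e.g. cosine-similarity or inner-product scores
```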

However, the estimation device 300 may output, instead of the determination result of whether or not the face image to be authenticated is a hostile sample, a score indicating the magnitude of the possibility that the image is a hostile sample. In this case, the estimation device 300 need not include the determination unit 394.

The estimation device 300 may output the hostile sample score as it is, as the score indicating the magnitude of the possibility that the face image to be authenticated is a hostile sample. Alternatively, the estimation device 300 may process the hostile sample score before outputting it. For example, the estimation device 300 may convert the hostile sample score into a score expressing, as a percentage, the magnitude of the possibility that the face image to be authenticated is a hostile sample, and output the converted score.
As described above for the face recognition system 1, the estimation device 300 may also, in addition to or instead of the face image to be authenticated, estimate the possibility that a registered image or an image to be registered is a hostile sample, and perform processing according to the estimation result.

 次に、図5および図6を参照して、顔認証システム1の動作について説明する。
 図5は、フィルタ生成装置200が、敵対的サンプル判定用フィルタを生成する処理手順の例を示すフローチャートである。フィルタ生成装置200は、例えば、敵対的サンプル判定用フィルタの生成をユーザ操作によって指示されることを契機として、図5の処理を行う。
Next, the operation of the face recognition system 1 will be described with reference to FIGS. 5 and 6.
FIG. 5 is a flowchart showing an example of a processing procedure in which the filter generation device 200 generates a filter for hostile sample determination. The filter generation device 200 performs the processing of FIG. 5, for example, when the generation of the hostile sample determination filter is instructed by a user operation.

 図5の処理で、フィルタ生成用画像取得部292は、フィルタ生成用画像を取得する(ステップS101)。ノイズ付加部293は、フィルタ生成用画像取得部292が取得したフィルタ生成用画像にノイズを加えて、敵対的サンプルの候補を生成する(ステップS102)。
 次に、敵対的サンプル選択部294は、フィルタ生成用画像、敵対的サンプルの候補のそれぞれについて、顔認証装置100の顔認証におけるクラス分類の結果を取得する(ステップS103)。
In the process of FIG. 5, the filter generation image acquisition unit 292 acquires the filter generation image (step S101). The noise addition unit 293 adds noise to the filter generation image acquired by the filter generation image acquisition unit 292 to generate hostile sample candidates (step S102).
Next, the hostile sample selection unit 294 acquires the result of the classification in the face recognition of the face recognition device 100 for each of the filter generation image and the hostile sample candidate (step S103).

 例えば、第二通信部210が、フィルタ生成用画像、および、敵対的サンプルの候補を、顔認証装置100へ送信する。顔認証装置100は、フィルタ生成用画像、敵対的サンプルの候補のそれぞれをクラス分類し、クラス分類の結果をフィルタ生成装置200へ送信する。フィルタ生成装置200では、敵対的サンプル選択部294が、第二通信部210がフィルタ生成装置200から受信する信号から、フィルタ生成用画像のクラス分類結果、および、敵対的サンプルの候補のクラス分類結果を抽出する。 For example, the second communication unit 210 transmits a filter generation image and a candidate for a hostile sample to the face recognition device 100. The face recognition device 100 classifies each of the filter generation image and the hostile sample candidate into classes, and transmits the classification result to the filter generation device 200. In the filter generation device 200, the hostile sample selection unit 294 uses the signal received by the second communication unit 210 from the filter generation device 200 to classify the filter generation image and the hostile sample candidate. Is extracted.

Then, the hostile sample selection unit 294 determines whether or not the hostile sample candidate is a hostile sample (step S104). Specifically, when the filter generation image and the hostile sample candidate are classified into different classes, the hostile sample selection unit 294 determines that the hostile sample candidate is a hostile sample. On the other hand, when the filter generation image and the hostile sample candidate are classified into the same class, the hostile sample selection unit 294 determines that the hostile sample candidate is not a hostile sample.

When the hostile sample selection unit 294 determines that the hostile sample candidate is a hostile sample (step S104: YES), the acquisition processing unit 295 acquires a hostile perturbation and stores it in the second storage unit 280 (step S111). For example, the acquisition processing unit 295 acquires, as the hostile perturbation, the noise that the noise addition unit 293 added to the filter generation image in step S102.

Then, the acquisition processing unit 295 determines whether or not the number of acquired hostile perturbations has reached a predetermined number (step S112).
When the hostile sample selection unit 294 determines in step S104 that the hostile sample candidate is not a hostile sample (step S104: NO), the processing proceeds to step S112.

When the acquisition processing unit 295 determines in step S112 that the number of acquired hostile perturbations has not reached the predetermined number (step S112: NO), the processing returns to step S102.
On the other hand, when the acquisition processing unit 295 determines in step S112 that the number of acquired hostile perturbations has reached the predetermined number (step S112: YES), the filter generation unit 296 generates the hostile sample determination filter using the obtained hostile perturbations (step S121).
After step S121, the filter generation device 200 ends the processing of FIG. 5.
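Purely as an illustrative sketch of the loop of steps S102 to S112, the procedure could be written as follows. The classify function, the use of uniform random noise, and the pixel value range of 0.0 to 1.0 are assumptions introduced here for illustration and are not fixed by the present embodiment; classify stands in for the class classification that the face recognition device 100 performs in steps S103 and S104.

    import numpy as np

    def collect_hostile_perturbations(filter_generation_image, classify,
                                      required_count, noise_scale=0.05, seed=None):
        # Returns a list of hostile perturbations (steps S102 to S112).
        rng = np.random.default_rng(seed)
        original_class = classify(filter_generation_image)
        perturbations = []
        while len(perturbations) < required_count:  # step S112
            # Step S102: add noise to the filter generation image to obtain a hostile sample candidate.
            noise = rng.uniform(-noise_scale, noise_scale, size=filter_generation_image.shape)
            candidate = np.clip(filter_generation_image + noise, 0.0, 1.0)
            # Steps S103 and S104: the candidate is a hostile sample if its class differs.
            if classify(candidate) != original_class:
                perturbations.append(noise)  # step S111: acquire the hostile perturbation
        return perturbations

The hostile sample determination filter of step S121 would then be generated from the returned perturbations, for example by the averaging described later for the filter generation unit 296.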

FIG. 6 is a flowchart showing an example of a processing procedure in which the estimation device 300 determines whether or not the face image to be authenticated is a hostile sample. For example, when the face recognition device 100 starts face recognition, it transmits a signal instructing the start of the processing of FIG. 6 to the estimation device 300. The estimation device 300 starts the processing of FIG. 6 in response to the instruction from the face recognition device 100.

In the processing of FIG. 6, the first image acquisition unit 391 acquires a first image (the face image to be authenticated) (step S211). Specifically, the face recognition device 100 transmits the first image to the estimation device 300, and the first image acquisition unit 391 extracts the first image from the signal that the third communication unit 310 receives from the face recognition device 100. The first image acquisition unit 391 may acquire the registered image or the image to be registered as the first image, in addition to or instead of the face image to be authenticated.
The filter application unit 392 applies the hostile sample determination filter to the first image acquired by the first image acquisition unit 391 to generate a second image (step S212).
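As one possible realization of step S212, the filter might simply be added to the first image pixel by pixel; the additive application and the clipping to a 0.0 to 1.0 pixel range are assumptions made here for illustration only.

    import numpy as np

    def apply_determination_filter(first_image, determination_filter):
        # Step S212: generate the second image by applying the hostile sample determination filter.
        return np.clip(first_image + determination_filter, 0.0, 1.0)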

Then, the score calculation unit 393 acquires a second feature vector (step S213). For example, the third communication unit 310 transmits the second image to the face recognition device 100. In the face recognition device 100, the feature vector calculation unit 191 calculates the feature vector of the second image (the second feature vector), and the first communication unit 110 transmits the obtained second feature vector to the estimation device 300. The score calculation unit 393 extracts the second feature vector from the signal that the third communication unit 310 receives from the face recognition device 100.

The score calculation unit 393 also acquires a first feature vector (step S221). For example, in the face recognition device 100, the feature vector calculation unit 191 calculates the feature vector of the face image to be authenticated (the first image), that is, the first feature vector, and the first communication unit 110 transmits the obtained first feature vector to the estimation device 300. The score calculation unit 393 extracts the first feature vector from the signal that the third communication unit 310 receives from the face recognition device 100.
The estimation device 300 may execute the processing of steps S211 to S213 and the processing of step S221 in parallel, or may execute them sequentially.

After steps S213 and S221, the score calculation unit 393 calculates a hostile sample score (step S231). As described above, the method by which the score calculation unit 393 calculates the hostile sample score is not limited to a specific method.
Then, the score calculation unit 393 determines whether or not the face image to be authenticated is a hostile sample (step S232). For example, as described above, the score calculation unit 393 compares the hostile sample score with a determination threshold value to determine whether or not the face image to be authenticated is a hostile sample.
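The score calculation of steps S231 and S232 could, for example, look like the following sketch. Cosine distance is used only as one example of a dissimilarity measure, since the embodiment does not limit the score calculation to a specific method, and the threshold value used in the usage example below is likewise an arbitrary assumption.

    import numpy as np

    def hostile_sample_score(first_feature_vector, second_feature_vector):
        # Step S231: quantify the difference between the two feature vectors
        # (cosine distance is only one possible choice).
        v1 = np.asarray(first_feature_vector, dtype=float)
        v2 = np.asarray(second_feature_vector, dtype=float)
        cosine_similarity = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
        return 1.0 - cosine_similarity

    def is_hostile_sample(score, determination_threshold):
        # Step S232: compare the hostile sample score with the determination threshold.
        return score >= determination_threshold

For instance, is_hostile_sample(hostile_sample_score(v1, v2), 0.2) would report a hostile sample when the two feature vectors differ by more than the assumed threshold of 0.2.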

When it is determined that the face image to be authenticated is a hostile sample (step S232: YES), the determination result processing unit 320 executes hostile sample handling processing (step S233). As described above, the processing performed by the determination result processing unit 320 can take various forms depending on how the face recognition system 1 is used.
After step S233, the estimation device 300 ends the processing of FIG. 6.
On the other hand, when it is determined in step S232 that the face image to be authenticated is not a hostile sample (step S232: NO), the estimation device 300 ends the processing of FIG. 6.

As described above, the hostile perturbation acquisition unit 291 acquires a plurality of hostile perturbations. The filter generation unit 296 generates the hostile sample determination filter based on the plurality of hostile perturbations acquired by the hostile perturbation acquisition unit 291.
According to the filter generation device 200, a filter for hostile sample determination can be obtained. Using the hostile sample determination filter, it is possible to estimate the possibility that the face image to be authenticated is a hostile sample.

Further, the filter generation unit 296 calculates the hostile sample determination filter by averaging the plurality of hostile perturbations.
The filter generation unit 296 can calculate the hostile sample determination filter by relatively simple processing, such as averaging the pixel values of the hostile perturbations for each pixel. In this respect, the filter generation device 200 keeps the processing load of the filter generation unit 296 light.
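A minimal sketch of this averaging (step S121), assuming the acquired perturbations are NumPy arrays of identical shape, could be:

    import numpy as np

    def generate_determination_filter(hostile_perturbations):
        # Pixel-wise mean of the acquired hostile perturbations.
        return np.mean(np.stack(hostile_perturbations), axis=0)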

Further, the filter application unit 392 acquires the second image obtained by applying the hostile sample determination filter to the first image, which is an image used for face recognition. The score calculation unit 393 calculates the hostile sample score, which quantitatively indicates the degree of difference between the feature vector of the first image and the feature vector of the second image.
According to the estimation device 300, it is possible to estimate the possibility that the face image to be authenticated is a hostile sample. Specifically, the hostile sample score indicates the likelihood that the face image to be authenticated is a hostile sample. In addition, the score calculation unit 393 can compare the hostile sample score with the determination threshold value to determine whether or not the face image to be authenticated is a hostile sample.

FIG. 7 is a diagram showing an example of the configuration of the filter generation device according to the embodiment. The filter generation device 500 shown in FIG. 7 includes a hostile perturbation acquisition unit 501 and a filter generation unit 502.
With this configuration, the hostile perturbation acquisition unit 501 acquires a plurality of hostile perturbations, each due to the difference between a face image and a hostile sample based on that face image. The filter generation unit 502 generates a filter based on the plurality of hostile perturbations.
The hostile perturbation acquisition unit 501 corresponds to an example of a hostile perturbation acquisition means. The filter generation unit 502 corresponds to an example of a filter generation means.
According to the filter generation device 500, this filter can be used to estimate the possibility that the face image to be authenticated is a hostile sample.

FIG. 8 is a diagram showing an example of a processing procedure in the filter generation method according to the embodiment. The filter generation method shown in FIG. 8 includes a hostile perturbation acquisition step (step S501) and a filter generation step (step S502).
In the hostile perturbation acquisition step (step S501), a plurality of hostile perturbations, each due to the difference between a face image and a hostile sample based on that face image, are acquired. In the filter generation step (step S502), a filter is generated based on the plurality of hostile perturbations.
According to the filter generation method shown in FIG. 8, this filter can be used to estimate the possibility that the face image to be authenticated is a hostile sample.

FIG. 9 is a schematic block diagram showing an example of the configuration of a computer according to at least one embodiment.
In the configuration shown in FIG. 9, the computer 700 includes a CPU 710, a main storage device 720, an auxiliary storage device 730, and an interface 740.
Any one or more of the face recognition device 100, the filter generation device 200, the estimation device 300, and the filter generation device 500 described above may be implemented on the computer 700. In that case, the operations of the respective processing units described above are stored in the auxiliary storage device 730 in the form of a program. The CPU 710 reads the program from the auxiliary storage device 730, loads it into the main storage device 720, and executes the above processing according to the program. The CPU 710 also reserves, in the main storage device 720, storage areas corresponding to the respective storage units described above according to the program. Communication between each device and other devices is executed by the interface 740, which has a communication function, performing communication under the control of the CPU 710.

When the face recognition device 100 is implemented on the computer 700, the operations of the first control unit 190 and its respective units are stored in the auxiliary storage device 730 in the form of a program. The CPU 710 reads the program from the auxiliary storage device 730, loads it into the main storage device 720, and executes the above processing according to the program.
The CPU 710 also reserves, in the main storage device 720, storage areas corresponding to the first storage unit 180 and its respective units according to the program. The communication performed by the first communication unit 110 is executed by the interface 740, which includes a communication device, performing communication under the control of the CPU 710. The processing in which the authentication image acquisition unit 120 acquires a face image for authentication is executed, for example, by the interface 740, which includes an image acquisition device such as a camera or a scanner, operating under the control of the CPU 710. The processing performed by the authentication result processing unit 130 is executed, for example, by the interface 740, which includes a device corresponding to the processing mode of the authentication result processing unit 130, such as a display device or a communication device, operating under the control of the CPU 710.

When the filter generation device 200 is implemented on the computer 700, the operations of the second control unit 290 and its respective units are stored in the auxiliary storage device 730 in the form of a program. The CPU 710 reads the program from the auxiliary storage device 730, loads it into the main storage device 720, and executes the above processing according to the program.
The CPU 710 also reserves, in the main storage device 720, a storage area corresponding to the second storage unit 280 according to the program. The communication performed by the second communication unit 210 is executed by the interface 740, which includes a communication device, performing communication under the control of the CPU 710.

When the estimation device 300 is implemented on the computer 700, the operations of the third control unit 390 and its respective units are stored in the auxiliary storage device 730 in the form of a program. The CPU 710 reads the program from the auxiliary storage device 730, loads it into the main storage device 720, and executes the above processing according to the program.
The CPU 710 also reserves, in the main storage device 720, a storage area corresponding to the third storage unit 380 according to the program. The communication performed by the third communication unit 310 is executed by the interface 740, which includes a communication device, performing communication under the control of the CPU 710. The processing performed by the determination result processing unit 320 is executed, for example, by the interface 740, which includes a device corresponding to the processing mode of the determination result processing unit 320, such as a display device or a communication device, operating under the control of the CPU 710.

When the filter generation device 500 is implemented on the computer 700, the operations of the hostile perturbation acquisition unit 501 and the filter generation unit 502 are stored in the auxiliary storage device 730 in the form of a program. The CPU 710 reads the program from the auxiliary storage device 730, loads it into the main storage device 720, and executes the above processing according to the program.
The CPU 710 also reserves, in the main storage device 720, a storage area required for the processing of the filter generation device 500 according to the program. The input/output processing performed by the filter generation device 500 is executed by the interface 740, which includes a device corresponding to the mode of the processing, such as a communication device, operating under the control of the CPU 710.

A program for realizing all or part of the functions of the face recognition device 100, the filter generation device 200, the estimation device 300, and the filter generation device 500 may be recorded on a computer-readable recording medium, and the processing of each unit may be performed by loading the program recorded on the recording medium into a computer system and executing it. The term "computer system" as used herein includes an OS (operating system) and hardware such as peripheral devices.
The "computer-readable recording medium" refers to a portable medium such as a flexible disk, a magneto-optical disk, a ROM (Read Only Memory), or a CD-ROM (Compact Disc Read Only Memory), or a storage device such as a hard disk built into the computer system. The above program may be one for realizing part of the functions described above, or one that realizes the functions described above in combination with a program already recorded in the computer system.

Although the embodiment of the present invention has been described above in detail with reference to the drawings, the specific configuration is not limited to this embodiment, and design changes and the like within a range not departing from the gist of the present invention are also included.

The embodiment of the present invention may be applied to a filter generation device, an estimation device, a face recognition system, a filter generation method, and a recording medium.

1 Face recognition system
100 Face recognition device
110 First communication unit
120 Authentication image acquisition unit
180 First storage unit
181 Authentication database unit
190 First control unit
191 Feature vector calculation unit
192 Classification unit
200, 500 Filter generation device
210 Second communication unit
280 Second storage unit
290 Second control unit
291, 501 Hostile perturbation acquisition unit
292 Filter generation image acquisition unit
293 Noise addition unit
294 Hostile sample selection unit
295 Acquisition processing unit
296, 502 Filter generation unit
300 Estimation device
310 Third communication unit
320 Determination result processing unit
380 Third storage unit
390 Third control unit
391 First image acquisition unit
392 Filter application unit
393 Score calculation unit
394 Determination unit

Claims (6)

1. A filter generation device comprising:
a hostile perturbation acquisition means for acquiring a plurality of hostile perturbations due to a difference between a face image and a hostile sample based on the face image; and
a filter generation means for generating a filter based on the plurality of hostile perturbations.

2. The filter generation device according to claim 1, wherein the filter generation means calculates the filter by averaging the plurality of hostile perturbations.

3. An estimation device comprising:
a filter application means for acquiring a second image obtained by applying the filter according to claim 1 or 2 to a first image, which is an image used for face recognition; and
a score calculation means for calculating a score that quantitatively indicates a degree of difference between a feature vector of the first image and a feature vector of the second image.

4. A face recognition system comprising:
a face recognition device that performs face recognition; and
the estimation device according to claim 3.

5. A filter generation method comprising:
a step of acquiring a plurality of hostile perturbations due to a difference between a face image and a hostile sample based on the face image; and
a step of generating a filter based on the plurality of hostile perturbations.

6. A recording medium recording a program for causing a computer to execute:
a step of acquiring a plurality of hostile perturbations due to a difference between a face image and a hostile sample based on the face image; and
a step of generating a filter based on the plurality of hostile perturbations.
PCT/JP2019/051472 2019-12-27 2019-12-27 Filter generation device, estimation device, facial authentication system, filter generation method, and recording medium Ceased WO2021131029A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/051472 WO2021131029A1 (en) 2019-12-27 2019-12-27 Filter generation device, estimation device, facial authentication system, filter generation method, and recording medium

Publications (1)

Publication Number Publication Date
WO2021131029A1 true WO2021131029A1 (en) 2021-07-01

Family

ID=76573841

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/051472 Ceased WO2021131029A1 (en) 2019-12-27 2019-12-27 Filter generation device, estimation device, facial authentication system, filter generation method, and recording medium

Country Status (1)

Country Link
WO (1) WO2021131029A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013196295A (en) * 2012-03-19 2013-09-30 Toshiba Corp Biological information processing apparatus
US20170330028A1 (en) * 2015-11-16 2017-11-16 MorphoTrak, LLC Facial matching system
JP2019079374A (en) * 2017-10-26 2019-05-23 株式会社Preferred Networks Image processing system, image processing method, and image processing program

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
TAKAHASHI, KEN; SHUTO, KAZUYUKI: " Frequency-based Robustness Improvements for Adversarial Examples", PROCEEDINGS OF THE 35TH ANNUAL CONFERENCE OF THE JAPANESE SOCIETY OF SOFTWARE SCIENCE (JSSST), vol. 35, 28 August 2018 (2018-08-28), pages 1 - 8, XP009533927 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023286269A1 (en) * 2021-07-16 2023-01-19 日本電気株式会社 Learning data generation device, learning data generation method, program, detection model generation method, and authentication system
JPWO2023286269A1 (en) * 2021-07-16 2023-01-19
JP7597225B2 (en) 2021-07-16 2024-12-10 日本電気株式会社 LEARNING DATA GENERATION DEVICE, LEARNING DATA GENERATION METHOD, PROGRAM, DETECTION MODEL GENERATION METHOD, AND AUTHENTICATION SYSTEM
WO2023058569A1 (en) * 2021-10-07 2023-04-13 株式会社日立製作所 Adversarial data detection device and adversarial data detection method
JP2023056241A (en) * 2021-10-07 2023-04-19 株式会社日立製作所 Hostile data detection device and hostile data detection method
JP7564074B2 (en) 2021-10-07 2024-10-08 株式会社日立製作所 Adversarial data detection device and adversarial data detection method
JPWO2023079587A1 (en) * 2021-11-02 2023-05-11
JP7718502B2 (en) 2021-11-02 2025-08-05 日本電気株式会社 Biometric authentication device, biometric authentication method, and recording medium
CN114387647A (en) * 2021-12-29 2022-04-22 北京瑞莱智慧科技有限公司 Method and device for generating anti-disturbance and storage medium

Similar Documents

Publication Publication Date Title
US11023757B2 (en) Method and apparatus with liveness verification
Chakraborty et al. An overview of face liveness detection
EP3807792B1 (en) Authenticating an identity of a person
US12443392B2 (en) Systems and methods for private authentication with helper networks
US20210034864A1 (en) Iris liveness detection for mobile devices
JP2020074174A (en) System and method for performing fingerprint-based user authentication using images captured with mobile device
JP6921694B2 (en) Monitoring system
WO2021131029A1 (en) Filter generation device, estimation device, facial authentication system, filter generation method, and recording medium
CN111881429A (en) Activity detection method and apparatus, and face verification method and apparatus
WO2020065954A1 (en) Authentication device, authentication method, and storage medium
Wang et al. Silicone mask face anti-spoofing detection based on visual saliency and facial motion
Du Review of iris recognition: cameras, systems, and their applications
JP2022003526A (en) Information processor, detection system, method for processing information, and program
WO2020195732A1 (en) Image processing device, image processing method, and recording medium in which program is stored
CN111937005A (en) Biological feature recognition method, device, equipment and storage medium
Samatha et al. Securesense: Enhancing person verification through multimodal biometrics for robust authentication
Busch Challenges for automated face recognition systems
KR102215535B1 (en) Partial face image based identity authentication method using neural network and system for the method
US20220044014A1 (en) Iris authentication device, iris authentication method and recording medium
WO2021060256A1 (en) Facial authentication device, facial authentication method, and computer-readable recording medium
JP7509212B2 (en) Information processing system, information processing method, and computer program
Shen et al. Iritrack: Face presentation attack detection using iris tracking
Pala et al. On the accuracy and robustness of deep triplet embedding for fingerprint liveness detection
JP2004118731A (en) Image recognition device, image recognition method and program for allowing computer to perform the method
JP4708835B2 (en) Face detection device, face detection method, and face detection program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19957765

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19957765

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP