WO2012042631A1 - Registration program, registration device, and registration method - Google Patents
- Publication number
- WO2012042631A1 (PCT/JP2010/067044)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image information
- template
- information
- distance
- registration
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/117—Identification of persons
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/7246—Details of waveform analysis using correlation, e.g. template matching or determination of similarity
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/50—Maintenance of biometric data or enrolment thereof
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4887—Locating particular structures in or on the body
- A61B5/489—Blood vessels
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6825—Hand
Definitions
- the present invention relates to a registration program, a registration apparatus, and a registration method for registering in advance biometric information used for collation in order to perform personal authentication using biometric information.
- the human body carries biometric information that can identify an individual, some of which is used as information for identifying and authenticating individuals.
- biometric information that can be used for authentication includes fingerprints, eye retinas and irises, faces, blood vessels, and DNA (Deoxyribo Nucleic Acid).
- biometric authentication is performed by comparing biometric information (registration template) collected during registration with biometric information acquired during authentication.
- it is desirable to acquire biometric information with high accuracy in order to improve the accuracy of authentication based on this biometric information, and the authentication apparatus therefore acquires biometric information under aligned sensing conditions.
- if the registered template is inappropriate, for example registered with an incorrect posture, a good collation result cannot be obtained even when the registered template is compared with the biometric information acquired at the time of authentication.
- the processing unit obtains biometric information of the same living body from the detection unit a plurality of times, determines the mutual similarity among the sets of biometric feature data obtained from those acquisitions, and registers a plurality of highly similar biometric feature data in the storage unit.
- Such inappropriate registration templates need to be re-registered, and if a financial institution such as a bank is providing services, it will be necessary to re-register the templates at the counter.
- the user must go to a registration work place (usually a sales office), which lowers customer satisfaction and service quality for financial institutions.
- the present invention has been made in view of such a point, and an object thereof is to provide a registration program, a registration apparatus, and a registration method capable of performing appropriate template registration by eliminating inappropriate template registration.
- a registration program for causing a computer to execute a process for registering a template used for biometric authentication causes the computer to execute the following process.
- a plurality of pieces of image information is acquired from a sensor unit capable of outputting image information of a photographed living body and distance information at the time of photographing, and distance information corresponding to each piece of image information is acquired from the sensor unit. The pieces of image information are grouped according to their corresponding distance information, the image information is mutually evaluated within each distance-based group to extract image information as template candidates, and the extracted image information is then mutually evaluated to extract the image information that becomes the template.
- a registration device for registering a template used for biometric authentication includes image information acquisition means, distance information acquisition means, template candidate extraction means, and template extraction means.
- the image information acquisition means acquires a plurality of pieces of image information from a sensor unit that can output image information of a captured living body and distance information at the time of shooting.
- the distance information acquisition means acquires distance information corresponding to each piece of image information from the sensor unit.
- the template candidate extraction unit groups the plurality of pieces of image information according to the distance information corresponding to each piece, and performs mutual evaluation of the image information within each distance-based group, thereby extracting image information as template candidates.
- the template extracting means extracts the image information as a template by mutually evaluating the extracted image information.
- a template registration method used for biometric authentication acquires a plurality of pieces of image information from a sensor unit that can output image information of a photographed living body and distance information at the time of photographing.
- the distance information corresponding to each piece of image information is acquired from the sensor unit, the plurality of pieces of image information are grouped according to their corresponding distance information, and the image information is mutually evaluated within each distance-based group.
- image information that is a template candidate is extracted, and the extracted image information is mutually evaluated to extract image information that is a template.
- FIG. 1 is a diagram illustrating a configuration of a registration apparatus according to the first embodiment.
- the registration device 10 acquires, from the sensor unit, the image information of the living body imaged by the sensor unit and the distance information at the time of imaging, and extracts the image information serving as a template.
- the sensor unit includes an image sensor and a distance measuring sensor, images a living body as a subject, generates image information of the living body and distance information at the time of shooting, and outputs the image information to the registration device 10.
- the image information is image data of a living body and is generated in a predetermined image format.
- the distance information is information representing the distance between the sensor unit and the living body at the time of shooting.
- the template is data collected in advance from a living body for use in matching of the living body.
- the registration apparatus 10 includes an image information acquisition unit 10a, a distance information acquisition unit 10b, a template candidate extraction unit 10c, and a template extraction unit 10d.
- the image information acquisition unit 10a acquires a plurality of pieces of image information.
- the distance information acquisition unit 10b acquires distance information (shooting distance) corresponding to each piece of image information acquired by the image information acquisition unit 10a.
- the template candidate extraction unit 10c performs grouping on a plurality of pieces of image information according to distance information corresponding to each piece of image information. Thereby, according to the shooting distance from the sensor unit, a plurality of pieces of image information are divided into, for example, three groups of “far”, “medium”, and “near”.
- the template candidate extraction unit 10c mutually evaluates image information for each group corresponding to the distance information, and extracts image information that is a template candidate.
- the mutual evaluation is performed by taking one piece of image information as a reference image and the remaining pieces as evaluation target images, and obtaining evaluation values for all combinations by a predetermined evaluation method while changing the reference image. Thereby, for example, two pieces of image information from the "far" group, two from the "middle" group, and two from the "near" group are extracted as template candidates.
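The all-combinations mutual evaluation described above can be sketched as follows. This is an illustrative sketch, not the patent's implementation: `evaluate` stands in for the unspecified collation method (e.g. feature-point comparison or pattern matching), with lower scores meaning more similar, as in the tables of FIGS. 9 to 11.

```python
from itertools import permutations

def mutual_evaluation(images, evaluate):
    """Evaluate every ordered (reference, target) pair of images.

    `evaluate(ref, tgt)` is a stand-in for the collation method left
    unspecified in the text; lower scores mean more similar.
    Returns a dict mapping (reference index, target index) -> score.
    """
    scores = {}
    for i, j in permutations(range(len(images)), 2):
        # one image serves as the reference, another as the target;
        # changing the reference covers all ordered combinations
        scores[(i, j)] = evaluate(images[i], images[j])
    return scores
```

For n images this yields n × (n − 1) evaluation values, matching the tables where each image is scored both as reference and as collation target.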
- the template extracting unit 10d mutually evaluates the image information extracted by the template candidate extracting unit 10c.
- the template extraction unit 10d extracts image information serving as a template from this evaluation result.
- the template extracting unit 10d mutually evaluates the six images extracted from the three groups "far", "middle", and "near" together, without regrouping them. From this evaluation result, the template extraction unit 10d extracts the image information to be registered as the template for each of the "far", "middle", and "near" groups.
- the registration device 10 extracts appropriate image information as template candidates for each distance, and then extracts the most appropriate image information overall. Thereby, if appropriate image information exists among the images for each distance, that is, if an appropriate posture occurred at some point during the registration operation, image information suitable for registration (evaluable as optimal or close to optimal) is registered as a template.
- FIG. 2 is a diagram illustrating a configuration of the authentication system according to the second embodiment.
- the authentication system 1 is exemplified as a system that performs authentication using palm veins, but the present invention is not limited to this and can also be applied to systems that perform authentication using other feature detection sites of a living body.
- the authentication system 1 is a system that recognizes the characteristics of a living body and identifies and authenticates an individual, and can be used for information system logon, entrance / exit management, and the like.
- the authentication system 1 includes a registration device 20, a center server 60 connected to the registration device 20 via the network 2, and a verification device 50.
- the center server 60 associates and stores identification information for identifying an individual and biometric information (template) registered in advance before biometric authentication.
- the identification information for identifying an individual is a unique ID (IDentification) assigned to a user directly (for example, a user number) or indirectly (for example, an account number).
- the biometric information registered in advance is feature information obtained by extracting a feature portion from image information, encoded information obtained by encoding image information or feature information, and the like.
- the verification device 50 is a device that performs biometric authentication when authenticating a user.
- the verification device 50 is, for example, an ATM (Automated Transaction Device) installed in a financial institution, a security area management device, a personal computer that performs user authentication, or the like.
- the registration device 20 includes a processing device 21, a display 22, and a sensor unit 30, and includes a keyboard 23, a mouse 24, an IC (Integrated Circuit) card reader / writer 40 and the like as necessary.
- the sensor unit 30 has a built-in imaging device, images the palm of the user, and outputs a captured image to the processing device 21.
- the IC card reader / writer 40 reads and writes information on the user's IC card 41.
- the keyboard 23 and the mouse 24 accept input operations.
- a user who requests template registration uses the keyboard 23, mouse 24, or IC card reader / writer 40 to input identification information (for example, user ID) for identifying the user.
- the registration device 20 guides the user through template registration by display on the display 22 and requests input of biometric information for template registration.
- the user inputs biometric information by holding his hand over the sensor unit 30.
- having received a palm vein image as biometric information, the registration device 20 creates a registration template from the input information and records it in the storage unit of the processing device 21, the storage unit of the center server 60, or the storage unit of the user's IC card 41.
- when the registration device 20 also performs verification, it records the registration template in the storage unit of the processing device 21 or the storage unit of the user's IC card 41. If the registration device 20 and the verification device are separate, the registration device 20 records the registration template in the storage unit of the center server 60 or the storage unit of the IC card 41. In that case, when biometric authentication is performed, the verification device 50 refers to the template in the storage unit of the center server 60 or the storage unit of the IC card 41 and verifies the input biometric information against it.
- FIG. 3 is a diagram illustrating a state of template registration according to the second embodiment.
- the template registration for palm vein authentication is performed based on an image (image information) obtained by photographing the operation of holding the palm over the sensor unit 30.
- the operation of holding the palm over the sensor unit 30 is performed so as to approach the sensor unit 30 from above the sensor unit 30.
- the registration device 20 guides the person to be registered through the outline of the operation of holding a palm over the sensor unit 30, by an image displayed on the display 22 or by sound output from a speaker (not shown). Further, at the counter of a financial institution or the like, the person to be registered is instructed in the operation of holding the palm over the sensor unit 30 by the person in charge.
- the sensor unit 30 includes a wide-angle lens and can capture the palm as a subject over a wide photographing range.
- the sensor unit 30 includes a distance measuring sensor and photographs a palm within a predetermined range.
- the hand of the person to be registered approaches the sensor unit 30 like a hand 90a, a hand 90b, a hand 90c, and a hand 90d.
- at the position of the hand 90a, the palm is not photographed
- at the positions of the hands 90b, 90c, and 90d, the palm is photographed.
- the image of the hand 90b is an image at the "far" position, the image of the hand 90c is an image at the "middle" position, and the image of the hand 90d is an image at the "near" position, which enables grouping according to the distance.
- the threshold values used for grouping are set in advance such as “near” for 20 mm to 40 mm, “medium” for 40 mm to 60 mm, and “far” for 60 mm to 80 mm, for example.
- such reference values may be held by the sensor unit 30, in which case the output distance information may be a distance rank (for example, "far", "medium", "near") rather than a raw distance; otherwise, the registration device stores the reference values.
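As a sketch, distance ranking based on such reference values might look as follows; the handling of the boundaries at exactly 40 mm and 60 mm is an assumption, since the text only gives the ranges.

```python
def distance_rank(distance_mm):
    """Map a measured shooting distance to a distance rank.

    The 20-40 / 40-60 / 60-80 mm boundaries follow the example
    thresholds in the text; a distance outside all ranges maps
    to no group (None).
    """
    if 20 <= distance_mm < 40:
        return "near"
    if 40 <= distance_mm < 60:
        return "medium"
    if 60 <= distance_mm <= 80:
        return "far"
    return None
```

A hand approaching the sensor would thus have its successive frames labeled "far", then "medium", then "near" as the distance decreases.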
- the images acquired by the sensor unit 30 are taken at, for example, 15 frames per second, so about 15 images are obtained in one operation of holding the palm over the sensor unit 30. Images acquired in this way may show the palm not yet held still, tilted, or pulled back toward the wrist in the course of the holding operation, and are therefore not necessarily suitable for registration templates.
- FIG. 4 is a diagram illustrating a configuration of the registration apparatus according to the second embodiment.
- the registration device 20 includes a control unit 20a, an image grouping processing unit 20b, a collation processing unit 20c, a candidate image extraction unit 20d, a template image extraction unit 20e, a template creation unit 20f, a template registration unit 20g, a message display unit 20h, a storage unit 20i, and a communication unit 20j.
- the control unit 20a comprehensively controls each processing unit.
- the image grouping processing unit 20b groups the plurality of pieces of image information acquired from the sensor unit 30 for each predetermined distance range based on distance information corresponding to each piece of image information acquired from the sensor unit 30.
- the collation processing unit 20c collates a plurality of pieces of image information with each other, calculates a similarity score, and performs mutual evaluation. Details of the mutual evaluation will be described later with reference to FIGS.
- the candidate image extraction unit 20d performs, via the collation processing unit 20c, mutual evaluation within each group of image information produced by the image grouping processing unit 20b, and extracts image information as template candidates for each group.
- the template image extraction unit 20e performs, via the collation processing unit 20c, mutual evaluation on the image information extracted for each group by the candidate image extraction unit 20d, without regrouping it, and extracts the image information to become the template for each group. Thereby, the registration apparatus 20 extracts appropriate image information as template candidates for each distance, and then extracts the most appropriate image information overall.
- the template creation unit 20f processes the image information extracted by the template image extraction unit 20e as a registered template.
- the template registration unit 20g records (registers) the registration template in the storage unit of the processing device 21, the storage unit of the center server 60, or the storage unit of the user's IC card 41.
- the message display unit 20h generates required messages, such as guidance for the person to be registered on the operation of holding the palm over the sensor unit 30 and notification of the success or failure of template registration, and displays them on the display 22.
- the storage unit 20i stores and holds image information and distance information acquired from the sensor unit 30, work information such as grouping data and similarity score, and a generated template.
- the communication unit 20j performs communication with the sensor unit 30, communication with the IC card reader / writer 40, and communication with the center server 60 via the network 2.
- FIG. 5 is a diagram illustrating a configuration of a sensor unit according to the second embodiment.
- the sensor unit 30 includes a control unit 30a, a photographing unit 30b, a distance measuring unit 30c, a storage unit 30d, and a communication unit 30e.
- the control unit 30a comprehensively controls each processing unit.
- the imaging unit 30b acquires image information from a living body that is a subject.
- the imaging unit 30b includes an image sensor (for example, a CMOS (Complementary Metal Oxide Semiconductor) sensor or a CCD (Charge Coupled Device) sensor), a condenser lens, and a plurality of near-infrared light emitting elements (LEDs: Light Emitting Diodes) that irradiate the subject.
- the near-infrared light emitting element is provided, for example, around the image sensor, emits near-infrared light toward the subject direction (upward), and the image sensor photographs the subject irradiated with the near-infrared ray.
- the imaging unit 30b can continuously shoot a subject, for example at 15 frames per second. Note that the shooting speed may be changed by setting. Further, shooting may be timed according to the distance to the subject, based on the output of the distance measuring unit 30c, rather than at fixed time intervals. Note that the imaging unit 30b has a configuration suitable for photographing a palm vein; when photographing another part of a living body, such as an iris, a configuration suitable for that subject may be employed.
- the distance measuring unit 30c acquires distance information with respect to a living body as a subject.
- the storage unit 30d stores the image information acquired by the photographing unit 30b and the distance information acquired by the distance measuring unit 30c in association with the image information.
- the communication unit 30e is connected to the communication unit 20j of the registration device 20, and receives an instruction from the registration device 20, and transmits image information and distance information.
- the image photographed by the sensor unit 30 is obtained by irradiating a living body (palm) as a subject with near-infrared rays and photographing the reflected light. Since the hemoglobin in red blood cells flowing in the veins has released its oxygen, this hemoglobin (reduced hemoglobin) absorbs near-infrared rays in the vicinity of 700 nm to 1000 nm. Therefore, when near-infrared rays are applied to the palm, reflection is weak only where veins are present, and the position of the veins can be recognized from the intensity of the reflected near-infrared light. The photographed image is achromatic, but the use of this specific light source makes characteristic information easy to extract.
- FIG. 6 is a diagram illustrating a hardware configuration example of the registration apparatus according to the second embodiment.
- the registration device 20 includes a processing device 21, a display 22, a keyboard 23, a mouse 24, a sensor unit 30, and an IC card reader / writer 40.
- the entire processing apparatus 21 is controlled by a CPU (Central Processing Unit) 101.
- a RAM (Random Access Memory) 102, an HDD (Hard Disk Drive) 103, a communication interface 104, a graphic processing device 105, and an input / output interface 106 are connected to the CPU 101 via a bus 107.
- the RAM 102 temporarily stores at least part of an OS (Operating System) program and application programs to be executed by the CPU 101.
- the RAM 102 stores various data necessary for processing by the CPU 101.
- the HDD 103 stores an OS and application programs.
- a display 22 is connected to the graphic processing device 105.
- the graphic processing device 105 displays an image on the screen of the display 22 in accordance with a command from the CPU 101.
- the input / output interface 106 is connected with a keyboard 23, a mouse 24, a sensor unit 30, and an IC card reader / writer 40.
- the input / output interface 106 can be connected to a portable recording medium interface that can write information to the portable recording medium 110 and read information from the portable recording medium 110.
- the input / output interface 106 transmits signals sent from the keyboard 23, mouse 24, sensor unit 30, IC card reader / writer 40, and portable recording medium interface to the CPU 101 via the bus 107.
- the communication interface 104 is connected to the network 2.
- the communication interface 104 transmits and receives data to and from the verification device 50 and the center server 60.
- with the hardware configuration described above, the processing functions of the present embodiment can be realized.
- the verification device 50 and the center server 60 can also be realized with the same hardware configuration.
- the processing device 21 can be configured to include modules each composed of an FPGA (Field Programmable Gate Array), a DSP (Digital Signal Processor), or the like, or can be configured without the CPU 101.
- the processing device 21 includes a nonvolatile memory (for example, an EEPROM (Electrically Erasable and Programmable Read Only Memory), a flash memory, or a flash-memory-type memory card) that stores the module firmware.
- firmware can be written to the nonvolatile memory via the portable recording medium 110 or the communication interface 104.
- the processing device 21 can also update the firmware by rewriting the firmware stored in the nonvolatile memory.
- FIG. 7 is a flowchart of template registration processing according to the second embodiment.
- the template registration process is a process for acquiring shooting information (captured image) and distance information from the sensor unit 30 and generating and registering a template.
- Step S11 The processing device 21 guides the person to be registered to the point of the operation of holding the palm over the sensor unit 30 by the image displayed on the display 22 or the sound output from the speaker (not shown).
- Step S12 The processing device 21 instructs the sensor unit 30 to shoot.
- Step S13 The processing device 21 waits for reception of image information and distance information corresponding to the image information from the sensor unit 30, and proceeds to step S14 when the image information and distance information are received.
- the image information received by the processing device 21 is image information acquired when the person to be registered performs one hand-holding operation, and is, for example, about 15 biological images on average.
- Step S14 When the processing device 21 has acquired image information for three hand-holding operations from the sensor unit 30 (that is, has collected the biological images through the third holding operation), it proceeds to Step S15; otherwise it returns to Step S11. The processing device 21 thus obtains image information for approximately 45 biological images from the sensor unit 30.
- the processing device 21 groups the image information for each predetermined distance range based on the corresponding distance information. For example, the processing device 21 divides image information for 45 biological images into three groups of “far”, “medium”, and “near” according to the distance from the sensor unit.
- the processing device 21 calculates mutual similarity scores within each group (group by group).
- the evaluation value is calculated using a known collation evaluation method such as comparison of feature points or pattern matching.
- the processing device 21 extracts image information as template candidates based on the evaluation of the mutual similarity scores for each group. For example, when extracting two images per group, the processing device 21 extracts the two images with the best matching results as template-candidate image information. Therefore, when two images are extracted from each of three groups, the processing device 21 extracts six images.
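A sketch of this per-group candidate extraction, assuming the mutual-evaluation result is a mapping from ordered (reference, collation) index pairs to scores where lower means more similar (consistent with the example in FIG. 9, in which the top combinations have scores 890 and 950):

```python
def extract_candidates(scores, n=2):
    """Pick the n images involved in the best (lowest-score) ordered
    pairs of one group's mutual-evaluation results.

    `scores` maps (reference index, collation index) pairs to
    similarity scores, lower meaning more similar.
    """
    candidates = []
    # walk the pairs from most similar to least similar
    for (ref, tgt), _score in sorted(scores.items(), key=lambda kv: kv[1]):
        for idx in (ref, tgt):
            if idx not in candidates:
                candidates.append(idx)
            if len(candidates) == n:
                return candidates
    return candidates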
- in addition to the collation images with the best evaluation results, the processing device 21 may extract, as template candidates, image information from a different evaluation band.
- image information in a different evaluation band may be, for example, image information whose evaluation is inferior by a predetermined value to that of the best-evaluated collation image.
- image information in a different evaluation band may also be a collation image obtained against a reference image different from the reference image and collation image that gave the best evaluation result.
- Step S18 The processing device 21 calculates mutual similarity scores for all the image information extracted in Step S17 together (without regard to group).
- Step S19 The processing device 21 extracts, for each distance (each distance-range group), the one collation image with the best evaluation result. Therefore, when there are three groups, the processing device 21 extracts three images.
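One way to sketch this second-stage selection: cross-evaluate all extracted candidates together and keep, per group, the image with the best aggregate score against all the others. Aggregating the pairwise scores by summation is an assumption made here for illustration; the text only says that the image with the better evaluation result is kept for each group.

```python
def select_templates(candidates, evaluate):
    """Second-stage selection across groups.

    `candidates` maps a group name (e.g. "far") to its candidate
    images; `evaluate(a, b)` is the same stand-in pairwise scorer as
    in the first stage (lower = more similar).  For each group, the
    candidate with the lowest total score against all other pooled
    candidates is kept.
    """
    pool = [(grp, img) for grp, imgs in candidates.items() for img in imgs]
    totals = {}
    for i, (_grp, img) in enumerate(pool):
        totals[i] = sum(evaluate(img, other)
                        for j, (_, other) in enumerate(pool) if j != i)
    best = {}
    for i, (grp, _img) in enumerate(pool):
        if grp not in best or totals[i] < totals[best[grp]]:
            best[grp] = i
    return {grp: pool[i][1] for grp, i in best.items()}
```

With three groups of two candidates each, this pools six images, scores each against the other five, and returns one image per group, as in the step above.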
- the processing device 21 creates a template from the extracted three pieces of image information.
- in creating the template, processing such as removal of information that becomes noise, contrast adjustment to emphasize features, and compression of the data amount (for example, taking differences between a plurality of images) is performed.
- the processing device 21 records (registers) the generated template (registered template) in the storage unit of the processing device 21, the storage unit of the center server 60, or the storage unit of the user's IC card 41.
- FIG. 8 is a flowchart of imaging processing according to the second embodiment.
- the imaging process is a process of receiving an imaging instruction from the processing device 21, imaging a living body, and outputting imaging information and distance information to the processing device 21.
- Step S31 The sensor unit 30 determines whether there is a living body in the imaging range based on the output of the distance measuring unit 30c. If there is a living body in the imaging range, the sensor unit 30 proceeds to step S32.
- the sensor unit 30 captures image information of a living body within the imaging range.
- the sensor unit 30 associates the distance information output by the distance measuring unit 30c with the image information.
- the distance information is associated with the image information by, for example, writing the distance information into the header of the image information.
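For illustration only, such an association could be implemented by prepending a small header to the raw image bytes. The "DM" marker and 6-byte layout here are hypothetical; the actual header format is not specified in the text.

```python
import struct

def tag_image_with_distance(image_bytes, distance_mm):
    """Prepend a hypothetical 6-byte header to raw image bytes:
    a 2-byte marker plus the shooting distance in millimetres as a
    big-endian unsigned 32-bit integer."""
    return b"DM" + struct.pack(">I", distance_mm) + image_bytes

def read_distance(tagged):
    """Recover the distance and the original image bytes."""
    assert tagged[:2] == b"DM"
    (distance_mm,) = struct.unpack(">I", tagged[2:6])
    return distance_mm, tagged[6:]
```

Keeping the distance inside the image data itself means the pairing survives storage in the storage unit 30d and transmission to the processing device 21 without a separate lookup table.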
- the sensor unit 30 stores and holds the image information in the storage unit 30d.
- the sensor unit 30 determines whether or not there is a living body in the imaging range based on the output of the distance measuring unit 30c.
- the sensor unit 30 proceeds to step S36 when there is no living body in the imaging range, and proceeds to step S32 when there is a living body in the imaging range. In this way, the sensor unit 30 continuously photographs a living body within a predetermined range.
- the sensor unit 30 may determine the end of shooting by shooting for a predetermined time, detecting the stillness of a living body, or the like.
- Step S36 The sensor unit 30 outputs the image information stored and held in the storage unit 30d to the processing device 21.
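The capture loop above (steps S31–S36) can be sketched as follows. This is a minimal sketch, not the patented implementation: the sensor interface (`measure_distance`, `capture_image`) is a hypothetical stand-in for the distance measuring unit 30c and the imaging unit 30b, and images are modeled as dictionaries so the distance can be written into a "header" field.

```python
def imaging_process(sensor, max_range_mm=80):
    """Sketch of steps S31-S36: wait for a living body to enter the
    imaging range, then capture continuously while it stays in range,
    tagging each image with the distance measured at capture time."""
    stored = []  # plays the role of the storage unit 30d

    # S31: wait until a living body enters the imaging range
    distance = sensor.measure_distance()
    while distance > max_range_mm:
        distance = sensor.measure_distance()

    # S32-S35: capture repeatedly while the body remains in range
    while distance <= max_range_mm:
        image = sensor.capture_image()        # S32: capture image information
        image["distance_mm"] = distance       # S33: write distance into the header
        stored.append(image)                  # S34: store and hold
        distance = sensor.measure_distance()  # S35: re-check the range

    return stored  # S36: output the held images to the processing device
```

In practice the real sensor unit would also apply the end-of-shooting criteria mentioned above (elapsed time or stillness detection) rather than relying only on the range check.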
- FIG. 9 is a table showing the similarity scores and similarity grouping of the long-distance images according to the second embodiment.
- FIG. 10 is a table showing the similarity scores and similarity grouping of the medium-distance images according to the second embodiment.
- FIG. 11 is a table showing the similarity scores and similarity grouping of the short-distance images according to the second embodiment.
- assume that the processing device 21 obtains image information for 15 images in total and distributes five images to the group corresponding to long-distance shooting, five to the group corresponding to medium-distance shooting, and five to the group corresponding to short-distance shooting.
- the group corresponding to the long-distance shooting includes F_01, F_02, F_03, F_04, and F_05.
- the evaluation results obtained in this way are classified as follows: pairs of image information with a mutual similarity score of less than 1000 are assigned to similarity group A, pairs with a score of 1000 or more but less than 3000 to similarity group B, and pairs with a score of 3000 or more to similarity group C. Note that a lower similarity score indicates a closer match.
- similarity group A is a group that can be evaluated as very similar, similarity group B as similar, and similarity group C as not similar.
- the similarity group A and the similarity group B can be used as templates, but the similarity group C cannot be used as a template.
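The three-way grouping described above can be sketched as below. This is an illustrative sketch, not code from the source; the function names are invented, and the only assumptions taken from the text are the two thresholds (1000 and 3000) and the convention that a lower score means a closer match.

```python
def similarity_group(score):
    """Classify a pairwise similarity score into group A, B, or C.
    Lower scores mean the two images are more alike."""
    if score < 1000:
        return "A"   # very similar: usable as a template
    if score < 3000:
        return "B"   # similar: usable as a template
    return "C"       # not similar: unusable as a template


def group_pairs(pair_scores):
    """pair_scores: {(img_i, img_j): score} -> {(img_i, img_j): group}"""
    return {pair: similarity_group(s) for pair, s in pair_scores.items()}
```

For example, the long-distance pair (F_02, F_01) with score 890 falls in group A and is usable as a template, while a pair scoring 3200 falls in group C and is not.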
- the top two pair combinations by similarity score are (F_02, F_01) with a score of 890 and (F_01, F_02) with a score of 950. Therefore, F_01 and F_02 are extracted from these top two combinations as the template candidates from the group corresponding to long-distance shooting.
- the images extracted as template candidates from the group corresponding to long-distance shooting are not limited to those in the top two combinations: F_01 may be extracted as the collation image with the highest evaluation result within similarity group A, and F_03 as the collation image with the highest evaluation result within similarity group B.
- the group corresponding to the middle distance shooting includes M_01, M_02, M_03, M_04, and M_05.
- the top two pair combinations by similarity score are (M_03, M_04) with a score of 890 and (M_02, M_01) with a score of 895. Therefore, M_01 and M_04 are extracted from these top two combinations as the template candidates from the group corresponding to medium-distance shooting.
- alternatively, all of the images M_01, M_02, M_03, and M_04 constituting the top two combinations may be extracted as template candidates from the group corresponding to medium-distance shooting.
- the images extracted as template candidates from the group corresponding to medium-distance shooting are likewise not limited to those in the top two combinations: M_04 may be extracted as the collation image with the highest evaluation result within similarity group A, and M_03 as the collation image with the highest evaluation result within similarity group B.
- the group corresponding to the short-distance shooting includes N_01, N_02, N_03, N_04, and N_05.
- the top two pair combinations by similarity score are (N_05, N_04) with a score of 850 and (N_02, N_01) with a score of 893. Therefore, N_04 and N_01 are extracted from these top two combinations as the template candidates from the group corresponding to short-distance shooting.
- alternatively, all of the images N_01, N_02, N_04, and N_05 constituting the top two combinations may be extracted as template candidates from the group corresponding to short-distance shooting.
- the images extracted as template candidates from the group corresponding to short-distance shooting are likewise not limited to those in the top two combinations: N_04 may be extracted as the collation image with the highest evaluation result within similarity group A, and N_03 as the collation image with the highest evaluation result within similarity group B.
- in this way, the processing device 21 extracts six images, F_01, F_02, M_01, M_04, N_01, and N_04, as template candidates.
- for these six images, 6 × 5 = 30 mutual similarity scores are calculated.
- suppose, for example, that the mutual evaluation ranks F_01 above F_02, M_01 above M_04, and N_01 above N_04; then the three top-ranked images for the respective distance groups, F_01, M_01, and N_01, are extracted as the registered templates.
- note that the processing device 21 does not simply extract the image information with the highest evaluation within each distance group as a registered template; rather, it extracts, for each distance group, image information whose evaluation also reflects the mutual evaluation against the other distance groups.
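The two-stage extraction just described — per-group candidate selection, then a cross-group mutual evaluation — can be sketched as follows. This is a sketch under stated assumptions: `score(a, b)` stands in for the similarity score computation, which the source does not specify (lower is better, consistent with the grouping above), and summing scores against the other candidates is one plausible way to "reflect mutual evaluation", not the patent's exact rule.

```python
from itertools import permutations

def extract_candidates(group, score, top_n=2):
    """Stage 1: within one distance group, rank every ordered image pair
    by similarity score and keep the reference images of the best pairs."""
    pairs = sorted(permutations(group, 2), key=lambda p: score(*p))
    picked = []
    for ref, _match in pairs:
        if ref not in picked:
            picked.append(ref)
        if len(picked) == top_n:
            break
    return picked

def extract_templates(groups, score):
    """Stage 2: re-evaluate all candidates against each other (across
    distance groups) and keep, per group, the candidate with the best
    total score against the other candidates."""
    candidates = {g: extract_candidates(imgs, score) for g, imgs in groups.items()}
    all_cands = [c for cs in candidates.values() for c in cs]

    def total(img):
        # sum of scores against every other candidate; lower is better
        return sum(score(img, other) for other in all_cands if other != img)

    return {g: min(cs, key=total) for g, cs in candidates.items()}
```

With the example scores from FIGS. 9–11, stage 1 yields the six candidates F_01, F_02, M_01, M_04, N_01, N_04, and stage 2 keeps one registered template per distance group.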
- FIG. 12 is a diagram illustrating the configuration of the collation device according to the second embodiment.
- the collation device 50 includes a control unit 50a, a collation processing unit 50b, a template search unit 50e, a message display unit 50f, a storage unit 50g, and a communication unit 50h.
- the control unit 50a comprehensively controls each processing unit.
- the verification processing unit 50b includes a feature data extraction unit 50c and a similarity score calculation unit 50d.
- in the collation processing unit 50b, the feature data extraction unit 50c extracts a feature part from the biological image information to be collated, and the similarity score calculation unit 50d calculates a similarity score between the registered template and the image information from which the feature part was extracted.
- Template search unit 50e searches for a registered template used for verification.
- the search key is, for example, a user ID recorded on the IC card 41 or an account number recorded on the magnetic stripe of a bankbook or cash card.
- the message display unit 50f generates required messages, such as guidance for the operation of holding the palm over the sensor unit included in the verification device 50 and notification of the success or failure of template registration, and displays them on the display.
- the storage unit 50g stores and holds image information and distance information acquired from the sensor unit included in the verification device 50, and work information such as a similarity score.
- the communication unit 50h communicates with the center server 60 via the network 2, and also with required devices (for example, the sensor unit provided in the verification device 50 and the IC card reader/writer provided in the verification device 50).
- FIG. 13 is a flowchart of the matching process according to the second embodiment.
- Step S41 The collation device 50 receives a user ID input by the user, notifies the center server 60 of the received user ID, and acquires the registered template corresponding to the user ID.
- the collation device 50 stores the acquired registered template in the storage unit 50g.
- when the registered template is stored on the IC card 41, the collation device 50 acquires it from the IC card 41.
- when the verification device 50 also serves as a registration device, it acquires the registered template from its own storage unit 50g.
- in the above, the collation device 50 acquires all registered templates at once.
- alternatively, the collation device 50 may acquire only the registered template used for each collation at collation time, without acquiring all registered templates at once.
- the matching device 50 guides the user to hold the palm over the sensor by displaying an appropriate message on the message display unit 50f.
- the collation device 50 acquires distance information (hereinafter referred to as collation target distance information) corresponding to the biological image information used for collation from a sensor unit included in the collation device 50.
- the collation device 50 extracts feature data (feature part) from the image information of the living body to be collated by the feature data extraction unit 50c.
- Step S44 The collation device 50 calculates the similarity between the registered template corresponding to the collation target distance information and the image information from which the feature data was extracted (calculation of the similarity score).
- Step S45 The collation device 50 determines whether the calculation result of the similarity is within a predetermined threshold.
- the collation device 50 proceeds to step S46 if the similarity calculation result is within the predetermined threshold value, and proceeds to step S47 if it is not within the predetermined threshold value.
- the predetermined threshold is a value set in advance for each installation environment of the verification device 50 based on the permissible false rejection rate and false acceptance rate.
- Step S46 The collation device 50 performs the identity confirmation process and ends the collation process.
- in the identity confirmation process, the verification device 50 displays a confirmation message on the message display unit 50f and permits execution of the processes that follow personal authentication.
- Step S47 The collation device 50 calculates the similarity between a registered template that does not correspond to the collation target distance information and the image information from which the feature part was extracted (calculation of the similarity score). Step S48 The collation device 50 determines whether the calculated similarity is within the predetermined threshold; if it is, the collation device 50 proceeds to step S46, and if not, it proceeds to step S49.
- Step S49 The collation device 50 determines whether or not collation with the image information from which the feature portion has been extracted is completed for all registered templates. The collation device 50 proceeds to step S50 when collation is completed for all registered templates, and proceeds to step S47 when collation is not completed for all registered templates.
- Step S50 The collation device 50 performs the identity rejection process and ends the collation process.
- in the identity rejection process, the collation device 50 displays a rejection message on the message display unit 50f and disallows execution of the processes that follow personal authentication.
- FIG. 14 is a diagram illustrating a registered template and a collation image according to the second embodiment.
- the matching device 50 first performs matching (corresponding relationship indicated by a solid line) with a registered template corresponding to the matching target distance information.
- when collation with the registered template corresponding to the collation target distance information fails, the collation device 50 also performs collation with the registered templates that do not correspond to the collation target distance information (correspondence shown with broken lines).
- a long-distance template 71, a medium-distance template 72, and a short-distance template 73 are registered in the registration template 70.
- the long-distance collation image 74 is first collated with the long-distance template 71; if this collation fails, it is collated with the medium-distance template 72 and the short-distance template 73.
- the medium-distance collation image 75 is first collated with the medium-distance template 72; if this collation fails, it is collated with the long-distance template 71 and the short-distance template 73.
- the short-distance collation image 76 is first collated with the short-distance template 73; if this collation fails, it is collated with the long-distance template 71 and the medium-distance template 72.
- this improves the chance that even a user whose hand-holding motion is partly improper can be matched appropriately, enabling smooth authentication.
- note that, when collation with the registered template corresponding to the collation target distance information fails, the collation device 50 may be configured to perform collation with the registered templates that do not correspond to the collation target distance information.
- the verification processing unit 50b may be provided in the center server 60 or the IC card 41 instead of the verification device 50.
- the verification device 50 notifies the center server 60 or the IC card 41 of the user ID and the biological image information (including the corresponding distance information) to be verified, and acquires the verification result.
- FIG. 15 is a diagram illustrating template registration according to the third embodiment. Unlike the second embodiment, the third embodiment divides the actual distance range into unequal parts and performs grouping according to the distance. Note that components common to the second embodiment are described with the same reference numerals.
- the sensor unit 30 includes a distance measuring sensor and photographs a palm located within a predetermined range.
- when the imaging range of the sensor unit 30 is 20 mm to 80 mm, the second embodiment divides this 60 mm distance equally, whereas the third embodiment divides it unequally.
- the threshold values used for grouping are set in advance such as “near” for 20 mm to 30 mm, “medium” for 30 mm to 50 mm, and “far” for 50 mm to 80 mm, for example.
- as illustrated by the hands 91a, 91b, 91c, and 91d, a hand approaching from a distance decelerates as it nears the sensor unit 30.
- because of the unequal division, roughly the same number of palm images can therefore be obtained in each distance group.
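Grouping by preset, unequally spaced thresholds can be sketched as below. The ranges (20–30 mm "near", 30–50 mm "medium", 50–80 mm "far") are the examples from the text; treating each boundary as belonging to the farther group is an assumption, since the source does not say which side a boundary value falls on.

```python
def distance_group(distance_mm):
    """Classify a measured palm distance into one of the third
    embodiment's preset, unequally divided groups.
    Boundary values are assigned to the farther group (assumption)."""
    if 20 <= distance_mm < 30:
        return "near"
    if 30 <= distance_mm < 50:
        return "medium"
    if 50 <= distance_mm <= 80:
        return "far"
    return None  # outside the 20-80 mm imaging range
```

Because the hand spends more time (and thus yields more frames) close to the sensor, making the "near" range narrower than the "far" range tends to balance the image counts per group.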
- FIG. 16 is a diagram illustrating a state of template registration according to the fourth embodiment. Unlike the third embodiment, the fourth embodiment sets a reference value based on a photographed image and performs grouping. Note that the same configuration as that of the third embodiment will be described with the same reference numerals.
- the sensor unit 30 includes a distance measuring sensor and photographs a palm located within a predetermined range.
- even when the imaging range of the sensor unit 30 is 20 mm to 80 mm, a palm image at a height of 80 mm is not always obtained. This is because the palm does not necessarily move from above the sensor unit 30 straight downward; it may approach the sensor unit 30 with a leftward-rightward or forward-backward movement.
- the threshold values used for grouping are set after shooting, such as “near” for 20 to 30 mm, “medium” for 30 to 40 mm, and “far” for 40 to 50 mm.
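Deriving the group boundaries from the distances actually observed during the hand-holding motion, rather than presetting them, could look like the following sketch. The equal split of the observed range is an assumption: the text only says the thresholds are set after shooting, and the 20/30/40/50 mm example is consistent with splitting an observed 20–50 mm range into three equal sub-ranges.

```python
def thresholds_from_capture(distances_mm, n_groups=3):
    """Derive grouping boundaries from the distances actually observed
    during one hand-holding motion, by splitting the observed range
    into n_groups equal sub-ranges (assumption; the source only says
    the thresholds are set after shooting)."""
    lo, hi = min(distances_mm), max(distances_mm)
    step = (hi - lo) / n_groups
    return [lo + i * step for i in range(n_groups + 1)]
```

For instance, if one motion yields distances between 20 mm and 50 mm, the derived boundaries are 20, 30, 40, and 50 mm, matching the "near"/"medium"/"far" example above.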
- FIG. 17 is a diagram illustrating a registered template and a collation image according to the fifth embodiment. Unlike the second embodiment, the fifth embodiment has a plurality of registered templates for each distance. The configuration common to the second embodiment will be described with the same reference numerals.
- the registration template 80 includes a plurality of long-distance templates (long-distance template 82, long-distance template 83, long-distance template 84), a plurality of medium-distance templates (not shown), and a plurality of short-distance templates (not shown).
- the selection means 81 may be provided in the collation device 50, or in a device that stores the registered templates, such as the center server 60.
- the collation device 50 inquires of the selection means 81 for a registered template corresponding to the collation target distance information and performs collation (correspondence relationship indicated by a solid line).
- the selection means 81 selects one registered template as the collation target from among the plurality of registered templates corresponding to the collation target distance information (correspondence indicated by broken lines).
- the long-distance collation image 85 is first collated with any one of the long-distance templates 82, 83, and 84; if this collation fails, it is collated with the remaining long-distance templates.
- the selection means 81 may select the registered template randomly, sequentially, or in an order adapted according to past matching results.
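One way to adapt the selection order to past results is to try the templates with the most past successes first. The success-count bookkeeping below is an assumption for illustration; the source only says the order may be changed according to the matching results.

```python
class TemplateSelector:
    """Orders same-distance registered templates so that templates
    with more past matching successes are tried first."""

    def __init__(self, templates):
        self.templates = list(templates)
        self.successes = {t: 0 for t in self.templates}

    def ordered(self):
        # most successful first; insertion order breaks ties (stable sort)
        return sorted(self.templates, key=lambda t: -self.successes[t])

    def record_success(self, template):
        self.successes[template] += 1
```

A selector like this would sit behind the selection means 81: each successful collation promotes the winning template toward the front of the next collation's trial order.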
- the number of templates registered in the registration template 80 may be set according to various restrictions (such as storage capacity and processing capability) of the authentication system 1.
- the above processing functions can be realized by a computer.
- a program describing the processing contents of the functions that the registration device 20, the center server 60, and the verification device 50 should have is provided.
- the program describing the processing contents can be recorded on a computer-readable recording medium (including a portable recording medium).
- the computer-readable recording medium include a magnetic recording device, an optical disk, a magneto-optical recording medium, and a semiconductor memory.
- the magnetic recording device include a hard disk device (HDD), a flexible disk (FD), and a magnetic tape.
- Optical discs include DVD (Digital Versatile Disc), DVD-RAM, CD-ROM, CD-R (Recordable) / RW (ReWritable), and the like.
- Magneto-optical recording media include MO (Magneto-Optical disk).
- a portable recording medium such as a DVD or CD-ROM in which the program is recorded is sold. It is also possible to store the program in a storage device of a server computer and transfer the program from the server computer to another computer via a network.
- the computer that executes the program stores, for example, the program recorded on the portable recording medium or the program transferred from the server computer in its own storage device. Then, the computer reads the program from its own storage device and executes processing according to the program. The computer can also read the program directly from the portable recording medium and execute processing according to the program. Further, each time the program is transferred from the server computer, the computer can sequentially execute processing according to the received program.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- Medical Informatics (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- General Physics & Mathematics (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Public Health (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Theoretical Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Artificial Intelligence (AREA)
- Physiology (AREA)
- Psychiatry (AREA)
- Signal Processing (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Image Analysis (AREA)
- Collating Specific Patterns (AREA)
Abstract
Description
A plurality of pieces of image information are acquired from a sensor unit capable of outputting captured biological image information together with the distance information at the time of capture; the distance information corresponding to each piece of image information is acquired from the sensor unit; the pieces of image information are grouped according to their corresponding distance information; the image information within each distance group is mutually evaluated to extract template-candidate image information; and the extracted image information is mutually evaluated to extract the image information that becomes the template.
The above and other objects, features, and advantages of the present invention will become apparent from the following description taken in conjunction with the accompanying drawings, which illustrate preferred embodiments of the invention by way of example.
[First Embodiment]
First, the registration device of the first embodiment will be described with reference to FIG. 1. FIG. 1 is a diagram illustrating the configuration of the registration device of the first embodiment.
[Second Embodiment]
FIG. 2 is a diagram illustrating the configuration of the authentication system of the second embodiment. As the second embodiment, a system in which the authentication system 1 performs authentication using palm veins is illustrated; however, the embodiment is not limited to this and is also applicable to systems that perform authentication using other feature detection parts of a living body.
The control unit 30a comprehensively controls each processing unit. The imaging unit 30b acquires image information from the living body serving as the subject. The imaging unit 30b includes an image sensor for photographing the living body (for example, a CMOS (Complementary Metal Oxide Semiconductor) sensor or a CCD (Charge Coupled Device) sensor), a condenser lens, and a plurality of near-infrared light emitting elements (LEDs: Light Emitting Diodes) that irradiate the subject. The near-infrared light emitting elements are provided, for example, around the image sensor; they emit near-infrared light upward toward the subject, and the image sensor photographs the subject irradiated with the near-infrared light. The imaging unit 30b can photograph the subject continuously, for example at 15 frames per second. The shooting speed may be made changeable by setting. Alternatively, the shooting timing may be determined not by time but by the distance to the subject, based on the output of the distance measuring unit 30c. Note that the imaging unit 30b is configured to be suitable for photographing palm veins; when photographing another living body part, such as an iris, a configuration suited to that subject may be adopted.
The registration device 20 includes a processing device 21, a display 22, a keyboard 23, a mouse 24, a sensor unit 30, and an IC card reader/writer 40.
With the hardware configuration described above, the processing functions of the present embodiment can be realized. The verification device 50 and the center server 60 can also be realized with a similar hardware configuration.
The template registration process acquires imaging information (captured images) and distance information from the sensor unit 30, and generates and registers a template.
[Step S13] The processing device 21 waits to receive image information and the distance information corresponding to the image information from the sensor unit 30, and proceeds to step S14 when it receives them. The image information received by the processing device 21 here is image information acquired through one hand-holding motion by the person to be registered, and amounts to, for example, about 15 biological images on average.
[Step S19] The processing device 21 extracts, for each distance (each group grouped by distance range), one collation image with the highest evaluation result. Accordingly, when there are three groups, the processing device 21 extracts three images.
The imaging process receives an imaging instruction from the processing device 21, photographs the living body, and outputs the imaging information and distance information to the processing device 21.
[Step S33] The sensor unit 30 associates the distance information output by the distance measuring unit 30c with the image information. The association of the distance information with the image information is performed, for example, by writing the distance information into the header of the image information.
[Step S35] The sensor unit 30 determines whether there is a living body in the imaging range based on the output of the distance measuring unit 30c. If there is no living body in the imaging range, the sensor unit 30 proceeds to step S36; if there is, it proceeds to step S32. In this way, the sensor unit 30 continuously photographs a living body within the predetermined range. The sensor unit 30 may also determine the end of shooting by shooting for a predetermined time, by detecting that the living body is stationary, or the like.
Next, the mutual evaluation of image information will be described with reference to FIGS. 9 to 11. FIG. 9 is a table showing the similarity scores and similarity grouping of the long-distance images according to the second embodiment. FIG. 10 is a table showing the similarity scores and similarity grouping of the medium-distance images according to the second embodiment. FIG. 11 is a table showing the similarity scores and similarity grouping of the short-distance images according to the second embodiment.
The image information for each distance group is extracted as the registered template.
The control unit 50a comprehensively controls each processing unit. The collation processing unit 50b includes a feature data extraction unit 50c and a similarity score calculation unit 50d. In the collation processing unit 50b, the feature data extraction unit 50c extracts a feature part from the biological image information to be collated, and the similarity score calculation unit 50d calculates a similarity score between the registered template and the image information from which the feature part was extracted.
[Step S41] The collation device 50 receives a user ID input by the user, notifies the center server 60 of the received user ID, and acquires the registered template corresponding to the user ID. The collation device 50 stores the acquired registered template in the storage unit 50g.
[Step S44] The collation device 50 calculates the similarity between the registered template corresponding to the collation target distance information and the image information from which the feature data was extracted (calculation of the similarity score).
[Step S48] The collation device 50 determines whether the calculated similarity is within the predetermined threshold. If it is, the collation device 50 proceeds to step S46; if not, it proceeds to step S49.
The collation device 50 first performs collation with the registered template corresponding to the collation target distance information (correspondence indicated by solid lines). Then, when this collation fails, the collation device 50 also performs collation with the registered templates that do not correspond to the collation target distance information (correspondence indicated by broken lines).
The sensor unit 30 includes a distance measuring sensor and photographs a palm located within a predetermined distance range. When the imaging range of the sensor unit 30 is 20 mm to 80 mm, the second embodiment divides this 60 mm distance equally, whereas the third embodiment divides this 60 mm distance unequally.
Unlike the third embodiment, the fourth embodiment sets reference values based on the captured images and performs the grouping accordingly. Components common to the third embodiment are described with the same reference numerals.
The sensor unit 30 includes a distance measuring sensor and photographs a palm located within a predetermined distance range. Even when the imaging range of the sensor unit 30 is 20 mm to 80 mm, a palm image at a height of 80 mm is not always obtained. This is because the palm does not necessarily move from above the sensor unit 30 straight downward; it may approach the sensor unit 30 with a leftward-rightward or forward-backward movement.
The registered template 80 includes a plurality of long-distance templates (long-distance templates 82, 83, and 84), a plurality of medium-distance templates (not shown), and a plurality of short-distance templates (not shown). The selection means 81 may be provided in the collation device 50, or in a device that stores the registered templates, such as the center server 60.
The number of templates registered in the registered template 80 may be made settable according to the various constraints of the authentication system 1 (such as storage capacity and processing capability).
Furthermore, the embodiments described above admit of many modifications and changes by those skilled in the art, and are not limited to the exact configurations and applications described.
2 Network
10 Registration device
10a Image information acquisition means
10b Distance information acquisition means
10c Template candidate extraction means
10d Template extraction means
20 Registration device
20a Control unit
20b Image grouping processing unit
20c Collation processing unit
20d Candidate image extraction unit
20e Template image extraction unit
20f Template creation unit
20g Template registration unit
20h Message display unit
20i Storage unit
20j Communication unit
21 Processing device
30 Sensor unit
30a Control unit
30b Imaging unit
30c Distance measuring unit
30d Storage unit
30e Communication unit
40 IC card reader/writer
41 IC card
50 Collation device
50a Control unit
50b Collation processing unit
50c Feature data extraction unit
50d Similarity score calculation unit
50e Template search unit
50f Message display unit
50g Storage unit
50h Communication unit
60 Center server
Claims (10)
- A registration program for causing a computer to execute processing for registering a template used for biometric authentication, the registration program causing the computer to execute processing comprising:
acquiring a plurality of pieces of image information from a sensor unit capable of outputting captured biological image information and distance information at the time of capture;
acquiring, from the sensor unit, the distance information corresponding to each piece of the image information;
grouping the plurality of pieces of image information according to the distance information corresponding to each piece of the image information, and mutually evaluating the image information within each group according to the distance information to extract image information serving as template candidates; and
mutually evaluating the extracted image information to extract image information serving as a template.
- The registration program according to claim 1, wherein, when acquiring the plurality of pieces of image information from the sensor unit, the plurality of pieces of image information generated by the sensor unit from one registration motion are acquired by continuously photographing the living body approaching the sensor unit.
- The registration program according to claim 2, wherein, when acquiring the plurality of pieces of image information from the sensor unit, the plurality of pieces of image information generated by the sensor unit from a plurality of registration motions are acquired.
- The registration program according to claim 1, wherein the grouping according to the distance information is performed based on a preset distance reference between the sensor unit and the living body.
- The registration program according to claim 1, wherein the grouping according to the distance information is performed based on a distance reference generated from the distance information associated with the plurality of pieces of image information generated by the sensor unit.
- The registration program according to claim 1, wherein the grouping according to the distance information is performed based on the number of the plurality of pieces of image information generated by the sensor unit.
- The registration program according to any one of claims 1 to 6, wherein, when extracting the image information serving as template candidates, a plurality of pieces of image information ranked high in the mutual evaluation are extracted for each group according to the distance information.
- The registration program according to any one of claims 1 to 6, wherein, when extracting the image information serving as template candidates, the image information is extracted for each different evaluation band of the mutual evaluation, for each group according to the distance information.
- A registration device for registering a template used for biometric authentication, comprising:
image information acquisition means for acquiring a plurality of pieces of image information from a sensor unit capable of outputting captured biological image information and distance information at the time of capture;
distance information acquisition means for acquiring, from the sensor unit, the distance information corresponding to each piece of the image information;
template candidate extraction means for grouping the plurality of pieces of image information according to the distance information corresponding to each piece of the image information, and mutually evaluating the image information within each group according to the distance information to extract image information serving as template candidates; and
template extraction means for mutually evaluating the extracted image information to extract image information serving as a template.
- A method of registering a template used for biometric authentication, comprising:
acquiring a plurality of pieces of image information from a sensor unit capable of outputting captured biological image information and distance information at the time of capture;
acquiring, from the sensor unit, the distance information corresponding to each piece of the image information;
grouping the plurality of pieces of image information according to the distance information corresponding to each piece of the image information, and mutually evaluating the image information within each group according to the distance information to extract image information serving as template candidates; and
mutually evaluating the extracted image information to extract image information serving as a template.
Priority Applications (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2012536077A JP5509335B2 (ja) | 2010-09-30 | 2010-09-30 | 登録プログラム、登録装置、および登録方法 |
| PCT/JP2010/067044 WO2012042631A1 (ja) | 2010-09-30 | 2010-09-30 | 登録プログラム、登録装置、および登録方法 |
| KR1020137001320A KR101384446B1 (ko) | 2010-09-30 | 2010-09-30 | 컴퓨터 판독가능한 기록 매체, 등록 장치, 및 등록 방법 |
| EP10857844.4A EP2624205A1 (en) | 2010-09-30 | 2010-09-30 | Registration program, registration device, and registration method |
| CN2010800681281A CN103003840A (zh) | 2010-09-30 | 2010-09-30 | 登记程序、登记装置以及登记方法 |
| US13/728,685 US20130114863A1 (en) | 2010-09-30 | 2012-12-27 | Registration program, registration apparatus, and method of registration |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2010/067044 WO2012042631A1 (ja) | 2010-09-30 | 2010-09-30 | 登録プログラム、登録装置、および登録方法 |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/728,685 Continuation US20130114863A1 (en) | 2010-09-30 | 2012-12-27 | Registration program, registration apparatus, and method of registration |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2012042631A1 true WO2012042631A1 (ja) | 2012-04-05 |
Family
ID=45892135
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2010/067044 Ceased WO2012042631A1 (ja) | 2010-09-30 | 2010-09-30 | 登録プログラム、登録装置、および登録方法 |
Country Status (6)
| Country | Link |
|---|---|
| US (1) | US20130114863A1 (ja) |
| EP (1) | EP2624205A1 (ja) |
| JP (1) | JP5509335B2 (ja) |
| KR (1) | KR101384446B1 (ja) |
| CN (1) | CN103003840A (ja) |
| WO (1) | WO2012042631A1 (ja) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2014085708A (ja) * | 2012-10-19 | 2014-05-12 | Fujitsu Ltd | 画像処理装置、画像処理方法および画像処理プログラム |
| JP2015524968A (ja) * | 2012-07-18 | 2015-08-27 | ジェムアルト エスアー | 非接触チップカードのユーザを認証する方法 |
Families Citing this family (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP2546773B1 (en) * | 2010-03-08 | 2020-10-28 | Fujitsu Limited | Biometric authentication device, biometric authentication program and method |
| EA036943B1 (ru) | 2011-11-07 | 2021-01-19 | Дали Системз Ко., Лтд. | Мягкая передача обслуживания и маршрутизация данных в виртуализованной распределенной антенной системе |
| CH707230B1 (de) * | 2012-11-20 | 2016-02-29 | Frank Türen Ag | Türsystem mit berührungsloser Zutrittskontrolle und berührungsloser Türbedienung. |
| US20170250927A1 (en) | 2013-12-23 | 2017-08-31 | Dali Systems Co. Ltd. | Virtual radio access network using software-defined network of remotes and digital multiplexing switches |
| MY189687A (en) | 2022-02-26 | Dali Systems Co Ltd | Digital multiplexer in a distributed antenna system |
| JP6962458B2 (ja) * | 2018-04-24 | 2021-11-05 | 三菱電機株式会社 | 認証装置 |
| US11275820B2 (en) * | 2019-03-08 | 2022-03-15 | Master Lock Company Llc | Locking device biometric access |
| CN111582228B (zh) * | 2020-05-20 | 2024-10-22 | 深圳前海微众银行股份有限公司 | 活体掌纹的识别方法、装置、设备及存储介质 |
| CN114049687A (zh) * | 2021-10-25 | 2022-02-15 | 珠海格力电器股份有限公司 | 基于人脸识别的活体验证方法、装置、设备及储存介质 |
| CN115512427B (zh) * | 2022-11-04 | 2023-04-25 | 北京城建设计发展集团股份有限公司 | 一种结合配合式活检的用户人脸注册方法与系统 |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2000055811A1 (en) * | 1999-03-12 | 2000-09-21 | Sony Corporation | Data processor, data processing method, and recorded medium |
| JP2006006753A (ja) | 2004-06-28 | 2006-01-12 | Fujitsu Ltd | 生体認証システムの登録方法、生体認証システム及びそのプログラム |
| JP2006099614A (ja) * | 2004-09-30 | 2006-04-13 | Toshiba Corp | 生体判別装置および生体判別方法 |
| JP2008217358A (ja) * | 2007-03-02 | 2008-09-18 | Ricoh Co Ltd | 生体認証装置および生体認証装置を用いた認証方法 |
| JP2008243093A (ja) * | 2007-03-29 | 2008-10-09 | Toshiba Corp | 辞書データの登録装置及び辞書データの登録方法 |
| JP2008250601A (ja) * | 2007-03-30 | 2008-10-16 | Hitachi Omron Terminal Solutions Corp | 生体情報読取装置および生体情報読取システム |
| WO2010086993A1 (ja) * | 2009-01-30 | 2010-08-05 | 富士通フロンテック株式会社 | 認証装置、撮像装置、認証方法および認証プログラム |
Family Cites Families (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE60119418T2 (de) * | 2000-03-22 | 2007-05-24 | Kabushiki Kaisha Toshiba, Kawasaki | Gesichtsbildaufnehmendes Erkennungsgerät und Passüberprüfungsgerät |
| US10242255B2 (en) * | 2002-02-15 | 2019-03-26 | Microsoft Technology Licensing, Llc | Gesture recognition system using depth perceptive sensors |
| US7308112B2 (en) * | 2004-05-14 | 2007-12-11 | Honda Motor Co., Ltd. | Sign based human-machine interaction |
| WO2006059481A1 (ja) * | 2004-11-30 | 2006-06-08 | Matsushita Electric Industrial Co., Ltd. | プリントシステム |
| JP2007052720A (ja) * | 2005-08-19 | 2007-03-01 | Fujitsu Ltd | 生体認証による情報アクセス方法及び生体認証による情報処理システム |
| JP2008102780A (ja) * | 2006-10-19 | 2008-05-01 | Sony Corp | パターン識別方法、登録装置、照合装置及びプログラム |
| JP4577580B2 (ja) * | 2007-04-10 | 2010-11-10 | ソニー株式会社 | 位置合わせ方法、位置合わせ装置及びプログラム |
| DE102007046579B3 (de) * | 2007-09-27 | 2009-01-29 | Siemens Ag | Verfahren zur Detektion von Bewegungen und Korrektur von Bewegungen in tomographischen und projektiven Aufnahmeserien und Tomographie- beziehungsweise Projektionssystem zur Durchführung dieses Verfahrens |
| EP2618290A3 (en) * | 2008-04-02 | 2014-08-06 | Google, Inc. | Method and apparatus to incorporate automatic face recognition in digital image collections |
| US8406491B2 (en) * | 2008-05-08 | 2013-03-26 | Ut-Battelle, Llc | Image registration method for medical image sequences |
| WO2010123043A1 (ja) * | 2009-04-21 | 2010-10-28 | 日本電気株式会社 | 癌の評価方法 |
| JP2011243093A (ja) | 2010-05-20 | 2011-12-01 | Canon Inc | 情報処理装置、ユーザ認証方法、及びコンピュータプログラム |
| US8285074B2 (en) * | 2010-09-01 | 2012-10-09 | Palo Alto Research Center Incorporated | Finding low variance regions in document images for generating image anchor templates for content anchoring, data extraction, and document classification |
| US8867804B2 (en) * | 2010-11-08 | 2014-10-21 | Cranial Technologies, Inc. | Method and apparatus for automatically generating trim lines for cranial remodeling devices |
-
2010
- 2010-09-30 JP JP2012536077A patent/JP5509335B2/ja active Active
- 2010-09-30 KR KR1020137001320A patent/KR101384446B1/ko active Active
- 2010-09-30 WO PCT/JP2010/067044 patent/WO2012042631A1/ja not_active Ceased
- 2010-09-30 EP EP10857844.4A patent/EP2624205A1/en not_active Withdrawn
- 2010-09-30 CN CN2010800681281A patent/CN103003840A/zh active Pending
-
2012
- 2012-12-27 US US13/728,685 patent/US20130114863A1/en not_active Abandoned
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2000055811A1 (en) * | 1999-03-12 | 2000-09-21 | Sony Corporation | Data processor, data processing method, and recorded medium |
| JP2006006753A (ja) | 2004-06-28 | 2006-01-12 | Fujitsu Ltd | Registration method for biometric authentication system, biometric authentication system, and program therefor |
| JP2006099614A (ja) * | 2004-09-30 | 2006-04-13 | Toshiba Corp | Living body determination apparatus and living body determination method |
| JP2008217358A (ja) * | 2007-03-02 | 2008-09-18 | Ricoh Co Ltd | Biometric authentication device and authentication method using the biometric authentication device |
| JP2008243093A (ja) * | 2007-03-29 | 2008-10-09 | Toshiba Corp | Dictionary data registration device and dictionary data registration method |
| JP2008250601A (ja) * | 2007-03-30 | 2008-10-16 | Hitachi Omron Terminal Solutions Corp | Biometric information reading device and biometric information reading system |
| WO2010086993A1 (ja) * | 2009-01-30 | 2010-08-05 | Fujitsu Frontech Limited | Authentication device, imaging device, authentication method, and authentication program |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2015524968A (ja) * | 2012-07-18 | 2015-08-27 | Gemalto SA | Method for authenticating a user of a contactless chip card |
| JP2014085708A (ja) * | 2012-10-19 | 2014-05-12 | Fujitsu Ltd | Image processing apparatus, image processing method, and image processing program |
Also Published As
| Publication number | Publication date |
|---|---|
| CN103003840A (zh) | 2013-03-27 |
| EP2624205A1 (en) | 2013-08-07 |
| JPWO2012042631A1 (ja) | 2014-02-03 |
| KR20130018997A (ko) | 2013-02-25 |
| US20130114863A1 (en) | 2013-05-09 |
| JP5509335B2 (ja) | 2014-06-04 |
| KR101384446B1 (ko) | 2014-04-10 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP5509335B2 (ja) | | Registration program, registration device, and registration method |
| US12223760B2 (en) | | Systems and methods for performing fingerprint based user authentication using imagery captured using mobile devices |
| JP5671607B2 (ja) | | Biometric authentication device, biometric authentication system, and biometric authentication method |
| JP5622928B2 (ja) | | Verification device, verification program, and verification method |
| KR100769101B1 (ko) | | Personal authentication system and method using biometric information, and computer-readable recording medium storing the program |
| CN101145199B (zh) | | Living body guidance control method for a biometric authentication device, and biometric authentication device |
| US9361507B1 (en) | | Systems and methods for performing fingerprint based user authentication using imagery captured using mobile devices |
| JP5503725B2 (ja) | | Authentication device, authentication program, and authentication method |
| EP2629240A2 (en) | | Verification object specifying apparatus, verification object specifying program, and verification object specifying method |
| WO2012111664A1 (ja) | | Authentication device, authentication program, and authentication method |
| JP2019117579A (ja) | | Biometric authentication system |
| JP2013137590A (ja) | | Authentication device, authentication program, and authentication method |
| JP5685272B2 (ja) | | Authentication device, authentication program, and authentication method |
| WO2022091325A1 (ja) | | Authentication method, control method, information processing device, and authentication program |
| JP5655155B2 (ja) | | Information processing apparatus, information processing method, and information processing program |
| Akinnuwesi Boluwaji et al. | | AUTOMATED STUDENTS'ATTEDANCE TAKING IN TERTIARY INSTITUTION USING HYBRIDIZED FACIAL RECOGNITION ALGORITHM |
| JP2013148988A (ja) | | Biometric information processing device, biometric information processing program, and biometric information processing method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10857844; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 2012536077; Country of ref document: JP |
| | ENP | Entry into the national phase | Ref document number: 20137001320; Country of ref document: KR; Kind code of ref document: A |
| | WWE | Wipo information: entry into national phase | Ref document number: 2010857844; Country of ref document: EP |
| | NENP | Non-entry into the national phase | Ref country code: DE |