WO2021199234A1 - Authentication control device, authentication system, authentication control method, and recording medium - Google Patents
- Publication number
- WO2021199234A1 (PCT application PCT/JP2020/014738)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- authentication
- projection
- face
- area
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/60—Static or dynamic means for assisting the user to position a body part for biometric acquisition
- G06V40/67—Static or dynamic means for assisting the user to position a body part for biometric acquisition by interactive indications to the user
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/166—Detection; Localisation; Normalisation using acquisition arrangements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/60—Protecting data
- G06F21/62—Protecting access to data via a platform, e.g. using keys or access control rules
- G06F21/6218—Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
- G06F21/6245—Protecting personal data, e.g. for financial or medical purposes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/62—Extraction of image or video features relating to a temporal dimension, e.g. time-based feature extraction; Pattern tracking
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- G06V40/171—Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
- G06V40/25—Recognition of walking or running movements, e.g. gait recognition
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/50—Maintenance of biometric data or enrolment thereof
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/01—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
- G08B25/04—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium using a single signalling line, e.g. in a closed loop
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
Definitions
- This disclosure relates to an authentication control device, an authentication system, an authentication control method, and a recording medium.
- Patent Document 1 describes a walk-through type authentication system that extracts a face area image from an image of a walking person and authenticates based on the face area image and the registered face image.
- Patent Document 1
- In view of the above problem, an object of the present disclosure is to provide an authentication control device, an authentication system, an authentication control method, and a recording medium capable of notifying an authentication result in association with the person to be authenticated.
- An authentication control device according to one aspect of the present disclosure includes: an image acquisition means for acquiring an image, captured by a camera that photographs a surveillance area, including a person to be authenticated walking in the surveillance area; an authentication control means for causing an authentication device that executes face authentication to perform face authentication of the person to be authenticated included in the image acquired by the image acquisition means; a projection area specifying means for specifying a projection area onto which the result of the face authentication is projected; and a projection control means for causing a projection device to project the result of the face authentication onto the projection area specified by the projection area specifying means.
- An authentication control device according to another aspect of the present disclosure includes: an image acquisition means for acquiring an image, captured by a camera that photographs a surveillance area, including a person to be authenticated walking toward a projection zone in the surveillance area; an authentication control means for causing an authentication device that executes face authentication to perform face authentication of the person to be authenticated included in the image acquired by the image acquisition means; and a projection control means for causing a projection device to project the result of the face authentication onto the projection zone.
- An authentication system according to one aspect of the present disclosure includes: a camera that photographs a surveillance area; a projection device; an authentication device that performs face authentication; an image acquisition means for acquiring an image, captured by the camera, including a person to be authenticated walking in the surveillance area; an authentication control means for causing the authentication device to execute face authentication of the person to be authenticated included in the image acquired by the image acquisition means; a projection area specifying means for specifying a projection area onto which the result of the face authentication is projected; and a projection control means for causing the projection device to project the result of the face authentication onto the projection area specified by the projection area specifying means.
- An authentication system according to another aspect of the present disclosure includes: a camera that photographs the surveillance area; a projection device; an authentication device that performs face authentication; an image acquisition means for acquiring an image, captured by the camera, including a person to be authenticated walking toward a projection zone in the surveillance area; an authentication control means for causing the authentication device to execute face authentication of the person to be authenticated included in the image acquired by the image acquisition means; and a projection control means for causing the projection device to project the result of the face authentication onto the projection zone.
- An authentication control method according to one aspect of the present disclosure includes: an image acquisition step of acquiring an image, captured by a camera that photographs a surveillance area, including a person to be authenticated walking in the surveillance area; an authentication control step of causing an authentication device that executes face authentication to perform face authentication of the person to be authenticated included in the image acquired in the image acquisition step; a projection area specifying step of specifying a projection area onto which the result of the face authentication is projected; and a projection control step of causing a projection device to project the result of the face authentication onto the projection area specified in the projection area specifying step.
- An authentication control method according to another aspect of the present disclosure includes: an image acquisition step of acquiring an image, captured by a camera that photographs a surveillance area, including a person to be authenticated walking toward a projection zone in the surveillance area; an authentication control step of causing an authentication device that executes face authentication to perform face authentication of the person to be authenticated included in the image acquired in the image acquisition step; and a projection control step of causing a projection device to project the result of the face authentication onto the projection zone.
- A recording medium according to one aspect of the present disclosure is a computer-readable recording medium that records a program for causing an electronic device including at least one processor to execute: an image acquisition process of acquiring an image, captured by a camera that photographs a surveillance area, including a person to be authenticated walking in the surveillance area; an authentication control process of causing an authentication device that executes face authentication to perform face authentication of the person to be authenticated included in the image acquired in the image acquisition process; a projection area specifying process of specifying a projection area onto which the result of the face authentication is projected; and a projection control process of causing a projection device to project the result of the face authentication onto the projection area specified in the projection area specifying process.
- A recording medium according to another aspect of the present disclosure is a computer-readable recording medium that records a program for causing an electronic device including at least one processor to execute: an image acquisition process of acquiring an image, captured by a camera that photographs a surveillance area, including a person to be authenticated walking toward a projection zone in the surveillance area; an authentication control process of causing an authentication device that executes face authentication to perform face authentication of the person to be authenticated included in the image acquired in the image acquisition process; and a projection control process of causing the projection device to project the result of the face authentication onto the projection zone.
- According to the present disclosure, it is possible to provide an authentication control device, an authentication system, an authentication control method, and a recording medium capable of notifying an authentication result in association with the person to be authenticated.
- Embodiment 4: a flowchart of an example of the face authentication process (S70); a flowchart of an example of the authentication target person tracking process (S72); a flowchart of an example of the projection process (step S74); an operation flow (outline) of Embodiment 4; a diagram showing an example in which the face authentication results M1 and M2 of the authentication target persons U1 and U2 walking in the monitoring area are projected; an example in which an avatar image G (privacy protection information) is projected instead of the projection information (for example, "○") indicating that face authentication was successful; and a schematic block diagram of the authentication control device 20 (a modification).
- FIG. 1 is a schematic configuration diagram of the authentication control device 20.
- The authentication control device 20 includes: an image acquisition means 22a that acquires an image, captured by the camera 30 that photographs a surveillance area, including a person to be authenticated walking in the surveillance area; an authentication control means 22d that causes the authentication device 10, which executes face authentication, to perform face authentication of the person to be authenticated included in the image acquired by the image acquisition means 22a; a projection area specifying means 22f that specifies a projection area onto which the result of the face authentication is projected; and a projection control means 22g that causes the projection device 40 to project the result of the face authentication onto the projection area specified by the projection area specifying means 22f.
- FIG. 2 is a flowchart of an example of the operation of the authentication control device 20.
- the image acquisition means 22a acquires a surveillance image including an authentication target person walking in the surveillance area photographed by the camera 30 that photographs the surveillance area (step S1).
- the authentication control means 22d causes the authentication device 10 that executes face recognition to execute face recognition of the person to be authenticated included in the monitoring image acquired by the image acquisition means 22a (step S2).
- the projection area specifying means 22f specifies a projection area on which the result of face recognition is projected (step S3).
- the projection control means 22g causes the projection device 40 to project the result of face recognition onto the projection area specified by the projection area identification means 22f (step S4).
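- As an illustration of steps S1 to S4 above, the following Python sketch shows one possible shape of the control loop. The class and function names (Camera, AuthDevice, Projector, projection_area_ahead) are hypothetical stand-ins, not interfaces defined in this disclosure.

```python
# Minimal sketch of steps S1-S4 of the authentication control device 20.
# The classes below are in-memory stand-ins for the camera 30, the
# authentication device 10, and the projection device 40.

class Camera:
    def capture_frame(self):
        return {"frame_id": 1, "person_bbox": (100, 200, 80, 160)}  # toy surveillance image

class AuthDevice:
    def authenticate(self, image):
        return {"success": True, "user_id": "user-001"}  # pretend the face matched

class Projector:
    def project(self, info, area):
        print(f"projecting {info!r} onto area {area}")

def projection_area_ahead(person_bbox, distance=120):
    # Step S3 (simplified): pick a floor region a fixed distance ahead of the person.
    x, y, w, h = person_bbox
    return (x, y + h + distance, w, 60)

def control_step(camera, auth_device, projector):
    image = camera.capture_frame()                      # step S1: acquire surveillance image
    result = auth_device.authenticate(image)            # step S2: delegate face authentication
    area = projection_area_ahead(image["person_bbox"])  # step S3: specify projection area
    info = "O" if result["success"] else "X"            # success / failure mark
    projector.project(info, area)                       # step S4: project the result

control_step(Camera(), AuthDevice(), Projector())
```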
- the authentication result can be projected (notified) in association with the authentication target person.
- the monitoring burden on the guards is reduced.
- In the embodiments below, an image acquisition unit is used as the image acquisition means 22a, an authentication control unit is used as the authentication control means 22d, a projection area specifying unit is used as the projection area specifying means 22f, and a projection control unit is used as the projection control means 22g.
- FIG. 3 is a block diagram showing the configuration of the authentication system 1 according to the second embodiment.
- the authentication system 1 includes an authentication device 10, an authentication control device 20, a camera 30, and a projection device 40 that can communicate with each other via a network NW (for example, the Internet).
- a part or all of the authentication control device 20, the camera 30, and the projection device 40 may be integrated.
- FIG. 4 is an operation flow (outline) of the second embodiment.
- FIG. 5 is a diagram showing an example of the authentication target person U1 walking in the monitoring area.
- FIG. 6 is a diagram showing an example in which the result M1 of the face authentication of the authentication target person U1 walking in the monitoring area is projected.
- In step S10, the face authentication process of the authentication target person U1 (see FIG. 5) walking in the monitoring area is executed.
- Next, the projection area is specified, and the projection process of projecting the face authentication result M1 (see FIG. 6) onto the specified projection area is executed (step S11). The same processing is executed even when there are a plurality of authentication target persons walking in the monitoring area.
- FIG. 7 is a schematic configuration diagram of the authentication device 10.
- the authentication device 10 includes a storage unit 11, a control unit 12, a memory 13, and a communication unit 14.
- the storage unit 11 is, for example, a non-volatile storage unit such as a hard disk device or a ROM.
- the program 11a and the face information DB 11b are stored in the storage unit 11.
- Program 11a is a program executed by the control unit 12 (processor).
- In the face information DB 11b, user IDs and the facial feature information of the users (persons to be authenticated) are stored (registered) in association with each other.
- In response to a face authentication request received from the outside (for example, from the authentication control device 20), the authentication device 10 collates the face image or facial feature information included in the request with the facial feature information of each person to be authenticated, and returns the collation result to the requester.
- The control unit 12 includes a processor.
- the processor is, for example, a CPU (Central Processing Unit). There may be one processor or multiple processors.
- The processor functions as an image acquisition unit 12a, a face detection unit 12b, a feature point extraction unit 12c, a registration unit 12d, and an authentication unit 12e. Some or all of these may be implemented in hardware.
- the image acquisition unit 12a acquires an image including the face of the person to be authenticated.
- the image acquisition unit 12a acquires an image received by the communication unit 14.
- the images received by the communication unit 14 include a registration image transmitted from a user terminal (not shown) and an authentication (verification) image transmitted from the authentication control device 20.
- the face detection unit 12b detects a face region from the image acquired by the image acquisition unit 12a and outputs it to the feature point extraction unit 12c.
- the feature point extraction unit 12c extracts feature points (for example, facial feature points such as eyes, nose, and mouth edge) from the face region detected by the face detection unit 12b.
- When the image acquired by the image acquisition unit 12a is an image for registration, the feature point extraction unit 12c outputs the facial feature information to the registration unit 12d.
- the face feature information is a set of extracted feature points.
- When the image is an image for authentication, the feature point extraction unit 12c outputs the facial feature information to the authentication unit 12e.
- the registration unit 12d newly issues a user ID when registering facial feature information.
- the registration unit 12d registers the issued user ID and the face feature information extracted from the image for registration in the face information DB 11b in association with each other.
- the authentication unit 12e collates the face feature information extracted from the face area detected from the authentication image with the face feature information in the face information DB 11b.
- the authentication unit 12e returns to the authentication control device 20 whether or not the facial feature information matches.
- the presence or absence of matching of facial feature information corresponds to the success or failure of authentication.
- the communication unit 14 is a communication device that communicates with the authentication control device 20 via the network NW.
- FIG. 8 is a flowchart of an example of the operation of the authentication device 10 (face information registration process).
- the authentication device 10 acquires an image (image for registration) including the face of the authentication target person included in the face information registration request (step S10).
- the authentication device 10 receives the face information registration request from the user terminal (not shown) via the network NW.
- the authentication device 10 (face detection unit 12b) detects the face area from the registration image acquired in step S10 (step S11).
- the authentication device 10 (feature point extraction unit 12c) extracts facial feature points from the face region detected in step S11 (step S12), and outputs face feature information to the registration unit 12d.
- the authentication device 10 (registration unit 12d) issues a user ID, associates the user ID with the face feature information, and registers the user ID in the face information DB 11b (step S13).
- the authentication device 10 may receive face feature information from a face authentication terminal or the like and register it in the face information DB 11b in association with the user ID.
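- A minimal sketch of the registration flow (steps S10 to S13) follows, assuming the face information DB 11b can be modeled as a simple mapping from user IDs to facial feature information; the feature extraction below is a placeholder, not the actual method of the face detection unit 12b or the feature point extraction unit 12c.

```python
# Sketch of the registration flow: issue a user ID and store it together with
# facial feature information in a stand-in for the face information DB 11b.

import uuid

face_info_db = {}  # stand-in for the face information DB 11b: user_id -> features

def extract_face_features(image_bytes: bytes) -> list:
    # Placeholder for face detection (step S11) and feature point extraction (step S12).
    return [float(b) for b in image_bytes[:8]]

def register_face(image_bytes: bytes) -> str:
    features = extract_face_features(image_bytes)
    user_id = str(uuid.uuid4())         # step S13: issue a new user ID
    face_info_db[user_id] = features    # step S13: register the ID and features together
    return user_id

new_id = register_face(b"example-registration-image")
print(new_id, face_info_db[new_id])
```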
- FIG. 9 is a flowchart of an example of the operation (face recognition processing) of the authentication device 10.
- the authentication device 10 acquires an image (image for authentication) including the face of the authentication target person included in the face authentication request (step S20).
- the authentication device 10 receives the face recognition request from the authentication control device 20 via the network NW.
- The authentication device 10 (face detection unit 12b) detects the face region from the authentication image acquired in step S20 (step S21).
- the feature point extraction unit 12c extracts facial feature points from the face region detected in step S21 (step S22).
- the authentication device 10 may receive face feature information from the authentication control device 20.
- the authentication device 10 (authentication unit 12e) collates the acquired face feature information with the face information DB 11b (step S23).
- When the facial feature information matches (step S24: YES), the authentication unit 12e identifies the user ID of the person to be authenticated whose facial feature information matches (step S25), and returns to the authentication control device 20 an indication that the face authentication was successful (step S26).
- When there is no match (step S24: NO), the authentication unit 12e returns to the authentication control device 20 an indication that the face authentication has failed (step S27).
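- The following sketch illustrates one way the collation of steps S23 to S27 could be realized; the similarity measure and threshold are illustrative assumptions, since the disclosure only specifies that matching facial feature information corresponds to successful authentication.

```python
# Sketch of the authentication-side matching: collate the features extracted
# from the authentication image with every entry in the face information DB 11b.

def collate(query, face_info_db, threshold=0.9):
    def similarity(a, b):
        # Toy similarity: 1 / (1 + mean absolute difference).
        diffs = [abs(x - y) for x, y in zip(a, b)]
        return 1.0 / (1.0 + sum(diffs) / max(len(diffs), 1))

    best_id, best_score = None, 0.0
    for user_id, features in face_info_db.items():       # step S23: collate with the DB
        score = similarity(query, features)
        if score > best_score:
            best_id, best_score = user_id, score

    if best_id is not None and best_score >= threshold:   # step S24: match found?
        return {"success": True, "user_id": best_id}      # steps S25-S26
    return {"success": False, "user_id": None}            # step S27

db = {"user-001": [1.0, 2.0, 3.0]}
print(collate([1.0, 2.0, 3.1], db))  # close features -> {'success': True, 'user_id': 'user-001'}
```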
- FIG. 10 is a schematic configuration diagram of the authentication control device 20.
- the authentication control device 20 is an information processing device that processes an image taken by the camera 30 and gives a projection instruction to the projection device 40 according to the processing content, and is, for example, a server device realized by a computer.
- the authentication control device 20 includes a storage unit 21, a control unit 22, a memory 23, and a communication unit 24.
- the storage unit 21 is a non-volatile storage unit such as a hard disk device or a ROM.
- the program 21a is stored in the storage unit 21.
- the program 21a is a program executed by the control unit 22 (processor).
- The control unit 22 includes a processor.
- The processor is, for example, a CPU (Central Processing Unit); there may be one processor or multiple processors.
- The processor functions as an image acquisition unit 22a, a face area detection unit 22b, a non-face area detection unit 22c, an authentication control unit 22d, a face authentication result acquisition unit 22e, a projection area specifying unit 22f, and a projection control unit 22g. Some or all of these may be implemented in hardware.
- the image acquisition unit 22a acquires an image (hereinafter, also referred to as a surveillance image) including an authentication target person walking in the surveillance area photographed by the camera 30 that captures the surveillance area.
- the communication unit 24 receives the surveillance image transmitted from the camera 30, and the image acquisition unit 22a acquires the surveillance image received by the communication unit 24.
- the face area detection unit 22b executes a face area detection process for detecting the face area of the authentication target person from the monitoring image acquired by the image acquisition unit 22a.
- the non-face area detection unit 22c detects a non-face area other than the face of the authentication target person from the monitoring image acquired by the image acquisition unit 22a.
- the authentication control unit 22d causes the authentication device 10 that executes face authentication to execute the face authentication of the authentication target person included in the monitoring image acquired by the image acquisition unit 22a. Specifically, the authentication control unit 22d transmits the monitoring image acquired by the image acquisition unit 22a to the authentication device 10 via the communication unit 24. Instead of the surveillance image, the face region (or the feature point extracted from the face region) detected from the surveillance image may be transmitted to the authentication device 10.
- the face authentication result acquisition unit 22e acquires the result of the face authentication executed by the authentication device 10. Specifically, the communication unit 24 receives the face authentication result transmitted from the authentication device 10, and the face authentication result acquisition unit 22e acquires the face authentication result received by the communication unit 24.
- the projection area specifying unit 22f specifies a projection area on which the result of face recognition is projected.
- the projection area is an area defined by coordinates and the like in the passage.
- the projection area specifying unit 22f specifies an area as a projection area, which is included in the image acquired by the image acquisition unit 22a and is separated from the authentication target person by a predetermined distance.
- the area separated by a predetermined distance is, for example, an area on the floor surface separated by a predetermined distance in the walking direction from the person to be authenticated.
- the predetermined distance can be determined, for example, by acquiring the walking speed of the person to be authenticated from the surveillance image.
- the projection area specifying unit 22f may specify an area including the authentication target person included in the image acquired by the image acquisition unit 22a as the projection area.
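- A rough sketch of how the projection area specifying unit 22f might place such an area follows, assuming the walking speed is estimated from the person's position in consecutive surveillance images; the lead time and area size are illustrative parameters, not values defined in this disclosure.

```python
# Sketch: place the projection area a "predetermined distance" ahead of the
# person in the walking direction, where the distance scales with the walking
# speed estimated from two consecutive frames.

def estimate_walking_velocity(prev_pos, curr_pos, dt):
    # Position change (in floor coordinates) per second between two frames.
    return ((curr_pos[0] - prev_pos[0]) / dt, (curr_pos[1] - prev_pos[1]) / dt)

def projection_area(curr_pos, velocity, lead_time=1.0, size=(0.6, 0.4)):
    # The area is placed lead_time seconds of walking ahead of the person,
    # so the offset grows with the estimated walking speed.
    vx, vy = velocity
    cx = curr_pos[0] + vx * lead_time
    cy = curr_pos[1] + vy * lead_time
    w, h = size
    return (cx - w / 2, cy - h / 2, w, h)  # (x, y, width, height) on the floor

v = estimate_walking_velocity(prev_pos=(2.0, 5.0), curr_pos=(2.0, 4.0), dt=0.5)
print(projection_area(curr_pos=(2.0, 4.0), velocity=v))
```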
- the projection control unit 22g causes the projection device 40 to project the result of face recognition on the projection area specified by the projection area identification unit 22f. Specifically, the projection control unit 22g transmits a projection instruction to the projection device 40 via the communication unit 24.
- FIG. 11 is a flowchart of an example of the operation (authentication control processing) of the authentication control device 20.
- the authentication control device 20 acquires an image including an authentication target person walking in the monitoring area taken by the camera 30 (step S30).
- the communication unit 24 receives the surveillance image transmitted from the camera 30, and the image acquisition unit 22a acquires the surveillance image received by the communication unit 24.
- the authentication control device 20 executes a face area detection process for detecting the face area of the authentication target person from the monitoring image acquired in step S30 (step S31).
- When a face area is detected as a result of the face area detection process in step S31 (step S32: YES), the authentication control device 20 (authentication control unit 22d) transmits, to the authentication device 10 via the communication unit 24, a face authentication request requesting face authentication of the person to be authenticated included in the monitoring image acquired in step S30 (step S33).
- This face recognition request includes the surveillance image (image for authentication) acquired in step S30.
- The authentication control device 20 (communication unit 24) receives the face authentication result and the user ID transmitted from the authentication device 10, and the face authentication result acquisition unit 22e acquires the face authentication result and the user ID received by the communication unit 24 (step S34).
- When the face authentication is successful (step S35: YES), the authentication control device 20 generates projection information indicating that the face authentication was successful (step S36).
- The projection information indicating that the face authentication was successful is, for example, graphic information of "○".
- This projection information may instead be read from the storage unit 21.
- When the face authentication fails (step S35: NO), the authentication control device 20 generates projection information indicating that the face authentication failed (step S37).
- The projection information indicating that the face authentication failed is, for example, graphic information of "×".
- This projection information may instead be read from the storage unit 21.
- Next, the authentication control device 20 specifies a projection area onto which the result of the face authentication is projected (step S38). For example, in the surveillance image acquired in step S30, a region separated by a predetermined distance from the face region detected in step S32 is specified as the projection region.
- the authentication control device 20 causes the projection device 40 to project the result of face recognition on the projection area specified in step S38 (step S39).
- the projection control unit 22g transmits a projection instruction for displaying the result of face recognition to the projection device 40 via the communication unit 24.
- This projection instruction includes the projection information generated in step S36 or step S37 and the projection area specified in step S38.
- On the other hand, when no face area is detected (step S32: NO), the authentication control device 20 (non-face area detection unit 22c) detects a non-face region other than the person's face from the monitoring image acquired in step S30 (step S40).
- Next, the authentication control device 20 generates projection information indicating that face detection has failed (step S41). This projection information is, for example, graphic information of a predetermined symbol, and may be read from the storage unit 21.
- Next, the authentication control device 20 specifies a projection area onto which the projection information indicating that face detection has failed is projected (step S42). For example, in the surveillance image acquired in step S30, a region separated by a predetermined distance from the non-face region detected in step S40 is specified as the projection region.
- the authentication control device 20 causes the projection device 40 to project projection information indicating that face detection has failed in the projection area specified in step S42 (step S39).
- the projection control unit 22g transmits a projection instruction for projecting projection information indicating that the face detection has failed to the projection device 40 via the communication unit 24.
- This projection instruction includes the projection information generated in step S41 and the projection area specified in step S42.
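- The branching of FIG. 11 (steps S30 to S42) can be summarized by the following sketch; the detection, authentication, and projection callables are hypothetical placeholders standing in for the face area detection unit 22b, the non-face area detection unit 22c, the authentication device 10, and the projection device 40.

```python
# Sketch of the FIG. 11 branching: project "O"/"X" near the detected face area
# when a face is found, otherwise project a "detection failed" mark near the
# detected non-face (body) area.

def handle_surveillance_image(image, detect_face, detect_non_face,
                              request_authentication, specify_area, project):
    face_region = detect_face(image)                  # step S31
    if face_region is not None:                       # step S32: YES
        result = request_authentication(image)        # steps S33-S34
        info = "O" if result["success"] else "X"      # steps S35-S37
        area = specify_area(image, face_region)       # step S38
    else:                                             # step S32: NO
        body_region = detect_non_face(image)          # step S40
        info = "FACE_NOT_DETECTED"                    # step S41 (symbol left unspecified here)
        area = specify_area(image, body_region)       # step S42
    project(info, area)                               # step S39

handle_surveillance_image(
    image={"person_bbox": (0, 0, 10, 20)},
    detect_face=lambda img: None,                             # simulate a face detection failure
    detect_non_face=lambda img: img["person_bbox"],
    request_authentication=lambda img: {"success": True},
    specify_area=lambda img, region: (region[0], region[1] + 40, 10, 5),
    project=lambda info, area: print(info, area),
)
```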
- the camera 30 captures an image including an authentication target person walking in the monitoring area.
- The camera 30 is provided, for example, near the passage through which the person to be authenticated passes, so that it can photograph the face of the person to be authenticated facing the direction of travel from the front or substantially from the front (that is, at an angle suitable for face authentication).
- The camera 30 is, for example, a digital camera that is remotely controlled by the authentication control device 20, continuously photographs the monitoring area, and transmits the captured images (and camera identification information such as a point ID) to the authentication control device 20 via the network NW.
- the projection device 40 is provided in the vicinity of the passage through which the person to be authenticated passes.
- the projection device 40 is, for example, a liquid crystal projector, but is not limited thereto.
- The projection device 40 receives a projection instruction (including projection information and a projection area) from the authentication control device 20 via the network NW, and projects the projection information included in the projection instruction (for example, the face authentication result generated in step S36 or S37) onto the projection area included in the projection instruction.
- FIG. 12 is a sequence diagram of the authentication system 1.
- The authentication control device 20 (communication unit 24) receives the monitoring image transmitted from the camera 30, and the image acquisition unit 22a acquires the monitoring image received by the communication unit 24 (step S50). Here, it is assumed that a monitoring image including the authentication target person U1 shown in FIG. 5 is acquired.
- The authentication control device 20 (face area detection unit 22b) executes the face area detection process for detecting the face area of the authentication target person U1 included in the monitoring image acquired by the image acquisition unit 22a (step S51). Here, it is assumed that the face area is successfully detected.
- Next, the authentication control device 20 (authentication control unit 22d) transmits, to the authentication device 10 via the communication unit 24, a face authentication request for face authentication of the authentication target person U1 included in the surveillance image acquired in step S50 (step S52).
- This face recognition request includes the surveillance image acquired in step S50.
- When the communication unit 14 receives the face authentication request transmitted in step S52, the authentication device 10 executes the face authentication process (see FIG. 9) (step S53).
- the authentication device 10 (authentication unit 12e) transmits the authentication result to the authentication control device 20 of the face authentication request transmission source via the communication unit 14 (step S54).
- Here, as the authentication result, an indication that the authentication was successful and the user ID of the authentication target person U1 who succeeded in the authentication are transmitted to the authentication control device 20.
- The authentication control device 20 (communication unit 24) receives the face authentication result and the user ID transmitted in step S54, and the face authentication result acquisition unit 22e acquires the face authentication result and the user ID received by the communication unit 24 (step S55).
- When the face authentication is successful (when the result of the face authentication acquired in step S55 indicates that authentication succeeded), the authentication control device 20 generates projection information indicating that the face authentication was successful (step S56). Here, it is assumed that graphic information of "○" is generated as this projection information.
- the authentication control device 20 specifies a projection area on which the result of face recognition is projected (step S57).
- As shown in FIG. 13, it is assumed that, in the monitoring image acquired in step S50, the area FL10 on the floor surface separated by a predetermined distance in the walking direction from the face area (the authentication target person U1) detected in step S51 is specified as the projection area.
- FIG. 13 is an example of a projection area.
- the authentication control device 20 transmits a projection instruction for displaying the result of face recognition to the projection device 40 via the communication unit 24 (step S58).
- This projection instruction includes the projection information generated in step S56 and the projection area specified in step S57.
- Upon receiving the projection instruction, the projection device 40 projects the projection information included in the projection instruction onto the projection area included in the projection instruction (step S59). For example, the result is projected as shown in FIG. 14. FIG. 14 is an example of the face authentication result projected onto the projection area.
- On the other hand, when the face authentication fails (when the result of the face authentication acquired in step S55 indicates that authentication failed), projection information indicating that the face authentication failed is generated (step S56).
- Here, it is assumed that graphic information of "×" is generated as this projection information.
- the authentication control device 20 specifies a projection area on which the result of face recognition is projected (step S57).
- It is assumed that the area FL10 on the floor surface separated by a predetermined distance in the walking direction from the face area (the authentication target person U1) detected in step S51 is specified as the projection area.
- the authentication control device 20 transmits a projection instruction for displaying the result of face recognition to the projection device 40 via the communication unit 24 (step S58).
- This projection instruction includes the projection information generated in step S56 and the projection area specified in step S57.
- Upon receiving the projection instruction, the projection device 40 projects the projection information included in the projection instruction onto the projection area included in the projection instruction (step S59). For example, the result is projected as shown in FIG. 15. FIG. 15 is another example of the face authentication result projected onto the projection area.
- FIG. 16 is a sequence diagram of the authentication system 1.
- The authentication control device 20 (communication unit 24) receives the monitoring image transmitted from the camera 30, and the image acquisition unit 22a acquires the monitoring image received by the communication unit 24 (step S60). Here, it is assumed that a monitoring image including the authentication target person U1 shown in FIG. 5 is acquired.
- The authentication control device 20 (face area detection unit 22b) executes the face area detection process for detecting the face area of the authentication target person U1 from the monitoring image acquired by the image acquisition unit 22a (step S61). Here, it is assumed that no face area is detected.
- the authentication control device 20 detects a non-face area other than the face of the authentication target person from the monitoring image acquired in step S60 (step S62).
- the authentication control device 20 generates projection information indicating that the face detection has failed (step S63).
- Here, it is assumed that graphic information of a predetermined symbol is generated as the projection information indicating that face detection has failed.
- the authentication control device 20 specifies a projection area on which projection information indicating that face detection has failed is projected (step S64).
- Here, it is assumed that the region FL10 on the floor surface separated by a predetermined distance in the walking direction from the non-face region (the authentication target person U1) detected in step S62 is specified as the projection area.
- the authentication control device 20 transmits a projection instruction for displaying projection information indicating that face detection has failed to the projection device 40 via the communication unit 24 (step S65).
- This projection instruction includes the projection information generated in step S63 and the projection area specified in step S64.
- Upon receiving the projection instruction, the projection device 40 projects the projection information included in the projection instruction onto the projection area included in the projection instruction (step S66).
- the authentication result can be projected (notified) in association with the authentication target person (see FIGS. 14 and 15). As a result, for example, the monitoring burden on the guards is reduced.
- FIG. 17 is an operation flow (outline) of the third embodiment.
- FIG. 18 is a diagram showing an example of authentication target persons U1 and U2 walking in the monitoring area (authentication zone Z1).
- FIG. 19 is a diagram showing an example in which M1 and M2 are projected as a result of face recognition of authentication target persons U1 and U2 walking in the monitoring area (authentication zone Z1).
- the authentication zone Z1 for executing the face recognition of the authentication target persons U1 and U2 and the projection zone Z2 for projecting the authentication result are separated.
- the face recognition process of the authentication target persons U1 and U2 is executed in the authentication zone Z1 (step S70).
- Next, a tracking process for tracking the person to be authenticated is executed (step S72).
- When the person to be authenticated reaches the projection zone Z2 (step S73: YES), the projection area is specified, and a projection process of projecting the face authentication results M1 and M2 (see FIG. 19) onto the projection zone Z2 (the specified projection area) is executed (step S74).
- the specification of the projection area may be omitted.
- Until the person to be authenticated reaches the projection zone Z2 (step S73: NO), the process of step S72 is repeatedly executed.
- FIG. 20 is a flowchart of an example of the face recognition process (S70).
- the authentication control device 20 acquires a surveillance image including an authentication target person walking in the monitoring area (authentication zone Z1) taken by the camera 30 (step S701).
- the communication unit 24 receives the surveillance image transmitted from the camera 30, and the image acquisition unit 22a acquires the surveillance image received by the communication unit 24.
- Here, it is assumed that a monitoring image including the authentication target person U1 shown in FIG. 18 is acquired.
- the authentication control device 20 executes a face area detection process for detecting the face area of the authentication target person U1 included in the monitoring image acquired in step S701 (step S702).
- Next, the authentication control device 20 (authentication control unit 22d) transmits, to the authentication device 10 via the communication unit 24, a face authentication request for face authentication of the authentication target person U1 included in the surveillance image acquired in step S701 (step S703).
- This face recognition request includes the surveillance image (image for authentication) acquired in step S701.
- The authentication control device 20 (communication unit 24) receives the face authentication result and the user ID transmitted from the authentication device 10, and the face authentication result acquisition unit 22e acquires the face authentication result and the user ID received by the communication unit 24 (step S704).
- When the face authentication is successful (step S705: YES), the authentication control device 20 generates projection information indicating that the face authentication was successful, for example, graphic information of "○" (step S706).
- When the face authentication fails (step S705: NO), the authentication control device 20 generates projection information indicating that the face authentication failed, for example, graphic information of "×" (step S707).
- Here, it is assumed that projection information indicating successful face authentication, for example, graphic information of "○", is generated.
- steps S708 and S709 are executed in parallel with the above processes of steps S702 to S707.
- The authentication control device 20 detects a non-face area (hereinafter referred to as the body shape area B1) other than the face of the person to be authenticated from the monitoring image acquired in step S701 (step S708).
- the authentication control device 20 extracts body shape feature information from the body shape region B1 detected in step S708 (step S709). Here, it is assumed that the body shape feature information of the person to be authenticated U1 is extracted.
- The authentication control device 20 stores (registers) the projection information generated in step S706 (or step S707) and the body shape feature information extracted in step S709 in association with each other in the storage unit 21 (step S710).
- This step S710 corresponds to "save the result of face recognition" in step S71 in FIG.
- Here, the projection information generated in step S706 (or step S707) and the body shape feature information of the authentication target person U1 extracted in step S709 are stored in association with each other.
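- A minimal sketch of step S710 follows, assuming the storage unit 21 can be modeled as a simple in-memory registry that associates body shape feature information with the generated projection information.

```python
# Sketch of step S710: keep the generated projection information together with
# the body shape features so the result can be re-used later without running
# face authentication again.

saved_results = []  # stand-in for the storage unit 21

def save_authentication_result(body_features, projection_info):
    saved_results.append({
        "body_features": body_features,      # extracted from body shape area B1 (step S709)
        "projection_info": projection_info,  # generated in step S706 or S707
    })

save_authentication_result([0.12, 0.55, 0.91], "O")
print(saved_results)
```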
- This authentication target person tracking process (S72) is an example of the authentication target person tracking means of the present invention.
- FIG. 21 is a flowchart of an example of the authentication target person tracking process (S72).
- The authentication control device 20 acquires a surveillance image (hereinafter referred to as the surveillance image X1) including the person to be authenticated walking in the monitoring area (authentication zone Z1) photographed by the camera 30 (step S721).
- Specifically, the communication unit 24 receives the surveillance image X1 transmitted from the camera 30, and the image acquisition unit 22a acquires the surveillance image X1 received by the communication unit 24.
- Here, it is assumed that a monitoring image including the authentication target person U1 shown in FIG. 18 is acquired.
- Next, the authentication control device 20 detects a non-face area (hereinafter referred to as the body shape area B1) other than the face of the authentication target person U1 included in the monitoring image X1 acquired in step S721 (step S722).
- the authentication control device 20 extracts body shape feature information from the body shape region B1 detected in step S722 (step S723).
- Next, the authentication control device 20 acquires an image (hereinafter referred to as the monitoring image X2) including the authentication target person U1 walking in the monitoring area (authentication zone Z1) photographed by the camera 30 (step S724).
- Specifically, the communication unit 24 receives the surveillance image X2 transmitted from the camera 30, and the image acquisition unit 22a acquires the surveillance image X2 received by the communication unit 24.
- Next, the authentication control device 20 detects a non-face area (hereinafter referred to as the body shape area B2) other than the face of the authentication target person U1 included in the monitoring image X2 acquired in step S724 (step S725).
- the authentication control device 20 extracts body shape feature information from the body shape region B2 detected in step S725 (step S726).
- The body shape feature information extracted in step S723 is collated with the body shape feature information extracted in step S726 (step S727). If the collation results match (step S727: YES), the process proceeds to step S73; if they do not match (step S727: NO), the process ends.
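- The tracking of FIG. 21 can be sketched as a collation of body shape feature information between frames; the similarity measure and threshold below are illustrative assumptions, not values specified in the disclosure.

```python
# Sketch of the tracking idea (steps S721-S727): features from a new frame are
# collated with features from a previous frame, and tracking continues only
# when they match.

def body_features_match(a, b, threshold=0.1):
    # Toy collation: mean absolute difference below a threshold counts as a match.
    diffs = [abs(x - y) for x, y in zip(a, b)]
    return (sum(diffs) / max(len(diffs), 1)) < threshold

def track_person(features_x1, features_x2):
    # Step S727: True -> same person, continue toward step S73;
    # False -> tracking ends for this pair of observations.
    return body_features_match(features_x1, features_x2)

print(track_person([0.12, 0.55, 0.91], [0.13, 0.54, 0.92]))  # True (small differences)
print(track_person([0.12, 0.55, 0.91], [0.80, 0.10, 0.30]))  # False (large differences)
```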
- Then, the projection process (step S74) is executed (see FIG. 17).
- FIG. 22 is a flowchart of an example of the projection process (step S74).
- the authentication control device 20 specifies a projection area on which the result of face recognition is projected (step S741).
- Here, it is assumed that the area on the floor surface separated by a predetermined distance in the walking direction from the body shape area B2 (the authentication target person U1), which is the non-face area detected in step S725, is specified as the projection area.
- The authentication control device 20 causes the projection device 40 to project the face authentication result saved in step S710 onto the projection zone Z2 (the projection area specified in step S741) (step S742).
- the projection control unit 22g transmits a projection instruction for displaying the result of face authentication saved in step S710 to the projection device 40 via the communication unit 24.
- This projection instruction includes the projection information saved in step S710 and the projection area specified in step S741.
- Upon receiving the projection instruction, the projection device 40 projects the projection information included in the projection instruction onto the projection area included in the projection instruction. For example, the result is projected as shown in FIG. 19.
- the authentication result can be projected (notified) in association with the authentication target person (see FIG. 19).
- the result of face recognition is not projected in the authentication zone Z1 but is displayed in the projection zone Z2. Therefore, it is only necessary to monitor the projection zone Z2, and the monitoring burden on the guards is reduced.
- Further, since the face authentication result is saved in step S710, even if a situation unsuitable for face authentication arises during the subsequent tracking process (step S72), for example because the person to be authenticated faces downward, the face authentication result (the result saved in step S710) can still be correctly projected onto the projection zone Z2.
- the face authentication result is saved in step S710, and the face authentication is not executed thereafter, so that the authentication cost can be suppressed.
- FIG. 23 is an operation flow (outline) of the fourth embodiment.
- FIG. 24 is a diagram showing an example in which M1 and M2 are projected as a result of face recognition of the authentication target persons U1 and U2 walking in the monitoring area.
- First, the face authentication process of the authentication target persons U1 and U2 is executed (step S70).
- Next, a tracking process for tracking the person to be authenticated is executed (step S72).
- the projection area is specified, and the projection process of projecting the results of face recognition M1 and M2 (see FIG. 24) on the specified projection area is executed (step S74).
- For the authentication target person U1 whose face authentication succeeded, the face authentication result M1 ("○" indicating successful authentication) can always be projected onto the area on the floor surface separated from the authentication target person U1 by a predetermined distance in the walking direction, even while the authentication target person U1 is walking.
- Similarly, the face authentication result M2 ("×" indicating authentication failure) is always projected onto the area on the floor surface separated by a predetermined distance in the walking direction from the corresponding authentication target person.
- The processing of steps S72 and S74 is repeatedly executed until the authentication target persons U1 and U2 pass through the monitoring area (step S75: NO). When the authentication target persons U1 and U2 have passed through the monitoring area (step S75: YES), the process ends.
- the authentication result can be projected (notified) in association with the authentication target person (see FIG. 24).
- Further, the face authentication results M1 and M2 can always be projected onto the areas on the floor surface separated by a predetermined distance in the walking direction from the authentication target persons U1 and U2 (FIG. 24). This makes it easy to associate each walking person to be authenticated with his or her authentication result, which reduces the monitoring burden on the guards.
- Further, since the face authentication result is saved in step S710, even if a situation unsuitable for face authentication arises during the subsequent tracking process (step S72), for example because the person to be authenticated faces downward, the face authentication result (the result saved in step S710) can still be correctly projected.
- the face authentication result is saved in step S710, and the face authentication is not executed thereafter, so that the authentication cost can be suppressed.
- In the above embodiments, an example was described in which the projection area specifying unit 22f specifies, as the projection area, a region separated from the person to be authenticated included in the image acquired by the image acquisition unit 22a, but the present invention is not limited to this.
- For example, the projection area specifying unit 22f may specify, as the projection area, an area including the person to be authenticated included in the image acquired by the image acquisition unit 22a.
- In that case, the projection device 40 may project a specific color onto the specified area. For example, blue or green may be projected onto the specified area if the face authentication is successful, and red may be projected onto the specified area if the face authentication is unsuccessful.
- In the above embodiments, an example was described in which the result of the face authentication projected onto the projection area specified by the projection area specifying means is projection information indicating that the face authentication succeeded (for example, "○") or projection information indicating that the face authentication failed (for example, "×"), but the present invention is not limited to this.
- For example, an avatar image G may be projected instead of the projection information (for example, "○") indicating that the face authentication was successful.
- the avatar image G can be specified and projected as follows.
- the privacy protection information DB 21b is added to the storage unit 21 of the authentication control device 20 shown in FIG.
- In the privacy protection information DB 21b, user IDs and the privacy protection information of the users (persons to be authenticated) are stored (registered) in association with each other.
- the privacy protection information DB 21b is not limited to the authentication control device 20, and may be provided outside the authentication control device 20.
- the privacy protection information is information for notifying the person to be authenticated who has succeeded in face recognition that his / her face recognition has been successful, and does not include his / her personal information (for example, his / her name and company name to which he / she belongs).
- The privacy protection information includes information registered by the person to be authenticated (user) so that the person can know that his or her own face authentication was successful.
- the privacy protection information is an image such as an avatar image.
- the avatar image can be registered in the privacy protection information DB 21b by being designated (or selected) by the authentication target person on a registration terminal (not shown) connected to the network NW, for example.
- the avatar image is an image including a character that is the alter ego of the authentication target person.
- the character may be any character.
- the character may be a living thing (eg, a person, an animal, a plant) or an inanimate object (eg, a building, a landscape). Living things and inanimate objects may or may not exist.
- the character may or may not be anthropomorphic.
- the character may be represented in two dimensions or may be represented in three dimensions.
- the character may be a moving image or a still image.
- the privacy protection information may be "information associated with the person to be authenticated in advance", "information determined for each person to be authenticated", or "information unique to the person to be authenticated”.
- the avatar image G (privacy protection information) can be specified as follows.
- the user ID of the authentication target person who has succeeded in face recognition is acquired (see step S34 in FIG. 11), so that the storage unit 21 (privacy protection information DB 21b) It is possible to specify the privacy protection information associated with the user ID acquired in step S34.
- the authentication control device 20 (projection control unit 22g) transmits the projection instruction for displaying the specified privacy protection information to the projection device 40 via the communication unit 24, thereby causing the projection device 40 to receive the privacy protection information. Can be projected.
- This projection instruction includes the privacy protection information (projection information) specified above and the projection area specified in step S38 in FIG.
- the user ID of the authentication target person who has succeeded in face recognition is acquired (see step S704 in FIG. 20), it is acquired in step S704 in the storage unit 21 (privacy protection information DB 21b). It is possible to specify the privacy protection information associated with the user ID.
- the identified privacy protection information is stored (registered) in the storage unit 21 in association with the body shape feature information extracted in step S709 in FIG. 20 (see step S710 in FIG. 20).
- the authentication control device 20 (projection control unit 22g) transmits the projection instruction for displaying the specified privacy protection information to the projection device 40 via the communication unit 24, thereby causing the projection device 40 to receive the privacy protection information. Is projected (see step S742 in FIG. 22).
- This projection instruction includes the privacy protection information (projection information) specified above and the projection area specified in step S741 in FIG.
- the authentication system 1 is configured by an authentication device 10, an authentication control device 20, a camera 30, and a projection device 40 capable of communicating with each other via a network NW (for example, the Internet) will be described.
- NW for example, the Internet
- the configuration or function of all or part of the authentication device 10, the camera 30, and the projection device 40 may be added to the authentication control device 20.
- Non-temporary computer-readable media include various types of tangible storage mediums.
- Examples of non-temporary computer-readable media include magnetic recording media (eg, flexible disks, magnetic tapes, hard disk drives), magneto-optical recording media (eg, magneto-optical disks), CD-ROMs (Read Only Memory), CD-Rs, It includes a CD-R / W and a semiconductor memory (for example, a mask ROM, a PROM (Programmable ROM), an EPROM (Erasable PROM), a flash ROM, and a RAM (RandomAccessMemory)).
- a semiconductor memory for example, a mask ROM, a PROM (Programmable ROM), an EPROM (Erasable PROM), a flash ROM, and a RAM (RandomAccessMemory)
- the program may also be supplied to the computer by various types of temporary computer readable medium.
- temporary computer-readable media include electrical, optical, and electromagnetic waves.
- the temporary computer-readable medium can supply the program to the computer via a wired communication path such as an electric wire and an optical fiber, or a wireless communication path.
- An image acquisition means for acquiring an image including an authentication target person walking in the surveillance area taken by a camera that captures the surveillance area, and an image acquisition means.
- An authentication control means for causing an authentication device that executes face authentication to execute face authentication of the person to be authenticated included in the image acquired by the image acquisition means.
- a projection area specifying means for specifying a projection area on which the result of the face recognition is projected, and
- An authentication control device including a projection control means for projecting the result of the face recognition on the projection area specified by the projection area specifying means on the projection device.
- Appendix 2 Further provided with an authentication target person tracking means for tracking the authentication target person, The authentication control device according to Appendix 1, wherein the projection area specifying means identifies the projection area of the authentication target person tracked by the authentication target person tracking means.
- Appendix 3 The authentication control device according to Appendix 2, wherein the authentication target person tracking means tracks a non-face area other than the face of the authentication target person included in the image acquired by the image acquisition means.
- a feature information extracting means for extracting feature information of the non-face region from a non-face region other than the face among the authentication target persons included in the image acquired by the image acquisition means.
- a registration means for registering the feature information of the non-face region extracted by the feature information extraction means in association with the result of the face authentication, and a registration means. Further provided with a collating means for collating the feature information extracted by the feature information extracting means with the feature information registered by the registration means.
- the image acquisition means acquires a first image and a second image taken after the first image is taken as the image, and obtains the first image.
- the authentication control means causes the authentication device to execute face recognition of the authentication target person included in the first image acquired by the image acquisition means.
- the feature information extracting means extracts the first feature information of the non-face region from the non-face region other than the face among the authentication target persons included in the first image acquired by the image acquisition means, and also The second feature information of the non-face region is extracted from the non-face region other than the face among the authentication target persons included in the second image acquired by the image acquisition means.
- the registration means registers the first feature information extracted by the feature information extraction means in association with the result of the face authentication.
- the collation means collates the second feature information extracted by the feature information extraction means with the first feature information registered by the registration means.
- the authentication control device according to Appendix 1, wherein the projection control means projects the result of the face recognition registered by the registration means onto the projection area specified by the projection area specifying means.
- a face area detecting means for executing a face area detecting process for detecting the face area of the authentication target person included in the image acquired by the image acquiring means is further provided.
- the projection control means cannot detect the face area of the person to be authenticated as a result of executing the face area detection process by the face area detection means, the projection device indicates that the face area could not be detected.
- the authentication control device according to any one of Supplementary Provisions 1 to 4, which is projected onto the projection area specified by the projection area specifying means.
- Appendix 7 The authentication control device according to Appendix 6, wherein the area separated by a predetermined distance is an area on the floor surface separated from the authentication target person by a predetermined distance in the walking direction.
- the projection control means gives the projection device the privacy protection information of the authentication target person whose face recognition is successful as a result of the face recognition, and the projection area is specified by the projection area identification means.
- the authentication control device according to any one of Appendix 1 to 8, which is projected on the screen.
- An image acquisition means for acquiring an image including an authentication target person walking toward a projection zone in the surveillance area taken by a camera that captures the surveillance area.
- An authentication control means for causing an authentication device that executes face authentication to execute face authentication of the person to be authenticated included in the image acquired by the image acquisition means.
- the projection device includes a projection control means for projecting the result of the face authentication on the projection zone. Authentication control device.
- Appendix 11 Further provided with a projection area specifying means for specifying a projection area in which the result of the face recognition is projected in the projection zone.
- the projection control means identifies the result of the face authentication to the projection device by the projection area specifying means.
- the authentication control device according to Appendix 10, which is projected onto the projection area.
- Appendix 12 Further provided with an authentication target person tracking means for tracking the authentication target person, The authentication control device according to Appendix 10 or 11, wherein the projection area specifying means identifies the projection area of the authentication target person tracked by the authentication target person tracking means.
- Appendix 13 The authentication control device according to Appendix 12, wherein the authentication target person tracking means tracks a non-face area other than the face of the authentication target person included in the image acquired by the image acquisition means.
- a feature information extracting means for extracting feature information of the non-face region from a non-face region other than the face among the authentication target persons included in the image acquired by the image acquisition means.
- a registration means for registering the feature information of the non-face region extracted by the feature information extraction means in association with the result of the face authentication, and a registration means. Further provided with a collating means for collating the feature information extracted by the feature information extracting means with the feature information registered by the registration means.
- the image acquisition means acquires a first image and a second image taken after the first image is taken as the image, and obtains the first image.
- the authentication control means causes the authentication device to execute face recognition of the authentication target person included in the first image acquired by the image acquisition means.
- the feature information extracting means extracts the first feature information of the non-face region from the non-face region other than the face among the authentication target persons included in the first image acquired by the image acquisition means, and also The second feature information of the non-face region is extracted from the non-face region other than the face among the authentication target persons included in the second image acquired by the image acquisition means.
- the registration means registers the first feature information extracted by the feature information extraction means in association with the result of the face authentication.
- the collation means collates the second feature information extracted by the feature information extraction means with the first feature information registered by the registration means.
- the projection control means identifies the result of the face authentication registered by the registration means in the projection device to specify the projection area.
- the authentication control device according to Appendix 11, which is projected onto the projection area specified by means.
- Appendix 15 The authentication control device according to any one of Appendix 10 to 14, wherein the projection area specifying means identifies an area separated from the authentication target person who has reached the projection zone as the projection area by a predetermined distance.
- Appendix 16 The authentication control device according to Appendix 15, wherein the area separated by a predetermined distance is an area on the floor surface separated from the authentication target person by a predetermined distance in the walking direction.
- the projection control means gives the projection device the privacy protection information of the authentication target person whose face recognition is successful as a result of the face recognition, and the projection area is specified by the projection area identification means.
- the authentication control device according to any one of Supplementary note 7 to 17, which is projected on the screen.
- a camera that captures the surveillance area and Projector and An authentication device that performs face recognition and An image acquisition means for acquiring an image including an authentication target person walking in the monitoring area taken by the camera, and an image acquisition means.
- An authentication control means for causing the authentication device to execute face recognition of the authentication target person included in the image acquired by the image acquisition means.
- a projection area specifying means for specifying a projection area on which the result of the face recognition is projected, and
- An authentication system comprising the projection device with a projection control means for projecting the result of the face recognition onto the projection area specified by the projection area specifying means.
- Appendix 20 The camera, the projection device, the authentication device, and the authentication control device capable of communicating with each other via a network are provided.
- a camera that captures the surveillance area and Projector and An authentication device that performs face recognition and An image acquisition means for acquiring an image including an authentication target person walking toward a projection zone in the monitoring area taken by the camera, and an image acquisition means.
- An authentication control means for causing the authentication device to execute face recognition of the authentication target person included in the image acquired by the image acquisition means.
- the projection device includes a projection control means for projecting the result of the face recognition on the projection zone.
- Appendix 22 The camera, the projection device, the authentication device, and the authentication control device capable of communicating with each other via a network are provided.
- An authentication control method comprising a projection control step of projecting the result of the face recognition on the projection area specified by the projection area specifying step on the projection device.
- the projection device includes a projection control step of projecting the result of the face authentication into the projection zone.
- An image acquisition process for acquiring an image including an authentication target person walking in the surveillance area taken by a camera that captures the surveillance area, and An authentication control process that causes an authentication device that executes face authentication to execute face authentication of the person to be authenticated included in the image acquired by the image acquisition process.
- the projection area identification process for specifying the projection area on which the result of the face recognition is projected, and the projection area identification process.
- a computer-readable recording medium in which a program for executing a projection control process for projecting the result of the face recognition on the projection area on the projection area specified by the projection area identification process and a program for executing the projection control process are recorded on the projection device.
- An image acquisition process for acquiring an image including an authentication target person walking toward a projection zone in the surveillance area taken by a camera that captures the surveillance area.
- An authentication control process that causes an authentication device that executes face authentication to execute face authentication of the person to be authenticated included in the image acquired by the image acquisition process.
- the projection control process of causing the projection device to project the result of the face authentication into the projection zone is executed.
- a computer-readable recording medium on which a program for recording is recorded.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Health & Medical Sciences (AREA)
- Multimedia (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Computer Security & Cryptography (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Bioethics (AREA)
- Software Systems (AREA)
- General Engineering & Computer Science (AREA)
- Computer Hardware Design (AREA)
- Medical Informatics (AREA)
- Databases & Information Systems (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- Business, Economics & Management (AREA)
- Emergency Management (AREA)
- Collating Specific Patterns (AREA)
- Alarm Systems (AREA)
- Studio Devices (AREA)
- Image Analysis (AREA)
Abstract
Description
まず、図1を用いて、実施形態1の認証システム1を構成する認証制御装置20の構成例について説明する。
以下、本発明の実施形態2として、認証システム1について詳細に説明する。以下、画像取得手段22aとして画像取得部を用いる。以下、画像取得部22aと記載する。また、認証制御手段22dとして認証制御部を用いる。以下、認証制御部22dと記載する。また、映写領域特定手段22fとして映写領域特定部を用いる。以下、映写領域特定部22fと記載する。また、映写制御手段22gとして映写制御部を用いる。以下、映写制御部22gと記載する。
以下、本発明の実施形態3として、認証システム1の他の動作例について詳細に説明する。
以下、本発明の実施形態4として、認証システム1の他の動作例について詳細に説明する。
監視領域を撮影するカメラにより撮影された前記監視領域内を歩行する認証対象者を含む画像を取得する画像取得手段と、
顔認証を実行する認証装置に、前記画像取得手段により取得された前記画像に含まれる前記認証対象者の顔認証を実行させる認証制御手段と、
前記顔認証の結果が映写される映写領域を特定する映写領域特定手段と、
映写装置に、前記顔認証の結果を前記映写領域特定手段により特定された前記映写領域に映写させる映写制御手段と、を備える認証制御装置。
前記認証対象者を追跡する認証対象者追跡手段をさらに備え、
前記映写領域特定手段は、前記認証対象者追跡手段により追跡される前記認証対象者の前記映写領域を特定する付記1に記載の認証制御装置。
前記認証対象者追跡手段は、前記画像取得手段により取得された前記画像に含まれる前記認証対象者の顔以外の非顔領域を追跡する付記2に記載の認証制御装置。
前記画像取得手段により取得された前記画像に含まれる前記認証対象者のうち顔以外の非顔領域から当該非顔領域の特徴情報を抽出する特徴情報抽出手段と、
前記特徴情報抽出手段により抽出された前記非顔領域の特徴情報と前記顔認証の結果とを対応づけて登録する登録手段と、
前記特徴情報抽出手段により抽出された前記特徴情報と前記登録手段により登録された前記特徴情報とを照合する照合手段と、をさらに備え、
前記画像取得手段は、前記画像として第1画像及び当該第1画像が撮影された後に撮影された第2画像を取得し、
前記認証制御手段は、前記認証装置に、前記画像取得手段により取得された前記第1画像に含まれる前記認証対象者の顔認証を実行させ、
前記特徴情報抽出手段は、前記画像取得手段により取得された前記第1画像に含まれる前記認証対象者のうち顔以外の非顔領域から当該非顔領域の第1特徴情報を抽出し、かつ、前記画像取得手段により取得された前記第2画像に含まれる前記認証対象者のうち顔以外の非顔領域から当該非顔領域の第2特徴情報を抽出し、
前記登録手段は、前記特徴情報抽出手段により抽出された前記第1特徴情報と前記顔認証の結果とを対応づけて登録し、
前記照合手段は、前記特徴情報抽出手段により抽出された前記第2特徴情報と前記登録手段により登録された前記第1特徴情報とを照合し、
前記映写制御手段は、前記映写装置に、前記登録手段により登録された前記顔認証の結果を前記映写領域特定手段により特定された前記映写領域に映写させる付記1に記載の認証制御装置。
前記画像取得手段により取得された前記画像に含まれる前記認証対象者の顔領域を検出する顔領域検出処理を実行する顔領域検出手段をさらに備え、
前記映写制御手段は、前記顔領域検出手段による前記顔領域検出処理の実行の結果、前記認証対象者の顔領域を検出できなかった場合、前記映写装置に、顔領域が検出できなかった旨を前記映写領域特定手段により特定された前記映写領域に映写させる付記1から4のいずれか1項に記載の認証制御装置。
前記映写領域特定手段は、前記映写領域として前記画像取得手段により取得された前記画像に含まれる前記認証対象者から所定距離離れた領域を特定する付記1から5のいずれか1項に記載の認証制御装置。
前記所定距離離れた領域は、前記認証対象者から歩行方向に所定距離離れた床面上の領域である付記6に記載の認証制御装置。
前記映写領域特定手段は、前記映写領域として前記画像取得手段により取得された前記画像に含まれる前記認証対象者を含む領域を特定する付記1から5のいずれか1項に記載の認証制御装置。
映写制御手段は、前記顔認証が成功した場合、映写装置に、前記顔認証の結果として前記顔認証が成功した前記認証対象者のプライバシ保護情報を前記映写領域特定手段により特定された前記映写領域に映写させる付記1から8のいずれか1項に記載の認証制御装置。
監視領域を撮影するカメラにより撮影された前記監視領域内を映写ゾーンに向かって歩行する認証対象者を含む画像を取得する画像取得手段と、
顔認証を実行する認証装置に、前記画像取得手段により取得された前記画像に含まれる前記認証対象者の顔認証を実行させる認証制御手段と、
前記画像取得手段により取得された前記画像に含まれる前記認証対象者が前記映写ゾーンに到達した場合、前記映写装置に、前記顔認証の結果を前記映写ゾーンに映写させる映写制御手段と、を備える認証制御装置。
前記映写ゾーンのうち前記顔認証の結果が映写される映写領域を特定する映写領域特定手段をさらに備え、
前記映写制御手段は、前記画像取得手段により取得された前記画像に含まれる前記認証対象者が前記映写ゾーンに到達した場合、前記映写装置に、前記顔認証の結果を前記映写領域特定手段により特定された前記映写領域に映写させる付記10に記載の認証制御装置。
前記認証対象者を追跡する認証対象者追跡手段をさらに備え、
前記映写領域特定手段は、前記認証対象者追跡手段により追跡される前記認証対象者の前記映写領域を特定する付記10又は11に記載の認証制御装置。
前記認証対象者追跡手段は、前記画像取得手段により取得された前記画像に含まれる前記認証対象者の顔以外の非顔領域を追跡する付記12に記載の認証制御装置。
前記画像取得手段により取得された前記画像に含まれる前記認証対象者のうち顔以外の非顔領域から当該非顔領域の特徴情報を抽出する特徴情報抽出手段と、
前記特徴情報抽出手段により抽出された前記非顔領域の特徴情報と前記顔認証の結果とを対応づけて登録する登録手段と、
前記特徴情報抽出手段により抽出された前記特徴情報と前記登録手段により登録された前記特徴情報とを照合する照合手段と、をさらに備え、
前記画像取得手段は、前記画像として第1画像及び当該第1画像が撮影された後に撮影された第2画像を取得し、
前記認証制御手段は、前記認証装置に、前記画像取得手段により取得された前記第1画像に含まれる前記認証対象者の顔認証を実行させ、
前記特徴情報抽出手段は、前記画像取得手段により取得された前記第1画像に含まれる前記認証対象者のうち顔以外の非顔領域から当該非顔領域の第1特徴情報を抽出し、かつ、前記画像取得手段により取得された前記第2画像に含まれる前記認証対象者のうち顔以外の非顔領域から当該非顔領域の第2特徴情報を抽出し、
前記登録手段は、前記特徴情報抽出手段により抽出された前記第1特徴情報と前記顔認証の結果とを対応づけて登録し、
前記照合手段は、前記特徴情報抽出手段により抽出された前記第2特徴情報と前記登録手段により登録された前記第1特徴情報とを照合し、
前記映写制御手段は、前記照合手段の照合結果が一致した前記認証対象者が前記映写ゾーンに到達した場合、前記映写装置に、前記登録手段により登録された前記顔認証の結果を前記映写領域特定手段により特定された前記映写領域に映写させる付記11に記載の認証制御装置。
前記映写領域特定手段は、前記映写領域として前記映写ゾーンに到達した前記認証対象者から所定距離離れた領域を特定する付記10から14のいずれか1項に記載の認証制御装置。
前記所定距離離れた領域は、前記認証対象者から歩行方向に所定距離離れた床面上の領域である付記15に記載の認証制御装置。
前記映写領域特定手段は、前記映写領域として前記映写ゾーンに到達した前記認証対象者を含む領域を特定する付記10から14のいずれか1項に記載の認証制御装置。
映写制御手段は、前記顔認証が成功した場合、映写装置に、前記顔認証の結果として前記顔認証が成功した前記認証対象者のプライバシ保護情報を前記映写領域特定手段により特定された前記映写領域に映写させる付記7から17のいずれか1項に記載の認証制御装置。
監視領域を撮影するカメラと、
映写装置と、
顔認証を実行する認証装置と、
前記カメラにより撮影された前記監視領域内を歩行する認証対象者を含む画像を取得する画像取得手段と、
前記認証装置に、前記画像取得手段により取得された前記画像に含まれる前記認証対象者の顔認証を実行させる認証制御手段と、
前記顔認証の結果が映写される映写領域を特定する映写領域特定手段と、
前記映写装置に、前記顔認証の結果を前記映写領域特定手段により特定された前記映写領域に映写させる映写制御手段と、を備える認証システム。
ネットワークを介して互いに通信可能な前記カメラ、前記映写装置、前記認証装置及び認証制御装置を備え、
前記画像取得手段、前記認証制御手段、前記映写領域特定手段及び前記映写制御手段は、前記認証制御装置に設けられている付記19に記載の認証システム。
監視領域を撮影するカメラと、
映写装置と、
顔認証を実行する認証装置と、
前記カメラにより撮影された前記監視領域内を映写ゾーンに向かって歩行する認証対象者を含む画像を取得する画像取得手段と、
前記認証装置に、前記画像取得手段により取得された前記画像に含まれる前記認証対象者の顔認証を実行させる認証制御手段と、
前記画像取得手段により取得された前記画像に含まれる前記認証対象者が前記映写ゾーンに到達した場合、前記映写装置に、前記顔認証の結果を前記映写ゾーンに映写させる映写制御手段と、を備える認証システム。
ネットワークを介して互いに通信可能な前記カメラ、前記映写装置、前記認証装置及び認証制御装置を備え、
前記画像取得手段、前記認証制御手段及び前記映写制御手段は、前記認証制御装置に設けられている付記21に記載の認証システム。
監視領域を撮影するカメラにより撮影された前記監視領域内を歩行する認証対象者を含む画像を取得する画像取得ステップと、
顔認証を実行する認証装置に、前記画像取得ステップにより取得された前記画像に含まれる前記認証対象者の顔認証を実行させる認証制御ステップと、
前記顔認証の結果が映写される映写領域を特定する映写領域特定ステップと、
映写装置に、前記顔認証の結果を前記映写領域特定ステップにより特定された前記映写領域に映写させる映写制御ステップと、を備える認証制御方法。
監視領域を撮影するカメラにより撮影された前記監視領域内を映写ゾーンに向かって歩行する認証対象者を含む画像を取得する画像取得ステップと、
顔認証を実行する認証装置に、前記画像取得ステップにより取得された前記画像に含まれる前記認証対象者の顔認証を実行させる認証制御ステップと、
前記画像取得ステップにより取得された前記画像に含まれる前記認証対象者が前記映写ゾーンに到達した場合、前記映写装置に、前記顔認証の結果を前記映写ゾーンに映写させる映写制御ステップと、を備える認証制御方法。
少なくとも1つのプロセッサを備えた電子デバイスに、
監視領域を撮影するカメラにより撮影された前記監視領域内を歩行する認証対象者を含む画像を取得する画像取得処理と、
顔認証を実行する認証装置に、前記画像取得処理により取得された前記画像に含まれる前記認証対象者の顔認証を実行させる認証制御処理と、
前記顔認証の結果が映写される映写領域を特定する映写領域特定処理と、
映写装置に、前記顔認証の結果を前記映写領域特定処理により特定された前記映写領域に映写させる映写制御処理と、を実行させるためのプログラムを記録したコンピュータ読取可能な記録媒体。
少なくとも1つのプロセッサを備えた電子デバイスに、
監視領域を撮影するカメラにより撮影された前記監視領域内を映写ゾーンに向かって歩行する認証対象者を含む画像を取得する画像取得処理と、
顔認証を実行する認証装置に、前記画像取得処理により取得された前記画像に含まれる前記認証対象者の顔認証を実行させる認証制御処理と、
前記画像取得処理により取得された前記画像に含まれる前記認証対象者が前記映写ゾーンに到達した場合、前記映写装置に、前記顔認証の結果を前記映写ゾーンに映写させる映写制御処理と、を実行させるためのプログラムを記録したコンピュータ読取可能な記録媒体。
10 認証装置
11 記憶部
11a プログラム
11b 顔情報DB
12 制御部
12a 画像取得部
12b 顔検出部
12c 特徴点抽出部
12d 登録部
12e 認証部
13 メモリ
14 通信部
20 認証制御装置
21 記憶部
21a プログラム
21b プライバシ保護情報DB
22 制御部
22a 画像取得部(画像取得手段)
22b 顔領域検出部
22c 非顔領域検出部
22d 認証制御部(認証制御手段)
22e 顔認証結果取得部
22f 映写領域特定部(映写領域特定手段)
22g 映写制御部(映写制御手段)
23 メモリ
24 通信部
30 カメラ
40 映写装置
G アバター画像
NW ネットワーク
U1、U2 認証対象者
Z1 認証ゾーン
Z2 映写ゾーン
Claims (26)
- 監視領域を撮影するカメラにより撮影された前記監視領域内を歩行する認証対象者を含む画像を取得する画像取得手段と、
顔認証を実行する認証装置に、前記画像取得手段により取得された前記画像に含まれる前記認証対象者の顔認証を実行させる認証制御手段と、
前記顔認証の結果が映写される映写領域を特定する映写領域特定手段と、
映写装置に、前記顔認証の結果を前記映写領域特定手段により特定された前記映写領域に映写させる映写制御手段と、を備える認証制御装置。 - 前記認証対象者を追跡する認証対象者追跡手段をさらに備え、
前記映写領域特定手段は、前記認証対象者追跡手段により追跡される前記認証対象者の前記映写領域を特定する請求項1に記載の認証制御装置。 - 前記認証対象者追跡手段は、前記画像取得手段により取得された前記画像に含まれる前記認証対象者の顔以外の非顔領域を追跡する請求項2に記載の認証制御装置。
- 前記画像取得手段により取得された前記画像に含まれる前記認証対象者のうち顔以外の非顔領域から当該非顔領域の特徴情報を抽出する特徴情報抽出手段と、
前記特徴情報抽出手段により抽出された前記非顔領域の特徴情報と前記顔認証の結果とを対応づけて登録する登録手段と、
前記特徴情報抽出手段により抽出された前記特徴情報と前記登録手段により登録された前記特徴情報とを照合する照合手段と、をさらに備え、
前記画像取得手段は、前記画像として第1画像及び当該第1画像が撮影された後に撮影された第2画像を取得し、
前記認証制御手段は、前記認証装置に、前記画像取得手段により取得された前記第1画像に含まれる前記認証対象者の顔認証を実行させ、
前記特徴情報抽出手段は、前記画像取得手段により取得された前記第1画像に含まれる前記認証対象者のうち顔以外の非顔領域から当該非顔領域の第1特徴情報を抽出し、かつ、前記画像取得手段により取得された前記第2画像に含まれる前記認証対象者のうち顔以外の非顔領域から当該非顔領域の第2特徴情報を抽出し、
前記登録手段は、前記特徴情報抽出手段により抽出された前記第1特徴情報と前記顔認証の結果とを対応づけて登録し、
前記照合手段は、前記特徴情報抽出手段により抽出された前記第2特徴情報と前記登録手段により登録された前記第1特徴情報とを照合し、
前記映写制御手段は、前記映写装置に、前記登録手段により登録された前記顔認証の結果を前記映写領域特定手段により特定された前記映写領域に映写させる請求項1に記載の認証制御装置。 - 前記画像取得手段により取得された前記画像に含まれる前記認証対象者の顔領域を検出する顔領域検出処理を実行する顔領域検出手段をさらに備え、
前記映写制御手段は、前記顔領域検出手段による前記顔領域検出処理の実行の結果、前記認証対象者の顔領域を検出できなかった場合、前記映写装置に、顔領域が検出できなかった旨を前記映写領域特定手段により特定された前記映写領域に映写させる請求項1から4のいずれか1項に記載の認証制御装置。 - 前記映写領域特定手段は、前記映写領域として前記画像取得手段により取得された前記画像に含まれる前記認証対象者から所定距離離れた領域を特定する請求項1から5のいずれか1項に記載の認証制御装置。
- 前記所定距離離れた領域は、前記認証対象者から歩行方向に所定距離離れた床面上の領域である請求項6に記載の認証制御装置。
- 前記映写領域特定手段は、前記映写領域として前記画像取得手段により取得された前記画像に含まれる前記認証対象者を含む領域を特定する請求項1から5のいずれか1項に記載の認証制御装置。
- 映写制御手段は、前記顔認証が成功した場合、映写装置に、前記顔認証の結果として前記顔認証が成功した前記認証対象者のプライバシ保護情報を前記映写領域特定手段により特定された前記映写領域に映写させる請求項1から8のいずれか1項に記載の認証制御装置。
- 監視領域を撮影するカメラにより撮影された前記監視領域内を映写ゾーンに向かって歩行する認証対象者を含む画像を取得する画像取得手段と、
顔認証を実行する認証装置に、前記画像取得手段により取得された前記画像に含まれる前記認証対象者の顔認証を実行させる認証制御手段と、
前記画像取得手段により取得された前記画像に含まれる前記認証対象者が前記映写ゾーンに到達した場合、前記映写装置に、前記顔認証の結果を前記映写ゾーンに映写させる映写制御手段と、を備える認証制御装置。 - 前記映写ゾーンのうち前記顔認証の結果が映写される映写領域を特定する映写領域特定手段をさらに備え、
前記映写制御手段は、前記画像取得手段により取得された前記画像に含まれる前記認証対象者が前記映写ゾーンに到達した場合、前記映写装置に、前記顔認証の結果を前記映写領域特定手段により特定された前記映写領域に映写させる請求項10に記載の認証制御装置。 - 前記認証対象者を追跡する認証対象者追跡手段をさらに備え、
前記映写領域特定手段は、前記認証対象者追跡手段により追跡される前記認証対象者の前記映写領域を特定する請求項10又は11に記載の認証制御装置。 - 前記認証対象者追跡手段は、前記画像取得手段により取得された前記画像に含まれる前記認証対象者の顔以外の非顔領域を追跡する請求項12に記載の認証制御装置。
- 前記画像取得手段により取得された前記画像に含まれる前記認証対象者のうち顔以外の非顔領域から当該非顔領域の特徴情報を抽出する特徴情報抽出手段と、
前記特徴情報抽出手段により抽出された前記非顔領域の特徴情報と前記顔認証の結果とを対応づけて登録する登録手段と、
前記特徴情報抽出手段により抽出された前記特徴情報と前記登録手段により登録された前記特徴情報とを照合する照合手段と、をさらに備え、
前記画像取得手段は、前記画像として第1画像及び当該第1画像が撮影された後に撮影された第2画像を取得し、
前記認証制御手段は、前記認証装置に、前記画像取得手段により取得された前記第1画像に含まれる前記認証対象者の顔認証を実行させ、
前記特徴情報抽出手段は、前記画像取得手段により取得された前記第1画像に含まれる前記認証対象者のうち顔以外の非顔領域から当該非顔領域の第1特徴情報を抽出し、かつ、前記画像取得手段により取得された前記第2画像に含まれる前記認証対象者のうち顔以外の非顔領域から当該非顔領域の第2特徴情報を抽出し、
前記登録手段は、前記特徴情報抽出手段により抽出された前記第1特徴情報と前記顔認証の結果とを対応づけて登録し、
前記照合手段は、前記特徴情報抽出手段により抽出された前記第2特徴情報と前記登録手段により登録された前記第1特徴情報とを照合し、
前記映写制御手段は、前記照合手段の照合結果が一致した前記認証対象者が前記映写ゾーンに到達した場合、前記映写装置に、前記登録手段により登録された前記顔認証の結果を前記映写領域特定手段により特定された前記映写領域に映写させる請求項11に記載の認証制御装置。 - 前記映写領域特定手段は、前記映写領域として前記映写ゾーンに到達した前記認証対象者から所定距離離れた領域を特定する請求項10から14のいずれか1項に記載の認証制御装置。
- 前記所定距離離れた領域は、前記認証対象者から歩行方向に所定距離離れた床面上の領域である請求項15に記載の認証制御装置。
- 前記映写領域特定手段は、前記映写領域として前記映写ゾーンに到達した前記認証対象者を含む領域を特定する請求項10から14のいずれか1項に記載の認証制御装置。
- 映写制御手段は、前記顔認証が成功した場合、映写装置に、前記顔認証の結果として前記顔認証が成功した前記認証対象者のプライバシ保護情報を前記映写領域特定手段により特定された前記映写領域に映写させる請求項7から17のいずれか1項に記載の認証制御装置。
- 監視領域を撮影するカメラと、
映写装置と、
顔認証を実行する認証装置と、
前記カメラにより撮影された前記監視領域内を歩行する認証対象者を含む画像を取得する画像取得手段と、
前記認証装置に、前記画像取得手段により取得された前記画像に含まれる前記認証対象者の顔認証を実行させる認証制御手段と、
前記顔認証の結果が映写される映写領域を特定する映写領域特定手段と、
前記映写装置に、前記顔認証の結果を前記映写領域特定手段により特定された前記映写領域に映写させる映写制御手段と、を備える認証システム。 - ネットワークを介して互いに通信可能な前記カメラ、前記映写装置、前記認証装置及び認証制御装置を備え、
前記画像取得手段、前記認証制御手段、前記映写領域特定手段及び前記映写制御手段は、前記認証制御装置に設けられている請求項19に記載の認証システム。 - 監視領域を撮影するカメラと、
映写装置と、
顔認証を実行する認証装置と、
前記カメラにより撮影された前記監視領域内を映写ゾーンに向かって歩行する認証対象者を含む画像を取得する画像取得手段と、
前記認証装置に、前記画像取得手段により取得された前記画像に含まれる前記認証対象者の顔認証を実行させる認証制御手段と、
前記画像取得手段により取得された前記画像に含まれる前記認証対象者が前記映写ゾーンに到達した場合、前記映写装置に、前記顔認証の結果を前記映写ゾーンに映写させる映写制御手段と、を備える認証システム。 - ネットワークを介して互いに通信可能な前記カメラ、前記映写装置、前記認証装置及び認証制御装置を備え、
前記画像取得手段、前記認証制御手段及び前記映写制御手段は、前記認証制御装置に設けられている請求項21に記載の認証システム。 - 監視領域を撮影するカメラにより撮影された前記監視領域内を歩行する認証対象者を含む画像を取得する画像取得ステップと、
顔認証を実行する認証装置に、前記画像取得ステップにより取得された前記画像に含まれる前記認証対象者の顔認証を実行させる認証制御ステップと、
前記顔認証の結果が映写される映写領域を特定する映写領域特定ステップと、
映写装置に、前記顔認証の結果を前記映写領域特定ステップにより特定された前記映写領域に映写させる映写制御ステップと、を備える認証制御方法。 - 監視領域を撮影するカメラにより撮影された前記監視領域内を映写ゾーンに向かって歩行する認証対象者を含む画像を取得する画像取得ステップと、
顔認証を実行する認証装置に、前記画像取得ステップにより取得された前記画像に含まれる前記認証対象者の顔認証を実行させる認証制御ステップと、
前記画像取得ステップにより取得された前記画像に含まれる前記認証対象者が前記映写ゾーンに到達した場合、前記映写装置に、前記顔認証の結果を前記映写ゾーンに映写させる映写制御ステップと、を備える認証制御方法。 - 少なくとも1つのプロセッサを備えた電子デバイスに、
監視領域を撮影するカメラにより撮影された前記監視領域内を歩行する認証対象者を含む画像を取得する画像取得処理と、
顔認証を実行する認証装置に、前記画像取得処理により取得された前記画像に含まれる前記認証対象者の顔認証を実行させる認証制御処理と、
前記顔認証の結果が映写される映写領域を特定する映写領域特定処理と、
映写装置に、前記顔認証の結果を前記映写領域特定処理により特定された前記映写領域に映写させる映写制御処理と、を実行させるためのプログラムを記録したコンピュータ読取可能な記録媒体。 - 少なくとも1つのプロセッサを備えた電子デバイスに、
監視領域を撮影するカメラにより撮影された前記監視領域内を映写ゾーンに向かって歩行する認証対象者を含む画像を取得する画像取得処理と、
顔認証を実行する認証装置に、前記画像取得処理により取得された前記画像に含まれる前記認証対象者の顔認証を実行させる認証制御処理と、
前記画像取得処理により取得された前記画像に含まれる前記認証対象者が前記映写ゾーンに到達した場合、前記映写装置に、前記顔認証の結果を前記映写ゾーンに映写させる映写制御処理と、を実行させるためのプログラムを記録したコンピュータ読取可能な記録媒体。
Priority Applications (9)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2022512959A JP7571785B2 (ja) | 2020-03-31 | 2020-03-31 | 認証制御装置、認証システム、認証制御方法及びプログラム |
| US17/911,717 US12444238B2 (en) | 2020-03-31 | 2020-03-31 | Authentication control device, authentication system, authentication control method and non-transitory computer readable medium |
| PCT/JP2020/014738 WO2021199234A1 (ja) | 2020-03-31 | 2020-03-31 | 認証制御装置、認証システム、認証制御方法及び記録媒体 |
| US18/383,492 US20240054819A1 (en) | 2020-03-31 | 2023-10-25 | Authentication control device, authentication system, authentication control method and non-transitory computer readable medium |
| US19/283,294 US20250356695A1 (en) | 2020-03-31 | 2025-07-29 | Authentication control device, authentication system, authentication control method and non-transitory computer readable medium |
| US19/283,300 US20250356696A1 (en) | 2020-03-31 | 2025-07-29 | Authentication control device, authentication system, authentication control method and non-transitory computer readable medium |
| US19/284,869 US20250356697A1 (en) | 2020-03-31 | 2025-07-30 | Authentication control device, authentication system, authentication control method and non-transitory computer readable medium |
| US19/285,008 US20250356699A1 (en) | 2020-03-31 | 2025-07-30 | Authentication control device, authentication system, authentication control method and non-transitory computer readable medium |
| US19/284,878 US20250356698A1 (en) | 2020-03-31 | 2025-07-30 | Authentication control device, authentication system, authentication control method and non-transitory computer readable medium |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2020/014738 WO2021199234A1 (ja) | 2020-03-31 | 2020-03-31 | 認証制御装置、認証システム、認証制御方法及び記録媒体 |
Related Child Applications (7)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/911,717 A-371-Of-International US12444238B2 (en) | 2020-03-31 | 2020-03-31 | Authentication control device, authentication system, authentication control method and non-transitory computer readable medium |
| US18/383,492 Continuation US20240054819A1 (en) | 2020-03-31 | 2023-10-25 | Authentication control device, authentication system, authentication control method and non-transitory computer readable medium |
| US19/283,300 Continuation US20250356696A1 (en) | 2020-03-31 | 2025-07-29 | Authentication control device, authentication system, authentication control method and non-transitory computer readable medium |
| US19/283,294 Continuation US20250356695A1 (en) | 2020-03-31 | 2025-07-29 | Authentication control device, authentication system, authentication control method and non-transitory computer readable medium |
| US19/285,008 Continuation US20250356699A1 (en) | 2020-03-31 | 2025-07-30 | Authentication control device, authentication system, authentication control method and non-transitory computer readable medium |
| US19/284,869 Continuation US20250356697A1 (en) | 2020-03-31 | 2025-07-30 | Authentication control device, authentication system, authentication control method and non-transitory computer readable medium |
| US19/284,878 Continuation US20250356698A1 (en) | 2020-03-31 | 2025-07-30 | Authentication control device, authentication system, authentication control method and non-transitory computer readable medium |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2021199234A1 true WO2021199234A1 (ja) | 2021-10-07 |
Family
ID=77928233
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2020/014738 Ceased WO2021199234A1 (ja) | 2020-03-31 | 2020-03-31 | 認証制御装置、認証システム、認証制御方法及び記録媒体 |
Country Status (3)
| Country | Link |
|---|---|
| US (7) | US12444238B2 (ja) |
| JP (1) | JP7571785B2 (ja) |
| WO (1) | WO2021199234A1 (ja) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2023281909A1 (ja) * | 2021-07-09 | 2023-01-12 | パナソニックIpマネジメント株式会社 | 通知装置および通知方法 |
| WO2024127955A1 (ja) * | 2022-12-16 | 2024-06-20 | パナソニックIpマネジメント株式会社 | 管理装置、管理システム及び管理方法 |
| WO2024154430A1 (ja) * | 2023-01-20 | 2024-07-25 | パナソニックIpマネジメント株式会社 | 人物照合システム、情報処理装置、人物照合方法、及び、人物照合プログラム |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2009032116A (ja) * | 2007-07-27 | 2009-02-12 | Toshiba Corp | 顔認証装置、顔認証方法および入退場管理装置 |
| JP2013235373A (ja) * | 2012-05-08 | 2013-11-21 | Sony Corp | 画像処理装置、投影制御方法及びプログラム |
| WO2016103560A1 (ja) * | 2014-12-25 | 2016-06-30 | パナソニックIpマネジメント株式会社 | 投影装置 |
Family Cites Families (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP3825273B2 (ja) | 2001-04-09 | 2006-09-27 | 日本電信電話株式会社 | 車輌セキュリティシステム、車輌およびセキュリティセンタ |
| JP2009087232A (ja) * | 2007-10-02 | 2009-04-23 | Toshiba Corp | 人物認証装置および人物認証方法 |
| JP2012156817A (ja) | 2011-01-27 | 2012-08-16 | Mitsubishi Electric Corp | 映像監視システム |
| JP5766096B2 (ja) | 2011-11-09 | 2015-08-19 | セコム株式会社 | 顔画像認証装置 |
| JP6148065B2 (ja) * | 2013-04-30 | 2017-06-14 | セコム株式会社 | 顔認証システム |
-
2020
- 2020-03-31 WO PCT/JP2020/014738 patent/WO2021199234A1/ja not_active Ceased
- 2020-03-31 JP JP2022512959A patent/JP7571785B2/ja active Active
- 2020-03-31 US US17/911,717 patent/US12444238B2/en active Active
-
2023
- 2023-10-25 US US18/383,492 patent/US20240054819A1/en not_active Abandoned
-
2025
- 2025-07-29 US US19/283,294 patent/US20250356695A1/en active Pending
- 2025-07-29 US US19/283,300 patent/US20250356696A1/en active Pending
- 2025-07-30 US US19/284,869 patent/US20250356697A1/en active Pending
- 2025-07-30 US US19/284,878 patent/US20250356698A1/en active Pending
- 2025-07-30 US US19/285,008 patent/US20250356699A1/en active Pending
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2009032116A (ja) * | 2007-07-27 | 2009-02-12 | Toshiba Corp | 顔認証装置、顔認証方法および入退場管理装置 |
| JP2013235373A (ja) * | 2012-05-08 | 2013-11-21 | Sony Corp | 画像処理装置、投影制御方法及びプログラム |
| WO2016103560A1 (ja) * | 2014-12-25 | 2016-06-30 | パナソニックIpマネジメント株式会社 | 投影装置 |
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2023281909A1 (ja) * | 2021-07-09 | 2023-01-12 | パナソニックIpマネジメント株式会社 | 通知装置および通知方法 |
| JP2023010180A (ja) * | 2021-07-09 | 2023-01-20 | パナソニックIpマネジメント株式会社 | 通知装置および通知方法 |
| JP7702664B2 (ja) | 2021-07-09 | 2025-07-04 | パナソニックIpマネジメント株式会社 | 通知装置および通知方法 |
| US12494101B2 (en) | 2021-07-09 | 2025-12-09 | Panasonic Intellectual Property Management Co., Ltd. | Notification device and notification method |
| WO2024127955A1 (ja) * | 2022-12-16 | 2024-06-20 | パナソニックIpマネジメント株式会社 | 管理装置、管理システム及び管理方法 |
| WO2024154430A1 (ja) * | 2023-01-20 | 2024-07-25 | パナソニックIpマネジメント株式会社 | 人物照合システム、情報処理装置、人物照合方法、及び、人物照合プログラム |
Also Published As
| Publication number | Publication date |
|---|---|
| US20250356699A1 (en) | 2025-11-20 |
| US20250356697A1 (en) | 2025-11-20 |
| JPWO2021199234A1 (ja) | 2021-10-07 |
| JP7571785B2 (ja) | 2024-10-23 |
| US12444238B2 (en) | 2025-10-14 |
| US20250356698A1 (en) | 2025-11-20 |
| US20250356695A1 (en) | 2025-11-20 |
| US20250356696A1 (en) | 2025-11-20 |
| US20230116514A1 (en) | 2023-04-13 |
| US20240054819A1 (en) | 2024-02-15 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US9959454B2 (en) | Face recognition device, face recognition method, and computer-readable recording medium | |
| JP7484985B2 (ja) | 認証システム、認証方法、及び、プログラム | |
| US20250356699A1 (en) | Authentication control device, authentication system, authentication control method and non-transitory computer readable medium | |
| CN107438173A (zh) | 视频处理装置、视频处理方法和存储介质 | |
| CN106937532B (zh) | 用于检测真正用户的系统和方法 | |
| JP2004213087A (ja) | 個人認証装置および個人認証方法 | |
| JP2012208610A (ja) | 顔画像認証装置 | |
| JP2013030078A (ja) | 顔画像認証装置 | |
| WO2021176593A1 (ja) | 滞在管理装置、滞在管理方法、プログラムが格納された非一時的なコンピュータ可読媒体、及び滞在管理システム | |
| JP7327923B2 (ja) | 情報処理装置、情報処理方法、システムおよびプログラム | |
| WO2021060256A1 (ja) | 顔認証装置、顔認証方法、及びコンピュータ読み取り可能な記録媒体 | |
| JP7494903B2 (ja) | 認証制御装置、認証システム、認証制御方法及びプログラム | |
| JP7605218B2 (ja) | 画像処理装置、画像処理方法、プログラム | |
| JP2012003686A (ja) | 認証装置、認証方法、及び認証プログラム、並びに記録媒体 | |
| JP7067593B2 (ja) | 情報処理システム、認証対象の管理方法、及びプログラム | |
| JP2017062757A (ja) | コントローラシステム、その支援装置 | |
| KR102301785B1 (ko) | 얼굴 연속 인증을 위한 방법 및 장치 | |
| JP2014071684A (ja) | 顔画像認証装置 | |
| JP7327571B2 (ja) | 情報処理システム、端末装置、認証対象の管理方法、及びプログラム | |
| WO2021192101A1 (ja) | 認証制御装置、情報処理装置、認証システム、認証制御方法及び記録媒体 | |
| WO2025046784A1 (ja) | 人物認証プログラム、人物認証方法、及び人物認証装置 | |
| JP2024103479A (ja) | 人物照合システム、情報処理装置、人物照合方法、及び、人物照合プログラム | |
| WO2025224788A1 (ja) | 情報処理装置、情報処理方法、及び記録媒体 | |
| WO2021125432A1 (ko) | 얼굴 연속 인증을 위한 방법 및 장치 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20929485 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2022512959 Country of ref document: JP Kind code of ref document: A |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 20929485 Country of ref document: EP Kind code of ref document: A1 |
|
| WWG | Wipo information: grant in national office |
Ref document number: 17911717 Country of ref document: US |