
AU2017203772B2 - A method of automatically calibrating an array of cameras, and a corresponding installation


Info

Publication number
AU2017203772B2
Authority
AU
Australia
Prior art keywords
cameras
semantically
corner
characteristic points
defined characteristic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
AU2017203772A
Other versions
AU2017203772A1 (en)
Inventor
Christelle Maria France BAUDRY
Vincent Despiegel
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Idemia Public Security France
Original Assignee
Idemia Public Security France
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Idemia Public Security France
Publication of AU2017203772A1
Application granted
Publication of AU2017203772B2
Assigned to IDEMIA IDENTITY & SECURITY FRANCE (Request to Amend Deed and Register; Assignors: SAFRAN IDENTITY & SECURITY)
Assigned to IDEMIA PUBLIC SECURITY FRANCE (Request for Assignment; Assignors: IDEMIA IDENTITY & SECURITY FRANCE)
Legal status: Active
Anticipated expiration


Classifications

    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 - Stereo camera calibration
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/10 - Image acquisition
    • G06V10/12 - Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 - Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/147 - Details of sensors, e.g. sensor lenses
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 - Detection; Localisation; Normalisation
    • G06V40/165 - Detection; Localisation; Normalisation using facial parts and geometric relationships
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 - Detection; Localisation; Normalisation
    • G06V40/166 - Detection; Localisation; Normalisation using acquisition arrangements
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00 - Diagnosis, testing or measuring for television systems or their details
    • H04N17/002 - Diagnosis, testing or measuring for television systems or their details for television cameras
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/61 - Control of cameras or camera modules based on recognised objects
    • H04N23/611 - Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90 - Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 - User authentication
    • G06F21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04L - TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 - Network architectures or network communication protocols for network security
    • H04L63/08 - Network architectures or network communication protocols for network security for authentication of entities
    • H04L63/0861 - Network architectures or network communication protocols for network security for authentication of entities using biometrical features, e.g. fingerprint, retina-scan

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Signal Processing (AREA)
  • Geometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Vascular Medicine (AREA)
  • Biomedical Technology (AREA)
  • Collating Specific Patterns (AREA)
  • Image Analysis (AREA)

Abstract

A method of automatically calibrating at least one first camera and at least one second camera that have fields of view covering a location and presenting an overlap zone, and that are connected to a processor unit executing a program arranged to detect semantically-defined characteristic points in an element, the method comprising the steps of:
- capturing respective images simultaneously with two cameras;
- detecting in each image, in the overlap zone, semantically-defined characteristic points; and
- putting two semantically-defined characteristic points of the same type, each detected in one of the images, into correspondence in order to deduce therefrom a relationship between the cameras.

Description

[Sole figure (sheet 1/1): diagrammatic plan view of the installation; reference numerals 1, 3, 50, 51, 52, 53, 101, 102, 103, 1012 and 1023 appear on the drawing.]
A METHOD OF AUTOMATICALLY CALIBRATING AN ARRAY OF CAMERAS, AND A CORRESPONDING INSTALLATION
The present invention relates to managing an array of cameras, e.g. an array suitable for biometric recognition purposes.
STATE OF THE PRIOR ART
There exist installations having cameras with fields of view that cover a location while presenting an overlap zone and that are connected to a processor unit running a program for biometric face recognition. The program is arranged to detect a face in the images supplied thereto by the cameras. The program is also arranged to detect semantically-defined points in that face, such as: the right corner of the mouth; the left corner of the mouth; the outside corner of the left eye; the inside corner of the left eye; the outside corner of the right eye; the inside corner of the right eye; and the tip of the nose. These and other characteristic points are used for putting the image of the face into a reference position and subsequently for determining whether the face corresponds to a face having its biometric characteristics stored in a database or in an identity document.
By way of example, the location in question is an entry airlock giving access to a secure enclosure, access to which is authorized only for people recognized by the biometric recognition program. To improve the reliability of such recognition, it is important to be able to determine the position of the person in the environment, which can be done with a calibrated array of cameras. This calibration operation is presently performed by an operator when the installation is initially put into service, and periodically during maintenance operations. The operation is difficult and, if it is not repeated regularly, it cannot prevent drift from appearing over time during operation of the installation.
OBJECT OF THE INVENTION
An object of the invention is to simplify the calibration of such cameras.
BRIEF SUMMARY OF THE INVENTION
To this end, the invention provides a method of automatically calibrating at least one first camera and at least one second camera that have fields of view covering a location and presenting an overlap zone, and that are connected to a processor unit executing a program arranged to detect semantically-defined characteristic points in an element. The method comprises the steps of:
- capturing respective images simultaneously with the two cameras;
- detecting in each image, in the overlap zone, semantically-defined characteristic points; and
- putting two semantically-defined characteristic points of the same type, each detected in one of the images, into correspondence in order to deduce therefrom a relationship between the cameras.
Thus, setting up correspondence in the overlap zones makes it possible to create a relationship between the fields of view of the two cameras by associating a given point in three-dimensional space with the same semantically-defined point as detected in each of the images.
According to one embodiment, the semantically-defined characteristic points comprise at least one of the following points:
- right corner of the mouth;
- left corner of the mouth;
- outside corner of the left eye;
- outside corner of the right eye;
- inside corner of the left eye;
- inside corner of the right eye;
- inside center of the left eye;
- inside center of the right eye;
- tip of the nose.
Other characteristics and advantages of the invention appear from reading the following description of particular, non-limiting embodiments of the invention.
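By way of an illustrative aside (a minimal sketch, not taken from the patent text), the correspondence step described above amounts to pairing landmark detections that carry the same semantic label. The dictionary format, the label strings, and the function name below are assumptions made for illustration only.

```python
# Minimal sketch of the correspondence step: pair characteristic points of the
# same semantic type detected by two cameras. The landmark detector itself is
# assumed to exist and to return, for one image, a mapping from a semantic
# label to pixel coordinates, e.g. {"tip_of_nose": (412.0, 233.5), ...}.
from typing import Dict, List, Tuple

Point = Tuple[float, float]

def match_by_semantic_type(
    landmarks_cam1: Dict[str, Point],
    landmarks_cam2: Dict[str, Point],
) -> List[Tuple[str, Point, Point]]:
    """Keep only the labels detected in both images (i.e. in the overlap zone)."""
    common = sorted(landmarks_cam1.keys() & landmarks_cam2.keys())
    return [(label, landmarks_cam1[label], landmarks_cam2[label]) for label in common]
```

For instance, if one camera yields only the tip of the nose, the outside corner of the right eye, and the right corner of the mouth, while the other yields all of the listed points, only those three shared labels are returned as correspondences.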
BRIEF DESCRIPTION OF THE FIGURE
Reference is made to the sole accompanying figure, which is a diagrammatic plan view of an installation for recognizing faces that enables the method of the invention to be performed.
DETAILED DESCRIPTION OF THE INVENTION
With reference to the figure, the installation of the invention comprises at least a first camera 1, a second camera 2, and a third camera 3, which are connected to a processor unit 4 executing a program for biometric face recognition that is capable of detecting, in an image representing a face, semantically-defined characteristic points and biometric data of the face. By way of example, the semantically-defined characteristic points may comprise: the right corner of the mouth; the left corner of the mouth; the outside corner of the left eye; the inside corner of the left eye; the outside corner of the right eye; the inside corner of the right eye; and the tip of the nose. Such a program is itself known. By way of example, the biometric recognition program makes use of an algorithm of the scale-invariant feature transform (SIFT) type or of the speeded-up robust features (SURF) type.
The cameras are positioned in a passage 50 so as to have fields of view 101, 102, and 103 that, in pairs, present overlap zones 1012 and 1023. More precisely, the field of view 101 covers the field of view 102 in the zone 1012; the field of view 102 covers the field of view 103 in the zone 1023; and the fields of view 101 and 103 do not overlap.
The passage 50 is closed at both ends, respectively by an entry door 51 and by an exit door 52, so as to form an airlock. Opening of the entry door 51 is controlled by presenting an identity document for processing to a detector 53 connected to the processor unit 4. The detector 53 is arranged to extract from the identity document biometric data of the face of the bearer of the identity document and to transmit that data to the processor unit 4. The data is contained either in a photograph integral with the identity document or else in an integrated circuit incorporated in the identity document. The exit door 52 is opened under the control of the processor unit 4 when the biometric data extracted by the processor unit 4 from the images provided by the cameras 1, 2, and 3 corresponds to the biometric data from the identity document. This nominal mode of operation is itself known and is not described in greater detail herein.
For automatic face recognition to perform well in this nominal mode of operation, the cameras 1, 2, and 3 need to be calibrated. By way of example, such calibration enables the processor unit 4 to be sure that a face detected in an image provided by each camera 1, 2, and 3 is indeed to be found in the passage and not outside it, and to estimate the size of the face in the image in order to verify that it matches theoretical anthropometric data. This enables the processor unit to detect an attempted fraud.
In the method of the invention, this calibration is performed automatically by the processor unit 4 during a calibration operation initiated by the installer when putting the installation into service, and then during calibration operations that are initiated periodically in autonomous manner by the processor unit 4 during normal operation of the installation. Each calibration operation comprises the steps of:
- capturing respective images simultaneously with two cameras;
- detecting in each image of a pair, in the overlap zone, semantically-defined characteristic points; and
- putting two semantically-defined characteristic points of the same type, each detected in one of the images of each pair, into correspondence in order to deduce therefrom a relationship between the cameras.
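As an illustrative aside (a minimal sketch, not part of the patent text), the relationship deduced from such correspondences can, for example, be modelled as a fundamental matrix; the patent does not prescribe this particular form, and the use of OpenCV and NumPy below is an assumption. Because a single face provides only a handful of labelled points, the sketch also assumes that correspondences have been accumulated over several captures before the estimation is run.

```python
# Minimal sketch (assumptions: OpenCV and NumPy are available, and the
# inter-camera "relationship" is modelled as a fundamental matrix).
import cv2
import numpy as np

def estimate_relationship(correspondences):
    """Estimate a fundamental matrix from accumulated same-type landmark pairs.

    `correspondences` is a list of (label, (x1, y1), (x2, y2)) tuples, such as
    those produced by match_by_semantic_type() over several calibration runs.
    Returns (F, inlier_mask), or (None, None) if too few pairs are available.
    """
    if len(correspondences) < 8:  # RANSAC estimation of F needs at least 8 pairs
        return None, None
    pts1 = np.float32([p1 for _, p1, _ in correspondences])
    pts2 = np.float32([p2 for _, _, p2 in correspondences])
    F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.99)
    if F is None:
        return None, None
    return F, mask.ravel().astype(bool)
```

In the worked example that follows, a single pair of images from the cameras 1 and 2 would contribute three such correspondences; the estimation above would be run once enough pairs have been gathered over successive captures.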
For example, the processor unit 4 analyzes two images captured at the same instant respectively by the camera 1 and by the camera 2 in order to detect therein semantically-defined characteristic points. In the image from the camera 1 there are detected: a tip of a nose, an outside corner of a right eye, and a right corner of a mouth. In the image from the camera 2 there are detected: a tip of a nose, an outside corner of a right eye, an inside corner of a right eye, an outside corner of a left eye, an inside corner of a left eye, a right corner of a mouth, and a left corner of a mouth. The processor unit 4 thus establishes correspondences between the semantically-defined characteristic points that are common to both images by comparing the positions of the tip of the nose, of the outside corner of the right eye, and of the right corner of the mouth in the image supplied by the camera 1 and in the image supplied by the camera 2. These correspondences are established by associating a point detected by each of the cameras (e.g. the corner of the right eye) with the same point in three-dimensional space. This makes it possible to create a relationship between the fields of view of the two cameras.
After performing the method several times over, a step of analyzing the previously-established correspondences is performed in order to deduce therefrom a quality factor for the correspondences.
Naturally, the invention is not limited to the implementations described, but covers any variant coming within the ambit of the invention as defined by the claims. In particular, the invention is not limited to applications for biometric face recognition: it is applicable to an installation that does not have biometric recognition means, e.g. an installation for video surveillance. It should be observed that, in a variant of the invention, it is also possible to deduce the relative positions of the cameras and to deduce therefrom a treatment to be applied to the images in order to improve biometric performance (e.g. to straighten up an image when it is known that it has been taken as a low-angle shot by a camera). The installation may have a different structure and may have some other number of cameras, provided there are at least two cameras.
In this specification, the terms "comprise", "comprises", "comprising" or similar terms are intended to mean a non-exclusive inclusion, such that a system, method or apparatus that comprises a list of elements does not include those elements solely, but may well include other elements not listed. The reference to any prior art in this specification is not, and should not be taken as, an acknowledgement or any form of suggestion that the prior art forms part of the common general knowledge.
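As a final illustrative aside on the quality-factor step described above (again a sketch, not part of the patent text: the patent does not prescribe a formula), one possible quality factor combines the fraction of correspondences retained as inliers with how closely those inliers fit the estimated epipolar geometry. OpenCV and NumPy are assumed, as are the hypothetical match_by_semantic_type() and estimate_relationship() helpers sketched earlier.

```python
# Minimal sketch of one possible quality factor for previously-established
# correspondences, given a fundamental matrix F and an inlier mask such as
# those returned by the hypothetical estimate_relationship() above.
import cv2
import numpy as np

def correspondence_quality(F, correspondences, inlier_mask):
    """Return a value between 0 and 1: close to 1 when most pairs are inliers
    lying near their epipolar lines; lower as outliers or residuals grow."""
    pts1 = np.float32([p1 for _, p1, _ in correspondences]).reshape(-1, 1, 2)
    pts2 = np.float32([p2 for _, _, p2 in correspondences])
    # Epipolar lines in image 2 induced by the points detected in image 1.
    lines2 = cv2.computeCorrespondEpilines(pts1, 1, F).reshape(-1, 3)
    num = np.abs(lines2[:, 0] * pts2[:, 0] + lines2[:, 1] * pts2[:, 1] + lines2[:, 2])
    dist = num / np.hypot(lines2[:, 0], lines2[:, 1])
    inlier_ratio = float(inlier_mask.mean())
    mean_residual = float(dist[inlier_mask].mean()) if inlier_mask.any() else float("inf")
    return inlier_ratio / (1.0 + mean_residual)
```

A low value from such a measure would suggest that the previously-established correspondences are unreliable, so that a further calibration operation could be carried out before the deduced relationship is relied upon.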

Claims (3)

CLAIMS
1. A method of automatically calibrating at least one first camera and at least one second camera that have fields of view covering a location and presenting an overlap zone, and that are connected to a processor unit executing a program arranged to detect semantically defined characteristic points in an element, the element being a face and the program being a program for biometric face recognition, wherein the semantically defined characteristic points comprise at least one of the following points:
- right corner of the mouth;
- left corner of the mouth;
- outside corner of the left eye;
- outside corner of the right eye;
- inside corner of the left eye;
- inside corner of the right eye;
- inside center of the left eye;
- inside center of the right eye;
- tip of the nose,
the method comprising the steps of:
- capturing respective images simultaneously with two cameras;
- detecting in each image, in the overlap zone, semantically-defined characteristic points; and
- putting two semantically-defined characteristic points of the same type, each detected in one of the images, into correspondence in order to deduce therefrom a relationship between the cameras.
2. A method according to claim 1, wherein, after performing the method several times over, a step is performed of analyzing the previously-established correspondences in order to deduce therefrom a quality factor for the correspondences.
3. A method according to claim 1, wherein the program is a biometric recognition program.
AU2017203772A 2016-06-07 2017-06-05 A method of automatically calibrating an array of cameras, and a corresponding installation Active AU2017203772B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR1655215A FR3052278B1 (en) 2016-06-07 2016-06-07 METHOD FOR SELF-CALIBRATION OF A NETWORK OF CAMERAS AND CORRESPONDING INSTALLATION
FR1655215 2016-06-07

Publications (2)

Publication Number Publication Date
AU2017203772A1 AU2017203772A1 (en) 2017-12-21
AU2017203772B2 true AU2017203772B2 (en) 2022-06-16

Family

ID=57396516

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2017203772A Active AU2017203772B2 (en) 2016-06-07 2017-06-05 A method of automatically calibrating an array of cameras, and a corresponding installation

Country Status (3)

Country Link
EP (1) EP3255876B1 (en)
AU (1) AU2017203772B2 (en)
FR (1) FR3052278B1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070008405A1 (en) * 2003-07-28 2007-01-11 Ryad Benosman Method for calibrating at least two video cameras relatively to each other for stereoscopic filming and device therefor
EP2309451A1 (en) * 2009-09-25 2011-04-13 Deutsche Telekom AG Method and system for self-calibration of asynchronized camera networks
US9303525B2 (en) * 2010-03-26 2016-04-05 Alcatel Lucent Method and arrangement for multi-camera calibration

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8897502B2 (en) * 2011-04-29 2014-11-25 Aptina Imaging Corporation Calibration for stereoscopic capture system


Also Published As

Publication number Publication date
EP3255876B1 (en) 2023-09-06
FR3052278B1 (en) 2022-11-25
FR3052278A1 (en) 2017-12-08
AU2017203772A1 (en) 2017-12-21
EP3255876A1 (en) 2017-12-13
EP3255876C0 (en) 2023-09-06

Similar Documents

Publication Publication Date Title
JP6483485B2 (en) Person authentication method
US8582833B2 (en) Method and apparatus for detecting forged face using infrared image
JP6151582B2 (en) Face recognition system
JP6409929B1 (en) Verification system
US20070291998A1 (en) Face authentication apparatus, face authentication method, and entrance and exit management apparatus
KR20190001066A (en) Face verifying method and apparatus
US20210056289A1 (en) Face authentication apparatus
JP6544404B2 (en) Matching system
JP7484985B2 (en) Authentication system, authentication method, and program
JP4521086B2 (en) Face image recognition apparatus and face image recognition method
CN104090656A (en) Eyesight protecting method and system for smart device
CN107333107A (en) Monitor image pickup method, device and its equipment
WO2019151116A1 (en) Information processing device
JP2008158678A (en) Person authentication device, person authentication method, and entrance / exit management system
JP2007249298A (en) Face authentication apparatus and face authentication method
JP6947202B2 (en) Matching system
US11048915B2 (en) Method and a device for detecting fraud by examination using two different focal lengths during automatic face recognition
JP2007206898A (en) Face authentication device and entrance / exit management device
JP2019087932A (en) Information processing system
JP7327923B2 (en) Information processing device, information processing method, system and program
US10990817B2 (en) Method of detecting fraud during iris recognition
JP2020064531A5 (en)
AU2017203772B2 (en) A method of automatically calibrating an array of cameras, and a corresponding installation
JP2020063659A (en) Information processing system
US20160275666A1 (en) Semiconductor device and camera apparatus

Legal Events

Date Code Title Description
FGA Letters patent sealed or granted (standard patent)
HB Alteration of name in register

Owner name: IDEMIA IDENTITY & SECURITY FRANCE

Free format text: FORMER NAME(S): SAFRAN IDENTITY & SECURITY

PC Assignment registered

Owner name: IDEMIA PUBLIC SECURITY FRANCE

Free format text: FORMER OWNER(S): IDEMIA IDENTITY & SECURITY FRANCE