WO2024202017A1 - Determination method, determination program, and information processing device - Google Patents
Determination method, determination program, and information processing device
- Publication number
- WO2024202017A1 (PCT/JP2023/013594)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- person
- certainty
- value
- services
- location information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- The present disclosure relates to a determination method, a determination program, and an information processing device.
- A method has been disclosed that determines an authentication score threshold for each service type in order to permit use of the service (see, for example, Patent Document 1).
- In this method, an authentication processing device recognizes the type of service selected by an individual, determines the authentication score required to permit use of that service, and acquires biometric information. The authentication processing device then determines whether the individual is permitted to use the selected service type. However, in this case, the processing load required for the authentication process is large.
- In one aspect, the present invention aims to provide a determination method, a determination program, and an information processing device that can reduce the processing load required for authentication processing.
- In the determination method, a computer executes a process of: calculating, when identification information identifying a person detected from an image is acquired, a certainty value indicating the likelihood that the identification information corresponds to the person; and, when location information of the person is acquired, referring to a storage unit that stores a certainty threshold and location information in association with each of a plurality of services, and determining whether the calculated certainty value is equal to or greater than the certainty threshold of the service, among the plurality of services, that corresponds to the location information of the person.
- In one aspect, the processing load required for authentication processing can be reduced.
- FIG. 1 is a diagram illustrating an example of an authentication space.
- FIG. 2(a) is a block diagram illustrating an example of an overall configuration of a biometric authentication system according to a first embodiment.
- FIG. 2(b) is a functional block diagram illustrating each function of an information processing device.
- FIG. 3 is a flowchart showing an ID linking process.
- FIG. 4 is a diagram illustrating an example of an ID table stored in a storage unit.
- FIG. 5 is a diagram illustrating a location information table stored in a storage unit.
- FIG. 6 is a flowchart showing a service providing process.
- FIG. 7 is a diagram illustrating an example of a case where feature information of a specific person is extracted from a plurality of images.
- FIG. 8 is a block diagram illustrating a hardware configuration of an information processing device.
- Technology that uses biometric authentication to provide services such as payment is known.
- With such technology, customers can shop smoothly without having to present cash or a payment code.
- Biometric authentication used for payment requires a high certainty value, that is, a high degree of confidence that the detected person is the registered person, so a strict certainty threshold may be set to reduce the false acceptance rate.
- In contrast, biometric authentication used for electronic billboards can tolerate a lower certainty value than payment, so a looser certainty threshold may be set to reduce the false rejection rate and lighten the burden of authentication processing.
- More generally, the required certainty value may differ depending on the service being provided. For example, when multiple services are provided at an airport, a high certainty value is required for passing through the entrance gate, checking in baggage, going through security, purchasing items, and so on. On the other hand, a lower certainty value may suffice for providing flight information, entering a lounge, providing campaign information, providing service items, and so on.
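- To make the differing requirements concrete, the sketch below shows a per-service certainty-threshold table in Python. The service names and threshold values are illustrative assumptions, not values taken from the specification.

```python
# Hypothetical per-service certainty thresholds for an airport deployment.
# Security- and payment-related services get strict thresholds to keep the
# false acceptance rate low; informational services get looser thresholds
# to keep the false rejection rate (and processing burden) low.
SERVICE_THRESHOLDS = {
    "entrance_gate":   0.99,
    "baggage_checkin": 0.98,
    "security_check":  0.99,
    "purchase":        0.97,
    "flight_info":     0.80,
    "lounge_entry":    0.85,
    "campaign_info":   0.75,
}

def required_threshold(service: str) -> float:
    """Certainty threshold a person must meet to use the given service."""
    return SERVICE_THRESHOLDS[service]
```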
- FIG. 1 is a diagram illustrating an example of an authentication space.
- A person who enters the authentication space of FIG. 1 registers an ID together with feature information that can be acquired by a camera.
- The authentication space is a space in which multiple services are provided.
- The certainty threshold required by each of the service providing terminals 130a to 130e is set individually and may differ between terminals or be the same.
- The certainty value is the degree of confidence that the person detected by a camera is the registered person: the higher the certainty value, the higher the probability that the detected person is that person.
- Multiple tracking cameras 120 are installed at different positions in the authentication space. People can be detected from the images captured by each tracking camera 120. A detected person can be estimated to be the holder of the ID whose registered feature information has the highest similarity to the feature information extracted from the image.
- The position information of each tracking camera 120 is stored, so the position of each person can be detected by identifying which tracking camera 120 the person appears in. From each person's position, it can then be determined which service providing terminal that person is in a position to receive services from.
- For example, by appearing in an image captured by a specific tracking camera 120, person A is detected as being located near the service providing terminal 130a and thus able to receive services from it. In this case, it is determined whether the certainty value of person A is equal to or greater than the certainty threshold required by the service providing terminal 130a. If it is, person A can receive services from the service providing terminal 130a.
- Meanwhile, person B is detected as being located near the multiple service providing terminals 130b to 130d and thus able to receive services from any of them.
- FIG. 2(a) is a block diagram illustrating an example of the overall configuration of a biometric authentication system 200 according to the first embodiment.
- The biometric authentication system 200 includes an information processing device 100, a linking camera 110, a tracking camera 120, and a service providing terminal 130. These devices are connected to one another via a telecommunications line.
- The linking camera 110 is a camera installed at the gate of the authentication space or the like, in a position where feature information of a person is easy to obtain. There may be one or more linking cameras 110.
- The tracking camera 120 is a camera for tracking a person in the authentication space, and is installed on the ceiling or the like so that people are easy to track. There may be one or more tracking cameras 120.
- The service providing terminal 130 is a terminal that provides services to users in the authentication space. There may be multiple service providing terminals 130.
- FIG. 2(b) is a functional block diagram showing each function of the information processing device 100.
- The information processing device 100 functions as an acquisition unit 10, a feature extraction unit 20, a storage unit 30, a certainty calculation unit 40, a position detection unit 50, a determination unit 60, an output unit 70, and the like.
- (ID Linking Process) FIG. 3 is a flowchart showing the ID linking process.
- As shown in FIG. 3, the acquisition unit 10 acquires an image from the linking camera 110 when each user enters the authentication space (step S1).
- To register his or her feature information, each user takes a posture that makes the feature information easy to acquire. For example, each user faces the linking camera 110 directly.
- The feature extraction unit 20 extracts, from the image acquired in step S1, feature information x_(i,0) serving as identification information for identifying a person, and stores it in the storage unit 30 linked to an ID (step S2).
- FIG. 4 is a diagram illustrating an example of the ID table stored in the storage unit 30. As illustrated in FIG. 4, the ID table links feature information to each ID. Note that because each user assumes a posture that makes feature information easy to obtain, the feature information stored in the storage unit 30 is highly accurate.
- FIG. 5 is a diagram illustrating the location information table stored in the storage unit 30. As illustrated in FIG. 5, the location information table links the identification information (camera ID) of each tracking camera 120 to the identification information (terminal ID) of each service providing terminal 130. In this manner, by detecting which tracking camera 120 a person appears in, it is possible to detect which service providing terminal that person can receive services from.
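- A minimal sketch of these two tables follows, with Python dictionaries standing in for the storage unit 30; all IDs and feature vectors are hypothetical.

```python
# ID table: user ID -> registered feature information x_{i,0}.
id_table = {
    "user_001": [0.12, 0.87, 0.45],
    "user_002": [0.93, 0.11, 0.38],
}

# Location information table: tracking-camera ID -> terminal IDs of the
# service providing terminals whose services are reachable from there.
location_table = {
    "cam_01": ["terminal_130a"],                                    # person A's case
    "cam_02": ["terminal_130b", "terminal_130c", "terminal_130d"],  # person B's case
}

def terminals_for(camera_id: str) -> list[str]:
    """Terminals a person seen by this camera is in a position to use."""
    return location_table.get(camera_id, [])
```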
- (Service Provision Process) FIG. 6 is a flowchart showing the service provision process performed when a service is provided to each user who has registered an ID. As shown in FIG. 6, the acquisition unit 10 acquires images from each tracking camera 120 at a predetermined interval (step S11).
- Next, the feature extraction unit 20 extracts feature information x_(t,j,k) of the kth person appearing in the image acquired by the jth tracking camera 120 at time t (step S12). By executing step S12, feature information is extracted for every person appearing in the images acquired by all tracking cameras 120.
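- A rough sketch of this per-frame extraction loop follows; the person detector and feature extractor are hypothetical stand-ins, since the specification does not name any particular model.

```python
def extract_all_features(frames, detect_people, extract_feature):
    """Steps S11-S12: compute feature information x_{t,j,k} for every
    person k detected in the frame of every tracking camera j.

    `frames` maps camera ID -> image; `detect_people` and `extract_feature`
    are injected callables (any detector / feature extractor will do).
    """
    features = {}  # (camera_id, person_index) -> feature vector
    for camera_id, image in frames.items():
        for k, person_crop in enumerate(detect_people(image)):
            features[(camera_id, k)] = extract_feature(person_crop)
    return features
```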
- Next, the certainty calculation unit 40 calculates the similarity D_(i,t,j,k) between each piece of feature information x_(i,0) stored in the storage unit 30 and each piece of feature information x_(t,j,k) extracted in step S12 (step S13).
- For example, cosine similarity can be used as the similarity.
- The certainty value is an index indicating the likelihood that the feature information corresponds to the person in question.
- The ID of the kth person is estimated as ID_(t,j,k) = argmax_i(D_(i,t,j,k)); that is, the person is estimated to be the holder of the ID with the highest similarity, and a certainty value Pr_(t,j,k) is calculated for this estimate (step S14).
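- A minimal sketch of steps S13 and S14, assuming feature vectors are plain Python lists and, as one plausible reading, that the certainty value Pr_(t,j,k) is the maximum similarity itself (the specification only says that it indicates likelihood):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two non-zero feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def estimate_id(feature: list[float], id_table: dict[str, list[float]]):
    """Return (estimated ID, certainty value) for one detected person.

    Implements ID_(t,j,k) = argmax_i(D_(i,t,j,k)); the certainty value is
    assumed here to be the best similarity score.
    """
    best_id, best_sim = None, -1.0
    for person_id, registered in id_table.items():
        sim = cosine_similarity(feature, registered)
        if sim > best_sim:
            best_id, best_sim = person_id, sim
    return best_id, best_sim
```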
- Next, the position detection unit 50 refers to the location information table in the storage unit 30 and detects the service providing terminal 130 linked to the tracking camera 120 that captured the image showing the kth person (step S15).
- Next, the determination unit 60 determines whether exactly one service providing terminal 130 was detected in step S15 (step S16). Step S16 establishes whether there is one service providing terminal 130, or several, from which the kth person can receive a service.
- If step S16 is judged as "Yes", the determination unit 60 determines whether the certainty value Pr_(t,j,k) of the kth person is equal to or greater than the certainty threshold set for the service providing terminal 130 detected in step S15 (step S17).
- If step S17 is judged as "Yes", the determination unit 60 grants the kth person authorization for the service providing terminal 130 detected in step S15 (step S18). The kth person can then receive the authorized service at the corresponding service providing terminal 130. Thereafter, execution of the flowchart ends.
- step S17 returns "No"
- the output unit 70 causes the service providing terminal 130 detected in step S15 to display a message to improve the accuracy value (step S19). For example, a message to pass through less crowded areas or to move to an area with a good image may be displayed. Alternatively, a message encouraging a change in posture to adopt a more accurate posture may be displayed. Alternatively, a change in posture may be encouraged by varying the size of the displayed characters to guide the person toward or away from the display screen. Thereafter, execution of the flowchart ends.
- If step S16 is judged as "No", the determination unit 60 determines whether the certainty value Pr_(t,j,k) of the kth person is equal to or greater than the highest certainty threshold among those of the multiple candidate service providing terminals 130 detected in step S15 (step S20).
- If step S20 is judged as "Yes", step S18 is executed, and the kth person can receive the service authorized by the determination unit 60 at the corresponding service providing terminal 130. If step S20 is judged as "No", step S19 is executed.
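- Steps S16 to S20 could look like the following sketch; the per-terminal thresholds and the message text are assumptions made for illustration.

```python
def authorize(certainty: float, candidate_terminals: list[str],
              thresholds: dict[str, float]) -> tuple[bool, str]:
    """Compare the certainty value against the required threshold.

    With one candidate terminal, that terminal's own threshold applies
    (step S17); with several, the highest threshold applies (step S20).
    """
    if len(candidate_terminals) == 1:
        required = thresholds[candidate_terminals[0]]               # step S17
    else:
        required = max(thresholds[t] for t in candidate_terminals)  # step S20
    if certainty >= required:
        return True, "authorized"                                   # step S18
    # Step S19: prompt the person to improve the certainty value.
    return False, "please move to a less crowded area and face the camera"
```

- For example, `authorize(0.96, ["terminal_130b", "terminal_130c"], {"terminal_130b": 0.95, "terminal_130c": 0.98})` is refused, because the highest candidate threshold, 0.98, governs.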
- Note that feature information of a specific person may be extracted from multiple images acquired by multiple tracking cameras 120.
- FIG. 7 is a diagram illustrating an example of a case where feature information of a specific person is extracted from multiple images. As illustrated in FIG. 7, when an ID and feature information are linked and stored in the storage unit 30, the feature information obtained from each camera (Cam1, Cam2, Cam3) may be stored individually.
- In this case, a detected person may be compared against the feature information stored per camera for each ID. For example, as shown in the lower part of FIG. 7, for an ID that has no feature information for Cam2, the comparison may use the average of the feature information from the other cameras (the feature information for Cam1 and Cam3).
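- A sketch of this per-camera fallback, assuming element-wise averaging of the stored vectors; the vectors below are illustrative.

```python
def reference_feature(per_camera: dict[str, list[float]], camera: str) -> list[float]:
    """Feature vector to compare against for a query from `camera`.

    If the ID has no stored feature for that camera (e.g. no Cam2 entry),
    fall back to the element-wise average of the other cameras' features.
    """
    if camera in per_camera:
        return per_camera[camera]
    others = list(per_camera.values())
    return [sum(vals) / len(vals) for vals in zip(*others)]

# An ID registered from Cam1 and Cam3 only: a query from Cam2 is matched
# against the average of the Cam1 and Cam3 features.
stored = {"Cam1": [0.2, 0.8], "Cam3": [0.4, 0.6]}
print(reference_feature(stored, "Cam2"))  # approx. [0.3, 0.7]
```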
- In the above example, the camera ID of each tracking camera 120 is linked to the terminal ID of each service providing terminal 130 in order to identify the service providing terminal 130 from which each person can receive a service, but the embodiment is not limited to this.
- For example, the relationship between each position in the authentication space and the service providing terminals 130 within a specified range of that position (e.g., a radius of 10 m) may be stored, and the service providing terminal 130 corresponding to the position of each person detected from the images acquired by the tracking cameras 120 may be identified from that relationship.
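- A sketch of this positional variant, assuming 2-D coordinates in metres and a Euclidean distance test; the terminal coordinates are hypothetical, and the 10 m radius comes from the example above.

```python
import math

TERMINAL_POSITIONS = {
    "terminal_130a": (5.0, 3.0),    # hypothetical (x, y) positions in metres
    "terminal_130b": (22.0, 14.0),
}

def terminals_within(person_xy: tuple[float, float],
                     radius_m: float = 10.0) -> list[str]:
    """Terminals within `radius_m` of a person's detected position."""
    px, py = person_xy
    return [tid for tid, (tx, ty) in TERMINAL_POSITIONS.items()
            if math.hypot(px - tx, py - ty) <= radius_m]
```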
- FIG. 8 is a block diagram illustrating an example of the hardware configuration of the information processing device 100.
- As illustrated in FIG. 8, the information processing device 100 includes a CPU 101, a RAM 102, a storage device 103, and the like.
- The CPU (Central Processing Unit) 101 is a central processing unit.
- The RAM (Random Access Memory) 102 is a volatile memory that temporarily stores programs executed by the CPU 101 and data processed by the CPU 101.
- The storage device 103 is a non-volatile storage device. For example, a ROM (Read Only Memory), a solid state drive (SSD) such as flash memory, or a hard disk drive can be used as the storage device 103.
- The functions of the units of the information processing device 100 are realized by the CPU 101 executing the determination program stored in the storage device 103. Note that each unit of the information processing device 100 may instead be implemented by a dedicated circuit or the like.
- In each of the above examples, the certainty calculation unit 40 is an example of a calculation unit that calculates a certainty value indicating the likelihood that identification information identifying a person detected from an image corresponds to that person.
- The determination unit 60 is an example of a determination unit that, when location information of the person is acquired, refers to a storage unit that stores a certainty threshold in association with each of a plurality of services and determines whether the calculated certainty value is equal to or greater than the certainty threshold of the service, among the plurality of services, that corresponds to the location information of the person.
- The output unit 70 is an example of an output unit that, when the certainty value is not determined to be equal to or greater than the certainty threshold, outputs a message to the person for improving the certainty value.
- REFERENCE SIGNS LIST 10 Acquisition unit 20 Feature extraction unit 30 Storage unit 40 Certainty calculation unit 50 Position detection unit 60 Determination unit 70 Output unit 100 Information processing device 110 Linking camera 120 Tracking camera 130 Service providing terminal 200 Biometric authentication system
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Analysis (AREA)
- Collating Specific Patterns (AREA)
Claims (15)
1. A determination method characterized in that a computer executes a process of: calculating, when identification information identifying a person detected from an image is acquired, a certainty value indicating the likelihood that the identification information corresponds to the person; and, when location information of the person is acquired, referring to a storage unit that stores a certainty threshold and location information in association with each of a plurality of services and determining whether the calculated certainty value is equal to or greater than the certainty threshold of the service, among the plurality of services, that corresponds to the location information of the person.
2. The determination method according to claim 1, characterized in that the computer executes a process of outputting a message for improving the certainty value to the person when the certainty value is not determined to be equal to or greater than the certainty threshold.
3. The determination method according to claim 1, characterized in that the computer executes a process of determining, when a plurality of services among the plurality of services correspond to the location information of the person, whether the certainty value is equal to or greater than the highest certainty threshold among those of the corresponding services.
4. The determination method according to claim 3, characterized in that the computer executes a process of outputting a message for improving the certainty value to the person when the certainty value is not determined to be equal to or greater than the highest certainty threshold.
5. The determination method according to claim 1, characterized in that the computer executes a process of calculating the certainty value indicating the likelihood that the identification information corresponds to the person using a plurality of different images in which the person is detected.
6. A determination program characterized by causing a computer to execute a process of: calculating, when identification information identifying a person detected from an image is acquired, a certainty value indicating the likelihood that the identification information corresponds to the person; and, when location information of the person is acquired, referring to a storage unit that stores a certainty threshold and location information in association with each of a plurality of services and determining whether the calculated certainty value is equal to or greater than the certainty threshold of the service, among the plurality of services, that corresponds to the location information of the person.
7. The determination program according to claim 6, characterized by causing the computer to execute a process of outputting a message for improving the certainty value to the person when the certainty value is not determined to be equal to or greater than the certainty threshold.
8. The determination program according to claim 6, characterized by causing the computer to execute a process of determining, when a plurality of services among the plurality of services correspond to the location information of the person, whether the certainty value is equal to or greater than the highest certainty threshold among those of the corresponding services.
9. The determination program according to claim 8, characterized by causing the computer to execute a process of outputting a message for improving the certainty value to the person when the certainty value is not determined to be equal to or greater than the highest certainty threshold.
10. The determination program according to claim 6, characterized by causing the computer to execute a process of calculating the certainty value indicating the likelihood that the identification information corresponds to the person using a plurality of different images in which the person is detected.
11. An information processing device characterized by comprising: a certainty calculation unit that calculates a certainty value indicating the likelihood that identification information identifying a person detected from an image corresponds to the person; and a determination unit that, when location information of the person is acquired, refers to a storage unit that stores a certainty threshold and location information in association with each of a plurality of services and determines whether the calculated certainty value is equal to or greater than the certainty threshold of the service, among the plurality of services, that corresponds to the location information of the person.
12. The information processing device according to claim 11, characterized by further comprising an output unit that outputs a message for improving the certainty value to the person when the certainty value is not determined to be equal to or greater than the certainty threshold.
13. The information processing device according to claim 11, characterized in that, when a plurality of services among the plurality of services correspond to the location information of the person, the determination unit determines whether the certainty value is equal to or greater than the highest certainty threshold among those of the corresponding services.
14. The information processing device according to claim 13, characterized by further comprising an output unit that outputs a message for improving the certainty value to the person when the certainty value is not determined to be equal to or greater than the highest certainty threshold.
15. The information processing device according to claim 11, characterized in that the certainty calculation unit calculates the certainty value indicating the likelihood that the identification information corresponds to the person using a plurality of different images in which the person is detected.
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202380096650.8A CN120981832A (zh) | 2023-03-31 | 2023-03-31 | Determination method, determination program, and information processing device |
| JP2025509602A JPWO2024202017A1 (ja) | 2023-03-31 | 2023-03-31 | |
| PCT/JP2023/013594 WO2024202017A1 (ja) | 2023-03-31 | 2023-03-31 | Determination method, determination program, and information processing device |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2023/013594 WO2024202017A1 (ja) | 2023-03-31 | 2023-03-31 | Determination method, determination program, and information processing device |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024202017A1 (ja) | 2024-10-03 |
Family
ID=92904605
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2023/013594 Pending WO2024202017A1 (ja) | Determination method, determination program, and information processing device |
Country Status (3)
| Country | Link |
|---|---|
| JP (1) | JPWO2024202017A1 (ja) |
| CN (1) | CN120981832A (ja) |
| WO (1) | WO2024202017A1 (ja) |
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2014067315A * | 2012-09-27 | 2014-04-17 | Canon Inc | Authentication device, authentication method, and program therefor |
| JP2020135369A * | 2019-02-19 | 2020-08-31 | Canon Inc | Information processing device, system, information processing method, and program |
| JP2020173775A * | 2019-04-10 | 2020-10-22 | Vacan, Inc. | Object detection device and congestion status management device |
Also Published As
| Publication number | Publication date |
|---|---|
| JPWO2024202017A1 (ja) | 2024-10-03 |
| CN120981832A (zh) | 2025-11-18 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23930652 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2025509602 Country of ref document: JP Kind code of ref document: A |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2025509602 Country of ref document: JP |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2023930652 Country of ref document: EP |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| ENP | Entry into the national phase |
Ref document number: 2023930652 Country of ref document: EP Effective date: 20251031 |
|