
WO2021171395A1 - Pupil position detection device, line-of-sight detection device, and pupil position detection method - Google Patents

Pupil position detection device, line-of-sight detection device, and pupil position detection method

Info

Publication number
WO2021171395A1
Authority
WO
WIPO (PCT)
Prior art keywords
pupil position
pupil
reliability
unit
calculation unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2020/007601
Other languages
English (en)
Japanese (ja)
Inventor
遼平 村地
亮介 虎間
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp filed Critical Mitsubishi Electric Corp
Priority to PCT/JP2020/007601 priority Critical patent/WO2021171395A1/fr
Publication of WO2021171395A1 publication Critical patent/WO2021171395A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for determining or recording eye movement

Definitions

  • the present disclosure relates to a pupil position detection device, a line-of-sight detection device, and a pupil position detection method.
  • A conventional pupil position detection device executes template matching to detect pupil position candidates from the image of the eye region, calculates the likelihood of each detected pupil position candidate, and determines the pupil position candidate having the maximum likelihood to be the pupil position (see, for example, Patent Document 1).
  • However, the calculated likelihood may not be an appropriate value depending on the imaging environment, so there is a problem that the reliability of the pupil position cannot always be evaluated correctly.
  • the present disclosure has been made to solve the above problems, and an object of the present disclosure is to evaluate the reliability of the pupil position without being affected by the imaging environment.
  • The pupil position detection device includes an image acquisition unit that acquires an image of a subject, an eye region extraction unit that extracts an eye region from the image acquired by the image acquisition unit, a pupil position detection unit that detects one or more pupil positions from the eye region extracted by the eye region extraction unit, and a first reliability calculation unit that calculates, for each detected pupil position, the luminance difference between the pixel at the pupil position and the pixels around the pupil position and calculates, based on the luminance difference, a first reliability that is an index of the certainty of the pupil position.
  • the reliability of the pupil position can be evaluated without being affected by the imaging environment.
  • FIGS. 2A and 2B are diagrams showing a hardware configuration example of the line-of-sight detection device according to the first embodiment. FIG. 3 is a diagram showing an example of the eye region image extracted by the eye region extraction unit. FIG. 4 is an explanatory diagram showing an example of the filter. FIG. 5 is an explanatory diagram showing an example of the sweep by the filter. FIG. 6 is an explanatory diagram showing an example of the vector corresponding to the first luminance gradient value, an example of the vector corresponding to the second luminance gradient value, and an example of the luminance gradient vector.
  • FIG. 17A is a diagram showing an example of three pupil positions having a small degree of spread
  • FIG. 17B is an explanatory diagram showing an example of a method of calculating the second reliability in the example of FIG. 17A.
  • FIG. 18A is a diagram showing an example of three pupil positions having a large degree of spread
  • FIG. 18B is an explanatory diagram showing an example of a method of calculating the second reliability in the example of FIG. 18A. FIG. 19 is an explanatory diagram showing an example of the method of calculating the third reliability. FIG. 20 is a flowchart showing an operation example of the line-of-sight detection device according to the second embodiment.
  • FIG. 1 is a block diagram showing a configuration example of the line-of-sight detection device 10 according to the first embodiment.
  • the line-of-sight detection device 10 includes a pupil position detection device 1 and a line-of-sight angle calculation unit 11.
  • the pupil position detection device 1 includes an image acquisition unit 2, an eye region extraction unit 3, a pupil position detection unit 4, and a first reliability calculation unit 5. Further, the image pickup device 12 is connected to the pupil position detection device 1.
  • FIGS. 2A and 2B are diagrams showing a hardware configuration example of the line-of-sight detection device 10 according to the first embodiment.
  • the functions of the image acquisition unit 2, the eye area extraction unit 3, the pupil position detection unit 4, the first reliability calculation unit 5, and the line-of-sight angle calculation unit 11 in the line-of-sight detection device 10 are realized by a processing circuit. That is, the line-of-sight detection device 10 includes a processing circuit for realizing the above functions.
  • the processing circuit may be a processing circuit 100 as dedicated hardware, or may be a processor 101 that executes a program stored in the memory 102.
  • The processing circuit 100 as dedicated hardware may be, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), or a combination thereof.
  • The functions of the image acquisition unit 2, the eye area extraction unit 3, the pupil position detection unit 4, the first reliability calculation unit 5, and the line-of-sight angle calculation unit 11 may be realized by a plurality of processing circuits 100, or the functions of the units may be collectively realized by one processing circuit 100.
  • When the processing circuit is the processor 101, the functions of the image acquisition unit 2, the eye area extraction unit 3, the pupil position detection unit 4, the first reliability calculation unit 5, and the line-of-sight angle calculation unit 11 are realized by software, firmware, or a combination of software and firmware.
  • the software or firmware is described as a program and stored in the memory 102.
  • The processor 101 realizes the functions of the units by reading and executing the program stored in the memory 102. That is, the line-of-sight detection device 10 includes the memory 102 for storing a program that, when executed by the processor 101, results in the execution of the steps shown in the flowchart of FIG. 15 described later.
  • It can also be said that this program causes a computer to execute the procedures or methods of the image acquisition unit 2, the eye area extraction unit 3, the pupil position detection unit 4, the first reliability calculation unit 5, and the line-of-sight angle calculation unit 11.
  • the processor 101 is a CPU (Central Processing Unit), a processing device, an arithmetic unit, a microprocessor, or the like.
  • The memory 102 may be a non-volatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), an EPROM (Erasable Programmable ROM), or a flash memory, a magnetic disk such as a hard disk or a flexible disk, or an optical disc such as a CD (Compact Disc) or a DVD (Digital Versatile Disc).
  • the processing circuit in the line-of-sight detection device 10 can realize the above-mentioned functions by hardware, software, firmware, or a combination thereof.
  • The imaging device 12 captures, in chronological order, images of the area around the face of the subject whose pupil position and line of sight are to be detected.
  • the image pickup device 12 outputs the captured image I1 to the image acquisition unit 2.
  • the image acquisition unit 2 acquires the image I1 of the subject imaged by the image pickup device 12 from the image pickup device 12 and outputs it to the eye region extraction unit 3.
  • The eye region extraction unit 3 extracts the target person's eye region from the image I1 acquired by the image acquisition unit 2, and outputs the extracted image of the eye region (hereinafter referred to as the "eye region image I2") to the pupil position detection unit 4.
  • FIG. 3 is a diagram showing an example of the eye region image I2 extracted by the eye region extraction unit 3.
  • The eye area extraction unit 3 detects the face area of the subject from the image I1 acquired by the image acquisition unit 2, detects from the detected face area the outer corner of the eye FP1, the inner corner of the eye FP2, the upper eyelid FP3, and the lower eyelid FP4, which are the feature points of the right eye, and extracts the right-eye region image I2 from the image I1 based on the positions of the detected eye feature points.
  • Similarly, the eye region extraction unit 3 detects the outer corner of the eye, the inner corner of the eye, the upper eyelid, and the lower eyelid, which are the feature points of the left eye, from the face region, and extracts the left-eye region image from the image I1 based on the positions of the detected eye feature points.
  • the eye region image I2 is composed of a plurality of units (hereinafter referred to as "image units") U arranged in two directions (that is, the X direction and the Y direction) orthogonal to each other.
  • Each image unit U may be composed of one pixel, or may be composed of a plurality of pixels adjacent to each other.
  • The eye region extraction unit 3 may extract either the right eye region or the left eye region from the image I1.
  • In that case, the pupil position detection unit 4 detects the pupil position of that one eye, and the first reliability calculation unit 5 calculates the first reliability of the pupil position of that one eye detected by the pupil position detection unit 4.
  • the eye region extraction unit 3 may extract the right eye region and the left eye region.
  • In that case, the pupil position detection unit 4 detects the pupil position of the right eye and the pupil position of the left eye, and the first reliability calculation unit 5 calculates the first reliability of the pupil position of the right eye and the first reliability of the pupil position of the left eye.
  • The pupil position detection unit 4 detects, from the eye region image I2 extracted by the eye region extraction unit 3, one or more positions estimated to lie within the pupil region (hereinafter referred to as "pupil positions").
  • the pupil position detection unit 4 outputs information including the position of the eye feature point and one or more pupil positions to the first reliability calculation unit 5.
  • the pupil position detection unit 4 may detect the pupil position from the eye region image I2 by a well-known method, and the method is not limited.
  • the template matching as described above may be used, or the method based on the luminance gradient vector as described below may be used.
  • the likelihood in template matching corresponds to the evaluation value E in the method based on the luminance gradient vector described later.
  • FIG. 4 is an explanatory diagram showing an example of the filter F.
  • FIG. 5 is an explanatory diagram showing an example of sweeping by the filter F.
  • the filter F shown in FIG. 4 is used for calculating the luminance gradient vector V2.
  • the filter F is applied so as to sweep the eye region image I2 as shown in FIG.
  • For each image unit of interest (hereinafter sometimes referred to as the "attention image unit") U_I in the eye area image I2, the filter F calculates the difference value (hereinafter referred to as the "first luminance gradient value") ΔB_X between the luminance value B_L in the image unit U_L arranged to the left of the attention image unit U_I and the luminance value B_R in the image unit U_R arranged to the right of the attention image unit U_I.
  • Similarly, the filter F calculates the difference value (hereinafter referred to as the "second luminance gradient value") ΔB_Y between the luminance value B_U in the image unit U_U arranged above the attention image unit U_I and the luminance value B_D in the image unit U_D arranged below the attention image unit U_I.
  • FIG. 6 is an explanatory diagram showing an example of the vector V2_X corresponding to the first luminance gradient value ΔB_X, an example of the vector V2_Y corresponding to the second luminance gradient value ΔB_Y, and an example of the luminance gradient vector V2.
  • The luminance gradient vector V2 is represented by the sum of the vector V2_X corresponding to the first luminance gradient value ΔB_X and the vector V2_Y corresponding to the second luminance gradient value ΔB_Y.
  • The pupil position detection unit 4 calculates the first luminance gradient value ΔB_X and the second luminance gradient value ΔB_Y, and uses the calculated first luminance gradient value ΔB_X and second luminance gradient value ΔB_Y to calculate the angle corresponding to the direction of the luminance gradient vector V2 (hereinafter referred to as the "luminance gradient angle") θ.
  • FIG. 7 shows an example of the luminance value B_I in the attention image unit U_I and an example of the luminance values B_D, B_L, B_R, B_U in the individual image units U_D, U_L, U_R, U_U arranged around the attention image unit U_I.
  • the luminance value B_L is 44
  • the luminance value B_R is 16
  • the luminance value B_U is 47
  • the luminance value B_D is 18.
  • When the filter F is applied to the attention image unit U_I in this example, the first luminance gradient value ΔB_X is calculated as -28 and the second luminance gradient value ΔB_Y is calculated as -29. The luminance gradient angle θ is calculated to be 46°.
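  • The following Python sketch reproduces the worked example above. The direction of subtraction (right minus left, down minus up) and the atan2-based angle convention reduced modulo 180° are assumptions chosen so that the stated values ΔB_X = -28, ΔB_Y = -29, and θ of about 46° come out; the text itself does not fix these conventions.

```python
import math

def luminance_gradient(b_left, b_right, b_up, b_down):
    """Return (dBx, dBy, theta) for one attention image unit U_I."""
    d_bx = b_right - b_left                   # first luminance gradient value (assumed sign)
    d_by = b_down - b_up                      # second luminance gradient value (assumed sign)
    theta = math.degrees(math.atan2(d_by, d_bx)) % 180.0   # luminance gradient angle (assumed convention)
    return d_bx, d_by, theta

# Worked example from FIG. 7: B_L = 44, B_R = 16, B_U = 47, B_D = 18
print(luminance_gradient(44, 16, 47, 18))     # -> (-28, -29, about 46.0)
```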
  • FIG. 8 is an explanatory diagram showing an example of the luminance gradient vector V2 corresponding to the attention image unit U_I.
  • the pupil position detection unit 4 uses the luminance gradient vector V2 to calculate the evaluation value E corresponding to each image unit U in the eye region image I2.
  • the evaluation value E is based on the number n of the luminance gradient vectors V2 toward the individual image unit U (that is, the attention image unit U_I). A method of calculating the evaluation value E will be described with reference to FIGS. 9 and 10.
  • FIG. 9 is an explanatory diagram showing an example of the evaluation area EA.
  • The pupil position detection unit 4 sets a region for evaluation (hereinafter referred to as the "evaluation area") EA including the attention image unit U_I.
  • the evaluation area EA includes the attention image unit U_I and includes N image units (hereinafter, may be referred to as “evaluation image unit”) U_E different from the attention image unit U_I.
  • N is any integer greater than or equal to 2.
  • FIG. 9 shows an example of the evaluation area EA.
  • the shape of the evaluation region EA is square, and the image unit U_I of interest is arranged at the center of the evaluation region EA.
  • the evaluation area EA has a size smaller than a predetermined size (hereinafter referred to as "reference iris size").
  • the reference iris size corresponds to the size of a standard human iris.
  • The pupil position detection unit 4 calculates, for each evaluation image unit U_E, the angle θ' corresponding to the inclination of the straight line connecting the attention image unit U_I and that evaluation image unit U_E.
  • The pupil position detection unit 4 then calculates, for each evaluation image unit U_E, the difference value Δθ between the corresponding luminance gradient angle θ and the corresponding angle θ'.
  • When the calculated difference value Δθ is less than a threshold value Δθth, the pupil position detection unit 4 determines that the corresponding luminance gradient vector V2 has a direction toward the attention image unit U_I.
  • When the calculated difference value Δθ is equal to or greater than the threshold value Δθth, the pupil position detection unit 4 determines that the corresponding luminance gradient vector V2 does not have a direction toward the attention image unit U_I.
  • the pupil position detection unit 4 calculates the number n of the luminance gradient vectors V2 toward the attention image unit U_I based on the result of such determination.
  • the number n is calculated as a value of 0 or more and N or less.
  • The pupil position detection unit 4 calculates the evaluation value E according to the calculated number n. That is, the larger the number n, the higher the evaluation value E; the smaller the number n, the lower the evaluation value E.
  • FIG. 10 is an explanatory diagram showing an example of the luminance gradient vector V2 corresponding to each evaluation image unit U_E.
  • the solid line arrow indicates the luminance gradient vector V2 having a direction toward the attention image unit U_I.
  • the arrow by the broken line indicates the luminance gradient vector V2 having no direction toward the attention image unit U_I.
  • In the example of FIG. 10, the pupil position detection unit 4 calculates the number n as 19. Further, the pupil position detection unit 4 calculates the evaluation value E according to the calculated number n; for example, an evaluation value E based on n / N is calculated.
  • the pupil position detection unit 4 detects the pupil position PP in the eye region image I2 based on the calculated evaluation value E. More specifically, the pupil position detection unit 4 extracts an evaluation value E that is equal to or higher than a predetermined threshold value from a plurality of evaluation values E corresponding to the plurality of image units U in the eye region image I2. , The position corresponding to the extracted evaluation value E is detected as the pupil position PP.
  • the pupil position detection unit 4 may detect a plurality of pupil position PPs when there are a plurality of evaluation values E that are equal to or higher than the threshold value. Further, the pupil position detection unit 4 detects coordinate values C_X and C_Y indicating the position of the image unit U corresponding to the pupil position PP in the eye region image I2 for each of the detected one or more pupil position PPs.
  • Alternatively, the pupil position detection unit 4 may detect, as the pupil position PP, the position corresponding to the largest (that is, the maximum) of the plurality of evaluation values E corresponding to the plurality of image units U in the eye region image I2.
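  • A minimal sketch of the evaluation-value computation described above follows. The square evaluation window, the concrete angle threshold, and the 180°-periodic angle comparison are assumptions; the text only requires that the evaluation area fit within the reference iris size and that E increase with the count n (for example E = n / N).

```python
import math
import numpy as np

def evaluation_map(grad_angle_deg, half_size=3, angle_threshold_deg=15.0):
    """For every attention unit U_I, compute E = n / N, where n counts the evaluation
    units U_E in a square window whose luminance gradient angle roughly matches the
    angle of the line connecting U_E and U_I (i.e. the gradient points at U_I)."""
    h, w = grad_angle_deg.shape
    E = np.zeros((h, w))
    for iy in range(h):
        for ix in range(w):
            n = N = 0
            for ey in range(max(0, iy - half_size), min(h, iy + half_size + 1)):
                for ex in range(max(0, ix - half_size), min(w, ix + half_size + 1)):
                    if (ey, ex) == (iy, ix):
                        continue
                    N += 1
                    # angle of the straight line connecting U_E and U_I (theta')
                    theta_line = math.degrees(math.atan2(iy - ey, ix - ex)) % 180.0
                    d_theta = abs(grad_angle_deg[ey, ex] - theta_line)
                    d_theta = min(d_theta, 180.0 - d_theta)     # difference value (assumed periodic)
                    if d_theta < angle_threshold_deg:           # gradient points toward U_I
                        n += 1
            E[iy, ix] = n / N if N else 0.0
    return E

# Candidate pupil positions: image units whose evaluation value E is at or above a
# threshold, e.g. np.argwhere(evaluation_map(angles) >= 0.8)
```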
  • The eye region image I2 is usually composed of a region corresponding to the inside of the ocular fissure (hereinafter referred to as the "intraocular fissure region") and a region corresponding to the outside of the ocular fissure (hereinafter referred to as the "extraocular fissure region").
  • the extraocular region is located around the intraocular region.
  • The intraocular fissure region includes a region corresponding to the pupil (hereinafter referred to as the "pupil region"), a region corresponding to the iris (hereinafter referred to as the "iris region"), and a region corresponding to the sclera (hereinafter referred to as the "sclera region").
  • the scleral region is located around the iris region.
  • the iris region is located around the pupil region.
  • the shape of the iris region is circular.
  • the shape of the pupil region is circular.
  • the brightness in the pupil region is usually lower than the brightness in the iris region. Therefore, in the intraocular fissure region, an edge based on the brightness discontinuity is generated at the boundary between the pupil region and the iris region. Moreover, the brightness in the iris region is lower than the brightness in the sclera region. Therefore, in the intraocular fissure region, an edge based on the brightness discontinuity is generated at the boundary between the iris region and the scleral region.
  • the number n corresponding to the image unit U in the pupil region is larger than the number n corresponding to the image unit U outside the pupil region. Therefore, the evaluation value E corresponding to the image unit U in the pupil region is likely to be higher than the evaluation value E corresponding to the image unit U outside the pupil region. Therefore, the pupil position PP can be detected by detecting the evaluation value E which is equal to or higher than the threshold value among the evaluation values E.
  • The first reliability calculation unit 5 calculates, for each of the one or more pupil positions detected by the pupil position detection unit 4, the brightness difference between the pixel at the pupil position and the pixels around the pupil position.
  • the first reliability calculation unit 5 calculates the first reliability, which is an index of the certainty of the pupil position, based on the calculated luminance difference.
  • the first reliability calculation unit 5 outputs information including the position of the eye feature point, one or more pupil positions, and the first reliability of each pupil position to the line-of-sight angle calculation unit 11.
  • the pupil position evaluation method using the first reliability is an evaluation method based on the idea that the pupil is darker than the periphery (iris, etc.). An example of the first reliability calculation method will be described with reference to FIGS. 11 and 12.
  • FIG. 11 is an explanatory diagram showing an example of the pupil position image unit U_11 in the first reliability calculation region U_10 and the brightness value distribution around it.
  • the pupil position image unit U_11 is an image unit U corresponding to the pupil position PP detected by the pupil position detection unit 4. The larger the value, the brighter the brightness value in the image unit U, and the smaller the value, the darker the brightness value.
  • The first reliability calculation unit 5 sets, in the eye region image I2, a region of predetermined size centered on the pupil position image unit U_11 (hereinafter referred to as the "first reliability calculation area U_10").
  • In this example, the first reliability calculation area U_10 is a circular area having a radius of 4 image units. Since the pupil is circular and expands or contracts depending on the external light environment, it is desirable that the first reliability calculation area U_10 is also circular and that its size is at least equal to or larger than the size of a standard human pupil.
  • FIG. 12 is an explanatory diagram showing an example of concentric image unit groups U_12, U_13, U_14, and U_15 set in the first reliability calculation area U_10.
  • the first reliability calculation unit 5 sets m concentric image unit groups centered on the pupil position image unit U_11 with respect to the first reliability calculation region U_10. m is any integer greater than or equal to 1.
  • In this example, m = 4, and the first reliability calculation unit 5 sets four concentric image unit groups U_12, U_13, U_14, and U_15 centered on the pupil position image unit U_11. If one image unit U is one pixel, the m image unit groups become m pixel groups.
  • the first reliability calculation unit 5 selects the darkest image unit U_12a from the concentric image unit group U_12.
  • the brightness value of the image unit U_12a is 21.
  • the first reliability calculation unit 5 selects the darkest image unit U_13a from the concentric image unit group U_13.
  • the brightness value of the image unit U_13a is 24.
  • the first reliability calculation unit 5 selects the darkest image unit U_14a from the concentric image unit group U_14.
  • the brightness value of the image unit U_14a is 26.
  • the first reliability calculation unit 5 selects the darkest image unit U_15a from the concentric image unit group U_15.
  • the brightness value of the image unit U_15a is 51. In this way, the first reliability calculation unit 5 selects m image units from the m image unit group.
  • The first reliability calculation unit 5 may use the value of the luminance difference as it is as the first reliability, or may calculate the first reliability according to the value of the luminance difference. When calculating the first reliability according to the luminance difference value, the larger the luminance difference value, the larger the calculated first reliability, and the smaller the luminance difference value, the smaller the calculated first reliability.
  • the first reliability calculation unit 5 calculates the first reliability for each of the plurality of pupil positions.
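  • A hedged sketch of the first reliability follows. The selection of the darkest image unit on each concentric ring follows FIG. 12; how the selected values are combined into a single luminance difference is not spelled out in the excerpt above, so taking the difference between the darkest value of the outermost ring and the brightness at the pupil position is an assumption.

```python
import numpy as np

def first_reliability(eye_img, pupil_x, pupil_y, radii=(1, 2, 3, 4)):
    """Select the darkest image unit on each of m concentric rings around the detected
    pupil position and score the pupil position by how much darker it is than the
    outermost ring (assumed formulation)."""
    h, w = eye_img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.hypot(xs - pupil_x, ys - pupil_y)
    darkest = []
    for r in radii:                                  # concentric image unit groups
        ring = (dist >= r - 0.5) & (dist < r + 0.5)
        if ring.any():
            darkest.append(eye_img[ring].min())      # darkest unit of this ring
    # Assumed luminance difference: darkest value of the outermost ring minus the
    # brightness at the pupil position itself.
    luminance_diff = float(darkest[-1]) - float(eye_img[pupil_y, pupil_x])
    return max(luminance_diff, 0.0)                  # larger difference -> higher reliability
```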
  • the line-of-sight angle calculation unit 11 acquires information output by the pupil position detection device 1 including the position of the eye feature point, one or more pupil positions, and the first reliability of each of the pupil positions.
  • The line-of-sight angle calculation unit 11 calculates the line-of-sight angle θ of the subject using the acquired information. For example, when the pupil position detection device 1 outputs the above information regarding only the right eye, the line-of-sight angle calculation unit 11 selects the pupil position having the highest first reliability from the one or more pupil positions included in this information and, if the first reliability of the selected pupil position is equal to or higher than a predetermined threshold value, calculates the line-of-sight angle θ based on the selected pupil position.
  • Likewise, when the information includes two or more pupil positions, the line-of-sight angle calculation unit 11 selects the pupil position having the highest first reliability from among them and, if the first reliability of the selected pupil position is equal to or higher than the above threshold value, calculates the line-of-sight angle θ based on the selected pupil position. The line-of-sight angle calculation unit 11 does not calculate the line-of-sight angle θ when there is no pupil position whose first reliability is equal to or higher than the above threshold value.
  • The line-of-sight angle θ in this example includes the line-of-sight angle θ_X with respect to the yaw direction and the line-of-sight angle θ_Y with respect to the pitch direction.
  • A method of calculating the line-of-sight angle θ_X will be described with reference to FIG. 13.
  • FIG. 13 is an explanatory diagram showing an example of a method of calculating the line-of-sight angle θ_X with respect to the yaw direction. Further, a method of calculating the line-of-sight angle θ_Y will be described with reference to FIG. 14.
  • FIG. 14 is an explanatory diagram showing an example of a method of calculating the line-of-sight angle θ_Y with respect to the pitch direction.
  • the line-of-sight angle calculation unit 11 calculates the positions P_FP1_X and P_FP2_X in the X direction using the information of the eye feature points.
  • the position P_FP1_X corresponds to the outer corner of the eye FP1.
  • the position P_FP2_X corresponds to the inner corner FP2.
  • the line-of-sight angle calculation unit 11 calculates the position (hereinafter referred to as "first reference position") P_C_X in the X direction based on the calculated positions P_FP1_X and P_FP2_X.
  • the first reference position P_C_X corresponds to an intermediate position between the outer corner FP1 and the inner corner FP2. That is, the first reference position P_C_X corresponds to the central portion of the intraocular fissure region with respect to the X direction.
  • the line-of-sight angle calculation unit 11 calculates the position P_PP_X with respect to the X direction using the information on the pupil position.
  • the position P_PP_X corresponds to the pupil position PP.
  • the line-of-sight angle calculation unit 11 calculates the interval L_X_1 with respect to the first reference position P_C_X and the interval L_X_2 with respect to the position P_PP_X for the position P_FP1_X or the position P_FP2_X.
  • FIG. 13 shows an example when the position P_FP2_X is used as a reference for the intervals L_X_1 and L_X_2.
  • A value indicating the maximum value θmax_X of the line-of-sight angle θ_X is preset in the line-of-sight angle calculation unit 11.
  • The line-of-sight angle calculation unit 11 calculates the line-of-sight angle θ_X by the following equation (1) using the preset maximum value θmax_X and the calculated intervals L_X_1 and L_X_2.
  • θ_X = θmax_X × (L_X_1 - L_X_2) / L_X_1 ... (1)
  • The maximum value θmax_X is based on, for example, the following model M_X. That is, in the model M_X, if the position P_PP_X is the same as the first reference position P_C_X, the line-of-sight angle θ_X is 0°. Further, in the model M_X, if the position P_PP_X is the same as the position P_FP1_X, the line-of-sight angle θ_X becomes a value (for example, -60°) corresponding to the maximum value θmax_X.
  • Similarly, if the position P_PP_X is the same as the position P_FP2_X, the line-of-sight angle θ_X becomes a value (for example, +60°) corresponding to the maximum value θmax_X.
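  • A small sketch of equation (1) follows, using the inner corner FP2 as the reference point for the intervals L_X_1 and L_X_2 (as in FIG. 13) and θmax_X = 60° from the example above; both choices are taken from the examples, not mandated by the method.

```python
def gaze_angle_yaw(p_fp2_x, p_c_x, p_pp_x, theta_max_x=60.0):
    """Equation (1): theta_X = theta_max_X * (L_X_1 - L_X_2) / L_X_1."""
    l_x_1 = abs(p_c_x - p_fp2_x)     # distance from reference point FP2 to P_C_X
    l_x_2 = abs(p_pp_x - p_fp2_x)    # distance from reference point FP2 to P_PP_X
    return theta_max_x * (l_x_1 - l_x_2) / l_x_1

# With FP1 (outer corner) at x=0, FP2 (inner corner) at x=30, and therefore P_C_X at x=15:
print(gaze_angle_yaw(30, 15, 15))   # pupil at the centre      ->   0.0
print(gaze_angle_yaw(30, 15, 30))   # pupil at the inner corner -> +60.0
print(gaze_angle_yaw(30, 15, 0))    # pupil at the outer corner -> -60.0
```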
  • the line-of-sight angle calculation unit 11 calculates the positions P_FP3_Y and P_FP4_Y in the Y direction by using the information of the eye feature points.
  • the position P_FP3_Y corresponds to the upper eyelid FP3.
  • the position P_FP4_Y corresponds to the lower eyelid FP4.
  • the line-of-sight angle calculation unit 11 calculates the position P_C_Y in the Y direction (hereinafter referred to as "second reference position") based on the calculated positions P_FP3_Y and P_FP4_Y.
  • the second reference position P_C_Y corresponds to an intermediate position between the upper eyelid FP3 and the lower eyelid FP4. That is, the second reference position P_C_Y corresponds to the central portion of the intraocular fissure region with respect to the Y direction.
  • the line-of-sight angle calculation unit 11 calculates the position P_PP_Y with respect to the Y direction using the information on the pupil position.
  • the position P_PP_Y corresponds to the pupil position PP.
  • the line-of-sight angle calculation unit 11 calculates the interval L_Y_1 with respect to the second reference position P_C_Y and the interval L_Y_2 with respect to the position P_PP_Y for the position P_FP3_Y or the position P_FP4_Y.
  • FIG. 14 shows an example when the position P_FP3_Y is used as a reference for the intervals L_Y_1 and L_Y_2.
  • A value indicating the maximum value θmax_Y of the line-of-sight angle θ_Y is preset in the line-of-sight angle calculation unit 11.
  • The line-of-sight angle calculation unit 11 calculates the line-of-sight angle θ_Y by the following equation (2) using the preset maximum value θmax_Y and the calculated intervals L_Y_1 and L_Y_2.
  • θ_Y = θmax_Y × (L_Y_1 - L_Y_2) / L_Y_1 ... (2)
  • The maximum value θmax_Y is based on, for example, the following model M_Y. That is, in the model M_Y, if the position P_PP_Y is the same as the second reference position P_C_Y, the line-of-sight angle θ_Y is 0°. Further, in the model M_Y, if the position P_PP_Y is the same as the position P_FP3_Y, the line-of-sight angle θ_Y becomes a value (for example, +20°) corresponding to the maximum value θmax_Y.
  • Similarly, if the position P_PP_Y is the same as the position P_FP4_Y, the line-of-sight angle θ_Y becomes a value (for example, -20°) corresponding to the maximum value θmax_Y.
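  • The corresponding sketch of equation (2) follows, with the upper eyelid FP3 as the reference point for L_Y_1 and L_Y_2 (as in FIG. 14) and θmax_Y = 20° from the example above.

```python
def gaze_angle_pitch(p_fp3_y, p_c_y, p_pp_y, theta_max_y=20.0):
    """Equation (2): theta_Y = theta_max_Y * (L_Y_1 - L_Y_2) / L_Y_1."""
    l_y_1 = abs(p_c_y - p_fp3_y)     # distance from FP3 to the second reference position P_C_Y
    l_y_2 = abs(p_pp_y - p_fp3_y)    # distance from FP3 to the pupil position P_PP_Y
    return theta_max_y * (l_y_1 - l_y_2) / l_y_1

# With FP3 (upper eyelid) at y=0, FP4 (lower eyelid) at y=10, and P_C_Y at y=5:
print(gaze_angle_pitch(0, 5, 5))     # pupil at the centre       ->   0.0
print(gaze_angle_pitch(0, 5, 0))     # pupil at the upper eyelid -> +20.0
print(gaze_angle_pitch(0, 5, 10))    # pupil at the lower eyelid -> -20.0
```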
  • The method of calculating the line-of-sight angles θ_X and θ_Y is not limited to these specific examples.
  • Various known techniques can be used to calculate the line-of-sight angles θ_X and θ_Y. Detailed description of these techniques will be omitted.
  • the line-of-sight detection device 10 includes a line-of-sight angle calculation unit 11, and the line-of-sight angle calculation unit 11 calculates the line-of-sight angle ⁇ of the target person, but the present invention is not limited to this configuration.
  • the line-of-sight detection device 10 may include a line-of-sight direction calculation unit (not shown), and the line-of-sight direction calculation unit may calculate the line-of-sight direction of the target person.
  • Various known techniques can be used for the calculation method of the line-of-sight direction. Detailed description of these techniques will be omitted.
  • At least one of the line-of-sight angle and the line-of-sight direction of the target person calculated by the line-of-sight detection device 10 is used, for example, for determining the state of the target person.
  • the line-of-sight detection device 10 and the image pickup device 12 are mounted on the vehicle, and the image pickup device 12 takes an image of the driver as a target person.
  • the line-of-sight detection device 10 detects at least one of the line-of-sight angle and the line-of-sight direction of the driver who is the target person.
  • The driver monitoring device (not shown) mounted on the vehicle uses at least one of the driver's line-of-sight angle and line-of-sight direction to determine whether the driver's state is a predetermined state (hereinafter referred to as the "warning target state").
  • When the driver monitoring device determines that the driver is in the warning target state, it outputs a voice or the like to warn the driver.
  • Examples of the warning target state include a state in which the driver is looking away from the road, a state in which the driver does not check behind at a timing when a rear check is required, a state in which the driver's attention is low, a state in which the driver is dozing, and a state in which the driver is unable to drive.
  • Various known techniques can be used to determine the state subject to warning. Detailed description of these techniques will be omitted.
  • FIG. 15 is a flowchart showing an operation example of the line-of-sight detection device 10 according to the first embodiment. The process shown in FIG. 15 is repeatedly executed when a predetermined condition is satisfied (for example, when the ignition power of the vehicle equipped with the line-of-sight detection device 10 is turned on).
  • First, the image acquisition unit 2 acquires, from the image pickup device 12, the image I1 of the subject captured by the image pickup device 12 (step ST1).
  • the eye region extraction unit 3 extracts the eye region image I2 from the image I1 acquired by the image acquisition unit 2 (step ST2).
  • the pupil position detection unit 4 detects one or more pupil position PPs from the eye area image I2 extracted by the eye area extraction unit 3 (step ST3).
  • Next, the first reliability calculation unit 5 calculates, for each pupil position PP detected by the pupil position detection unit 4, the brightness difference between the pixel at the pupil position PP and the pixels around the pupil position PP, and calculates, based on the brightness difference, the first reliability, which is an index of the certainty of the pupil position PP (step ST4).
  • The line-of-sight angle calculation unit 11 then calculates the line-of-sight angle θ of the subject based on the pupil position PP having the highest first reliability calculated by the first reliability calculation unit 5 among the one or more pupil positions PP detected by the pupil position detection unit 4 (step ST5).
  • the pupil position detection device 1 includes an image acquisition unit 2, an eye region extraction unit 3, a pupil position detection unit 4, and a first reliability calculation unit 5.
  • the image acquisition unit 2 acquires an image captured by the subject.
  • the eye region extraction unit 3 extracts the eye region from the image acquired by the image acquisition unit 2.
  • the pupil position detection unit 4 detects one or more pupil positions from the eye region extracted by the eye region extraction unit 3.
  • The first reliability calculation unit 5 calculates, for each pupil position detected by the pupil position detection unit 4, the brightness difference between the pixel at the pupil position and the pixels around the pupil position, and calculates, based on the brightness difference, the first reliability, which is an index of the certainty of the pupil position.
  • As described above, the pupil position detection device 1 can evaluate the reliability of the pupil position without being affected by the imaging environment.
  • the line-of-sight detection device 10 includes a pupil position detection device 1 and a line-of-sight angle calculation unit 11.
  • the line-of-sight angle calculation unit 11 calculates the line-of-sight angle based on one or more pupil positions output by the pupil position detection device 1 and the first reliability. Since the line-of-sight detection device 10 calculates the line-of-sight angle based on the highly reliable pupil position evaluated by a method that is not affected by the imaging environment, the line-of-sight angle can be calculated accurately.
  • FIG. 16 is a block diagram showing a configuration example of the line-of-sight detection device 10 according to the second embodiment.
  • The line-of-sight detection device 10 according to the second embodiment has a configuration in which a second reliability calculation unit 6 and a third reliability calculation unit 7 are added to the line-of-sight detection device 10 of the first embodiment shown in FIG. 1.
  • the same or corresponding parts as those in FIG. 1 are designated by the same reference numerals, and the description thereof will be omitted.
  • the second reliability calculation unit 6 and the third reliability calculation unit 7 shown in FIG. 16 are realized by the processing circuit 100 shown in FIG. 2A.
  • the functions of the second reliability calculation unit 6 and the third reliability calculation unit 7 are realized by the processor 101 shown in FIG. 2B executing a program stored in the memory 102.
  • the pupil position detection unit 4 of the second embodiment outputs information including the position of the eye feature point and one or more pupil positions to the second reliability calculation unit 6 in addition to the first reliability calculation unit 5.
  • The first reliability calculation unit 5 of the second embodiment outputs information including the position of the eye feature point, one or more pupil positions, and the first reliability of each pupil position to the third reliability calculation unit 7.
  • The second reliability calculation unit 6 calculates the degree of spread of the plurality of pupil positions based on the positional relationship of the plurality of pupil positions.
  • the second reliability calculation unit 6 calculates the second reliability, which is an index of the certainty of the plurality of pupil positions, based on the calculated spread degree.
  • the second reliability calculation unit 6 outputs the calculated second reliability information to the third reliability calculation unit 7.
  • The pupil position evaluation method using the second reliability is based on the idea that when the spread of the detected pupil positions is large, it is highly possible that the pupil position detection unit 4 has erroneously detected the pupil position. Since the pupil position detection unit 4 tries to detect pupil positions within the pupil region, it tends to detect a plurality of pupil positions having substantially the same evaluation value E in the pupil region. Therefore, when the spread of the plurality of pupil positions detected by the pupil position detection unit 4 is small, it is highly possible that the plurality of pupil positions lie within the pupil region, and thus the plurality of pupil positions can be said to be highly reliable.
  • FIG. 17A is a diagram showing an example of three pupil positions PP1, PP2, and PP3 having a small degree of spread.
  • FIG. 17B is an explanatory diagram showing an example of a method for calculating the second reliability in the example of FIG. 17A.
  • the pupil position detection unit 4 detects three pupil positions PP1, PP2, PP3 from the eye region image I2.
  • the pupil position image unit U_21 corresponds to the pupil position PP1
  • the pupil position image unit U_22 corresponds to the pupil position PP2
  • the pupil position image unit U_23 corresponds to the pupil position PP3.
  • the second reliability calculation unit 6 sets the circumscribed rectangle 21 circumscribing the pupil position image units U_21, U_22, and U_23 with respect to the eye region image I2.
  • the second reliability calculation unit 6 calculates the diagonal length L1 based on the minimum coordinates (xmin, ymin) and the maximum coordinates (xmax, ymax) in the set circumscribing rectangle 21.
  • FIG. 18A is a diagram showing an example of three pupil positions PP4, PP5, and PP6 having a large degree of spread.
  • FIG. 18B is an explanatory diagram showing an example of a method for calculating the second reliability in the example of FIG. 18A.
  • the pupil position detection unit 4 detects three pupil positions PP4, PP5, and PP6 from the eye region image I2.
  • the pupil position image unit U_24 corresponds to the pupil position PP4
  • the pupil position image unit U_25 corresponds to the pupil position PP5
  • the pupil position image unit U_26 corresponds to the pupil position PP6.
  • the second reliability calculation unit 6 sets the circumscribed rectangle 22 circumscribing the pupil position image units U_24, U_25, and U_26 with respect to the eye region image I2.
  • the second reliability calculation unit 6 calculates the diagonal length L2 based on the minimum coordinates (xmin, ymin) and the maximum coordinates (xmax, ymax) in the set circumscribing rectangle 22.
  • the diagonal length L1 is a value representing the degree of spread of the pupil position image units U_21, U_22, and U_23.
  • the diagonal length L2 is a value representing the degree of spread of the pupil position image units U_24, U_25, and U_26.
  • the smaller the value of the diagonal length L2 the higher the reliability of the pupil positions PP4, PP5, PP6 corresponding to the pupil position image units U_24, U_25, U_26.
  • the second reliability calculation unit 6 may use the reciprocal of the diagonal lengths L1 and L2 as the second reliability, or may calculate the second reliability according to the diagonal lengths L1 and L2.
  • When calculating the second reliability according to the diagonal lengths L1 and L2, the shorter the diagonal length, the larger the calculated second reliability, and the longer the diagonal length, the smaller the calculated second reliability.
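  • A short sketch of the second reliability follows. The diagonal of the circumscribed rectangle is used as the degree of spread as described above; mapping it to a reliability via 1 / (1 + diagonal) is an assumption, since the text only requires that a shorter diagonal yield a larger second reliability.

```python
import math

def second_reliability(pupil_positions):
    """Degree of spread = diagonal of the circumscribed rectangle of the detected pupil
    positions; a shorter diagonal yields a larger second reliability."""
    xs = [x for x, _ in pupil_positions]
    ys = [y for _, y in pupil_positions]
    diagonal = math.hypot(max(xs) - min(xs), max(ys) - min(ys))   # L1 or L2
    return 1.0 / (1.0 + diagonal)          # assumed mapping: shorter diagonal -> higher value

# Tightly clustered candidates (as in FIG. 17A) score higher than scattered ones (FIG. 18A):
print(second_reliability([(10, 12), (11, 12), (10, 13)]))   # small spread -> high value
print(second_reliability([(4, 3), (14, 12), (9, 18)]))      # large spread -> low value
```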
  • In the above example, the second reliability calculation unit 6 sets a circumscribed rectangle circumscribing the plurality of pupil positions, but a figure having a shape other than a rectangle may be set. Further, although the second reliability calculation unit 6 calculates the diagonal length as the degree of spread, an index other than the diagonal length may be calculated as long as it indicates the degree of spread (variation) of the plurality of pupil positions.
  • When the pupil position detection unit 4 detects only one pupil position, the second reliability calculation unit 6 may calculate the degree of spread of that one pupil position. In this case, the degree of spread is the minimum value, and the second reliability is the maximum value.
  • The second reliability calculation unit 6 may also acquire, from the pupil position detection unit 4, a plurality of pupil positions corresponding to evaluation values E larger than a predetermined value among the plurality of evaluation values E. In this case, the second reliability calculation unit 6 calculates the degree of spread of the plurality of pupil positions acquired from the pupil position detection unit 4 and, as a result, can evaluate the reliability of the pupil position corresponding to the maximum evaluation value E.
  • The third reliability calculation unit 7 calculates the third reliability using the first reliability calculated by the first reliability calculation unit 5 and the second reliability calculated by the second reliability calculation unit 6.
  • the third reliability calculation unit 7 outputs information including the position of the eye feature point, one or more pupil positions, and the third reliability of each pupil position to the line-of-sight angle calculation unit 11. A method of calculating the third reliability will be described with reference to FIG.
  • FIG. 19 is an explanatory diagram showing an example of a method for calculating the third reliability.
  • the first reliability calculation unit 5 calculates that the first reliability of the pupil position PP4 is 60%, the first reliability of the pupil position PP5 is 90%, and the first reliability of the pupil position PP6 is 55%.
  • the second reliability calculation unit 6 calculates that the second reliability common to the pupil positions PP4, PP5, and PP6 is 30%.
  • The third reliability calculation unit 7 uses a predetermined weight for the first reliability (for example, 0.8) and a predetermined weight for the second reliability (for example, 0.2) to calculate, as shown in FIG. 19, the third reliability of the pupil position PP4 as 54%, the third reliability of the pupil position PP5 as 78%, and the third reliability of the pupil position PP6 as 50%.
  • In this example, the third reliability calculation unit 7 calculates, as the third reliability, the weighted average of the first reliability and the second reliability, but the third reliability may be calculated by another method such as a simple average.
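  • A sketch of the third reliability as the weighted average illustrated in FIG. 19 follows; the weights 0.8 and 0.2 are the example values given above.

```python
def third_reliability(first, second, w1=0.8, w2=0.2):
    """Weighted average of the first and second reliabilities (weights from FIG. 19)."""
    return w1 * first + w2 * second

# Reproducing FIG. 19: first reliabilities of 60%, 90%, and 55% combined with a common
# second reliability of 30% give third reliabilities of about 54%, 78%, and 50%.
for f in (60, 90, 55):
    print(third_reliability(f, 30))      # approximately 54.0, 78.0, 50.0
```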
  • the line-of-sight angle calculation unit 11 acquires information output by the pupil position detection device 1 including the position of the eye feature point, one or more pupil positions, and the third reliability of each pupil position.
  • The line-of-sight angle calculation unit 11 calculates the line-of-sight angle θ of the subject using the acquired information. For example, when the pupil position detection device 1 outputs the above information regarding only the right eye, the line-of-sight angle calculation unit 11 selects the pupil position having the highest third reliability from the one or more pupil positions included in this information and, if the third reliability of the selected pupil position is equal to or higher than a predetermined threshold value, calculates the line-of-sight angle θ based on the selected pupil position.
  • Likewise, when the information includes two or more pupil positions, the line-of-sight angle calculation unit 11 selects the pupil position having the highest third reliability from among them and, if the third reliability of the selected pupil position is equal to or higher than the above threshold value, calculates the line-of-sight angle θ based on the selected pupil position. The line-of-sight angle calculation unit 11 does not calculate the line-of-sight angle θ when there is no pupil position whose third reliability is equal to or higher than the above threshold value.
  • The line-of-sight angle calculation unit 11 may also acquire information on the second reliability from the pupil position detection device 1 and determine whether or not to calculate the line-of-sight angle θ based on the second reliability. For example, when the second reliability calculated for a certain eye area image I2 is less than a predetermined threshold value, the line-of-sight angle calculation unit 11 discards the eye area image I2 and does not calculate the line-of-sight angle θ. On the other hand, when the second reliability calculated for the eye area image I2 is equal to or higher than the above threshold value, the line-of-sight angle calculation unit 11 calculates the line-of-sight angle θ based on the pupil position having the highest third reliability among the pupil positions detected from the eye area image I2.
  • The pupil position detection device 1 shown in FIG. 16 has a configuration in which the third reliability is calculated from the first reliability and the second reliability and the third reliability is output to the line-of-sight angle calculation unit 11, but the first reliability and the second reliability may instead be output to the line-of-sight angle calculation unit 11 as they are.
  • In that case, the pupil position detection device 1 does not include the third reliability calculation unit 7.
  • When the second reliability is less than a predetermined threshold value, the line-of-sight angle calculation unit 11 discards the eye area image I2 and does not calculate the line-of-sight angle θ. On the other hand, when the second reliability is equal to or higher than the threshold value, the line-of-sight angle calculation unit 11 calculates the line-of-sight angle θ based on the pupil position having the highest first reliability among the pupil positions detected from the eye area image I2.
  • FIG. 20 is a flowchart showing an operation example of the line-of-sight detection device 10 according to the second embodiment.
  • the process shown in FIG. 20 is repeatedly executed when a predetermined condition is satisfied (for example, when the ignition power of the vehicle equipped with the line-of-sight detection device 10 is turned on). Since the operations of steps ST1 to ST4 in FIG. 20 are the same as the operations of steps ST1 to ST4 in FIG. 15, the description thereof will be omitted.
  • In step ST11, the second reliability calculation unit 6 calculates the degree of spread of the plurality of pupil positions PP based on the positional relationship of the plurality of pupil positions PP detected by the pupil position detection unit 4, and calculates, based on the degree of spread, the second reliability, which is an index of the certainty of the plurality of pupil positions PP.
  • the operation of step ST4 and the operation of step ST11 may be performed in the reverse order or in parallel.
  • In step ST12, the third reliability calculation unit 7 calculates, as the third reliability, the weighted average of the first reliability calculated by the first reliability calculation unit 5 for each of the one or more pupil positions PP and the second reliability, common to the one or more pupil positions PP, calculated by the second reliability calculation unit 6.
  • In step ST13, the line-of-sight angle calculation unit 11 calculates the line-of-sight angle θ of the subject based on the pupil position PP having the highest third reliability calculated by the third reliability calculation unit 7 among the one or more pupil positions PP detected by the pupil position detection unit 4.
  • the pupil position detecting device 1 includes the second reliability calculation unit 6.
  • The second reliability calculation unit 6 calculates the degree of spread of the plurality of pupil positions based on the positional relationship of the plurality of pupil positions, and calculates, based on the degree of spread, the second reliability, which is an index of the certainty of the plurality of pupil positions. Since the degree of spread of the plurality of pupil positions is not affected by the imaging environment, the pupil position detection device 1 can evaluate the reliability of the pupil positions without being affected by the imaging environment.
  • the pupil position detection device 1 includes a third reliability calculation unit 7.
  • The third reliability calculation unit 7 calculates the third reliability using the first reliability calculated by the first reliability calculation unit 5 and the second reliability calculated by the second reliability calculation unit 6. By combining the first reliability and the second reliability, which are based on different evaluation methods, the pupil position detection device 1 can evaluate the reliability of the pupil position more accurately.
  • In the above examples, the functions of the image acquisition unit 2, the eye area extraction unit 3, the pupil position detection unit 4, the first reliability calculation unit 5, the second reliability calculation unit 6, the third reliability calculation unit 7, and the line-of-sight angle calculation unit 11 are integrated in an in-vehicle device mounted on a vehicle, but they may be distributed among a server device on a network, a mobile terminal such as a smartphone, the in-vehicle device, and the like.
  • The line-of-sight detection device according to the present disclosure is suitable for use in a driver monitoring device that monitors the state of a driver of a vehicle, an occupant monitoring device that monitors the state of each occupant of a vehicle including the driver, and the like.
  • 1: pupil position detection device, 2: image acquisition unit, 3: eye area extraction unit, 4: pupil position detection unit, 5: first reliability calculation unit, 6: second reliability calculation unit, 7: third reliability calculation unit, 10: line-of-sight detection device, 11: line-of-sight angle calculation unit, 12: image pickup device, 21, 22: circumscribed rectangle, 100: processing circuit, 101: processor, 102: memory.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Biomedical Technology (AREA)
  • Human Computer Interaction (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

An eye region extraction unit (3) extracts an eye region from an image acquired by an image acquisition unit (2). A pupil position detection unit (4) detects at least one pupil position from the eye region extracted by the eye region extraction unit (3). A first reliability calculation unit (5) calculates, for each pupil position detected by the pupil position detection unit (4), a luminance difference between a pixel at the pupil position and a pixel on the periphery of the pupil position, and calculates, on the basis of the luminance difference, a first reliability serving as an index of the likelihood of the pupil position.
PCT/JP2020/007601 2020-02-26 2020-02-26 Dispositif de détection de position de pupille, dispositif de détection de ligne visuelle et procédé de détection de position de pupille Ceased WO2021171395A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/007601 WO2021171395A1 (fr) 2020-02-26 2020-02-26 Dispositif de détection de position de pupille, dispositif de détection de ligne visuelle et procédé de détection de position de pupille

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/007601 WO2021171395A1 (fr) 2020-02-26 2020-02-26 Dispositif de détection de position de pupille, dispositif de détection de ligne visuelle et procédé de détection de position de pupille

Publications (1)

Publication Number Publication Date
WO2021171395A1 true WO2021171395A1 (fr) 2021-09-02

Family

ID=77490795

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/007601 Ceased WO2021171395A1 (fr) 2020-02-26 2020-02-26 Dispositif de détection de position de pupille, dispositif de détection de ligne visuelle et procédé de détection de position de pupille

Country Status (1)

Country Link
WO (1) WO2021171395A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006025969A (ja) * 2004-07-14 2006-02-02 Matsushita Electric Ind Co Ltd 瞳孔検出装置および虹彩認証装置
WO2010035472A1 (fr) * 2008-09-26 2010-04-01 パナソニック株式会社 Dispositif et procédé de détermination de la direction d’une ligne de vision
US20140098198A1 (en) * 2012-10-09 2014-04-10 Electronics And Telecommunications Research Institute Apparatus and method for eye tracking
JP2017010337A (ja) * 2015-06-23 2017-01-12 富士通株式会社 瞳孔検出プログラム、瞳孔検出方法、瞳孔検出装置および視線検出システム
JP2017182739A (ja) * 2016-03-31 2017-10-05 富士通株式会社 視線検出装置、視線検出方法及び視線検出用コンピュータプログラム

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
TIMM, FABIAN ET AL.: "ACCURATE EYE CENTRE LOCALISATION BY MEANS OF GRADIENTS", Proceedings of the International Conference on Computer Vision Theory and Applications, 7 March 2011 (2011-03-07), pages 125 - 130, XP055208015 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3132967A1 (fr) 2022-02-23 2023-08-25 Patrick CHARREYRON Procédé de génération d’une image reconstituée et dispositif de génération d’une telle image reconstituée.

Similar Documents

Publication Publication Date Title
US20250131583A1 (en) Line-of-sight direction tracking method and apparatus
US11681366B2 (en) Gaze tracking using mapping of pupil center position
ES2756677T3 (es) Aparato de procesamiento de imágenes, procedimiento de procesamiento de imágenes y programa
US10842364B2 (en) Image processing device, endoscope apparatus, information storage device, and image processing method
KR101950562B1 (ko) 시선 추적
US10163027B2 (en) Apparatus for and method of processing image based on object region
US20210241487A1 (en) Pupil positioning method and apparatus, vr/ar apparatus and computer readable medium
JP6582604B2 (ja) 瞳孔検出プログラム、瞳孔検出方法、瞳孔検出装置および視線検出システム
US10417495B1 (en) Systems and methods for determining biometric information
US10311583B2 (en) Eye motion detection method, program, program storage medium, and eye motion detection device
US20120177266A1 (en) Pupil detection device and pupil detection method
US12148189B2 (en) Iris recognition system, iris recognition method, and storage medium
US9082000B2 (en) Image processing device and image processing method
JP5776323B2 (ja) 角膜反射判定プログラム、角膜反射判定装置および角膜反射判定方法
US11636711B2 (en) Image processing system, image processing method, and storage medium
CN115484860B (zh) 超光谱视网膜图像中的阴影的实时检测和校正
WO2020093566A1 (fr) Procédé et dispositif de traitement d'image d'hémorragie cérébrale, dispositif informatique et support d'informations
WO2019024350A1 (fr) Procédé et appareil de reconnaissance biométrique
CN109635761B (zh) 一种虹膜识别图像确定方法、装置、终端设备及存储介质
WO2021171395A1 (fr) Dispositif de détection de position de pupille, dispositif de détection de ligne visuelle et procédé de détection de position de pupille
JP2018101211A (ja) 車載器
JP2018101212A (ja) 車載器および顔正面度算出方法
WO2021171396A1 (fr) Dispositif de détection de position de pupille, appareil de détection de ligne visuelle et procédé de détection de position de pupille
WO2020053984A1 (fr) Dispositif d'authentification biométrique, programme de détermination de contrefaçon et procédé de détermination de contrefaçon
WO2021140579A1 (fr) Dispositif de détection de pupille, dispositif de détection de ligne de visée, système de surveillance d'occupant et procédé de détection de pupille

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20922224

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 20922224

Country of ref document: EP

Kind code of ref document: A1