
WO2015075894A1 - Imaging device, pupil imaging device, pupil-diameter measurement device, pupil-state detection device, and pupil imaging method


Info

Publication number
WO2015075894A1
WO2015075894A1 (application PCT/JP2014/005642; JP2014005642W)
Authority
WO
WIPO (PCT)
Prior art keywords
pupil
light
image
imaging
wavelength
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2014/005642
Other languages
French (fr)
Japanese (ja)
Inventor
Hiroshi Imai (今井 浩)
Masao Imai (今井 雅雄)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Corp
Original Assignee
NEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corp filed Critical NEC Corp
Priority to JP2015548974A priority Critical patent/JPWO2015075894A1/en
Publication of WO2015075894A1 publication Critical patent/WO2015075894A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/08 Measuring arrangements characterised by the use of optical techniques for measuring diameters
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/11 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for measuring interpupillary distance or diameter of pupils
    • A61B 3/112 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for measuring diameter of pupils
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/14 Arrangements specially adapted for eye photography

Definitions

  • The present invention relates to an imaging device, a pupil imaging device, a pupil diameter measuring device, a pupil state detecting device, and a pupil imaging method, and in particular to an imaging device having a single imaging element, together with a pupil imaging device, a pupil diameter measuring device, a pupil state detecting device, and a pupil imaging method based on it.
  • Patent Documents 1 to 3 describe methods that prepare two sets of imaging devices and measure position and the like by triangulation in order to obtain accurate three-dimensional coordinates when the object moves back and forth.
  • However, the methods of Patent Documents 1 to 3 require at least two cameras, which increases the size and cost of the apparatus.
  • An object of the present invention is to provide an imaging apparatus that solves the above-described problems.
  • The image pickup apparatus of the present invention includes one image pickup device and an image forming unit that forms, on that one image pickup device, a plurality of images from different viewpoints of a common portion of a subject irradiated with light from a light source.
  • According to the present invention, an imaging apparatus whose increase in size and cost is suppressed can be provided.
  • A top view of the imaging device in the first embodiment is shown.
  • A top view of the imaging device in the first embodiment when it has an arbitrary light source is shown.
  • A top view of the imaging device in the first embodiment when it has an arbitrary dividing means is shown.
  • A top view of the imaging device in the first embodiment when it has an imaging means is shown.
  • A top view of the pupil imaging device in the second embodiment is shown.
  • A top view of the pupil diameter measuring apparatus in the third embodiment is shown.
  • A top view of the pupil diameter measuring apparatus in the third embodiment, in which the pupil diameter estimating means comprises a pupil extracting means and a pupil diameter calculating means, is shown.
  • A flowchart showing the flow up to calculating the three-dimensional coordinates and dimensions of one or more pupils from one captured image is shown.
  • For the pupil diameter measuring apparatus of the third embodiment, (a) an example of an image captured by the imaging device 8, (b) the intensity distribution of the image along the line segment PQ in (a), and (c) an example of obtaining the circumference of the extracted pupil portion 21 and the coordinates of that circumference are shown, together with a figure explaining the method of calculating the three-dimensional coordinates of the pupil in the third embodiment.
  • A top view of the pupil state detection apparatus in the third embodiment is shown, together with a flowchart showing the processing when a pupil state determination means is provided in the third embodiment.
  • A top view of the pupil imaging device in the fourth embodiment is shown, together with a figure showing an example of its specific structure, figures showing the optical paths of the light of the first wavelength and of the second wavelength, and a figure showing the structure of a prism body provided with a total reflection surface and a wavelength selection film.
  • A top view of the pupil diameter measuring apparatus in the fifth embodiment is shown, together with a flowchart for the case where the pupil diameter estimating means comprises a pupil extracting means and a pupil diameter calculating means.
  • A flowchart showing the operation in the fifth embodiment is shown.
  • For the pupil diameter measuring apparatus in the fifth embodiment, (a) the intensity distribution of the image captured by the first image pickup element 44 and (b) that of the image captured by the second image pickup element 45 are shown, together with a figure explaining the effect of the fifth embodiment.
  • FIG. 1 is a top view of the imaging apparatus 100 according to the present embodiment.
  • The imaging apparatus 100 according to the present embodiment includes one light source 2a, a half mirror 2ba, mirrors 2bb to 2bd, one imaging element 5, and a lens 6b.
  • The light source 2a emits light toward the half mirror 2ba.
  • The half mirror 2ba divides the incident light from the light source 2a into light that passes through the half mirror 2ba and travels toward the mirror 2bb, and light that is reflected by the half mirror 2ba and travels toward the mirror 2bc.
  • The mirror 2bc reflects the incident light from the half mirror 2ba toward the mirror 2bd.
  • The mirrors 2bb and 2bd direct the light onto the subject 1 from different positions.
  • FIG. 1 shows the optical axis A and the optical axis B as the two optical axes along which the light is incident on the subject 1 from the mirrors 2bb and 2bd.
  • In FIG. 1, the optical axis A and the optical axis B are parallel; however, they need not be parallel as long as they do not intersect and the light can be incident on different positions of the subject 1.
  • the light from the subject 1 on the optical axis A is reflected by the mirror 2bb and the half mirror 2ba and is incident on the lens 6b, and an image on the optical axis A (for example, the first viewpoint 3) is formed on the image sensor 5.
  • the mirror 2bd and the mirror 2bc reflect the light from the subject 1 on the optical axis B and make the light incident on the half mirror 2ba.
  • the half mirror 2ba makes the light from the mirror 2bc incident on the lens 6b.
  • the half mirror 2ba forms an image on the image sensor 5 on the optical axis B (for example, the second viewpoint 4).
  • On the image sensor 5, at least the light from the subject at the first viewpoint 3 and the second viewpoint 4, which are two different viewpoints, is simultaneously imaged by the lens 6b.
  • The light reflected by the half mirror 2ba toward the mirror 2bc is reflected by the mirror 2bc onto the mirror 2bd, travels parallel to the light that passed through the half mirror 2ba toward the mirror 2bb, and is incident on the subject 1 at a different position.
  • The images at the first viewpoint 3 and the second viewpoint 4 each include at least an image of the feature point 1a, which is a common part of the subject 1; that is, at least an image of the feature point 1a is formed on the image sensor 5.
  • With the configuration of the imaging apparatus 100 according to the present embodiment, the light of the subject 1 from different viewpoints can be imaged on one imaging element 5 by a single optical system, so the apparatus has a simple configuration and can be reduced in size.
  • In addition, the exact position of the feature point 1a can be obtained from the captured image by triangulation.
  • The example described above uses the single light source 2a, the half mirror 2ba, and the mirrors 2bb, 2bc, and 2bd, but the present embodiment is not necessarily limited to these components.
  • The light source 2a shown in FIG. 1 may be replaced by an arbitrary light source 2 at any position and angle, such as natural light or indoor lighting.
  • An arbitrary dividing means 2b may be used instead of the half mirror 2ba of FIG. 1.
  • In the description above, the light from the subject 1 at the first viewpoint 3 and the second viewpoint 4 is imaged on the image sensor 5, but it suffices to image the light from the subject 1 at two or more different viewpoints; if light from two or more viewpoints is imaged, three-dimensional coordinates can be calculated by triangulation.
  • The imaging device 5 may be a known device such as a CMOS (complementary metal oxide semiconductor) sensor or a CCD (charge-coupled device).
  • The optical path length between the half mirror 2ba and the mirror 2bb is L, and the optical path lengths between the half mirror 2ba and the mirror 2bc and between the mirror 2bc and the mirror 2bd are each L/2; that is, their sum is also L.
  • The elements are therefore arranged so that the optical path length of the light travelling along the path toward the optical axis A and that of the light travelling along the path toward the optical axis B, both emitted in the direction of the subject 1, are equal.
  • Consequently, the light from the subject 1 from the direction of the optical axis A and the light from the subject 1 from the direction of the optical axis B are imaged on the image sensor 5 with the same optical path length.
  • Because each light is imaged with the same optical path length, accurate three-dimensional coordinates can be calculated from the captured image by triangulation. In the present embodiment itself, however, the processing for obtaining three-dimensional coordinates from the formed image is not performed.
  • Triangulation is described in detail in the description of the configuration and operation of the third embodiment below. As long as their sum is L, the optical path length between the half mirror 2ba and the mirror 2bc and that between the mirror 2bc and the mirror 2bd may be set to arbitrary values.
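  • As a rough illustration of how triangulation over two parallel viewpoints can recover a feature point's position, the following is a minimal parallel-axis stereo sketch; the baseline, focal length, and image coordinates are invented example values, not taken from this publication.

```python
# Illustrative sketch (not from the patent): recovering the position of a
# feature point from its image coordinates at two parallel viewpoints.
# The baseline d and focal length f below are hypothetical example values.

def triangulate(x_a, x_b, d, f):
    """Parallel-axis stereo: x_a and x_b are the image X coordinates (in the
    same units as f) of the same feature seen from viewpoints A and B, d is
    the baseline between the optical axes, and f is the lens focal length."""
    disparity = x_a - x_b
    if disparity == 0:
        raise ValueError("feature at infinity: zero disparity")
    z = f * d / disparity          # perpendicular distance from the lens
    x = z * x_a / f                # lateral position along the baseline
    return x, z

# Example: f = 15 mm, baseline d = 30 mm, disparity 0.9 mm -> Z = 500 mm
x, z = triangulate(1.5, 0.6, 30.0, 15.0)
print(round(z, 1), round(x, 1))
```

The same function applies to any pair of viewpoints with parallel, non-intersecting optical axes, which is why two or more imaged viewpoints suffice for three-dimensional measurement.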
  • The configuration for imaging the light from the subject 1 at the first viewpoint 3 and the second viewpoint 4 on the image sensor 5 need not use exactly the numbers of mirrors, half mirrors, and lenses described above.
  • The above configuration can be replaced by an imaging means 6 comprising an optical system of arbitrary mirrors, half mirrors, lenses, and the like, as long as it can image the light from the subject 1 at the first viewpoint 3 and the second viewpoint 4 on the image sensor 5.
  • The arbitrary light source 2 shown in FIG. 2 may be configured from an optical system such as the light source 2a, the half mirror 2ba, and a mirror 2aa.
  • If the optical paths of the light source are shared with the optical system constituting the imaging means 6, the imaging apparatus 100 can be made smaller.
  • Moreover, since the internal configuration of the apparatus is simplified, the imaging apparatus 100 can be manufactured at lower cost.
  • FIG. 5 is a top view showing a configuration of the pupil imaging device 200 according to the present embodiment.
  • the pupil imaging device 200 of the present embodiment includes a light source 11, a wavelength selection unit 12, and an imaging unit 20, as shown in FIG.
  • the light source 11 emits light having a predetermined wavelength.
  • the wavelength selection unit 12 selectively transmits light having a predetermined wavelength emitted from the light source 11.
  • The image forming means 20 forms, on one image pickup device 13 and via the wavelength selecting means 12, images at different viewpoints of at least a part of the face including at least one pupil 14a of the detected person 14 irradiated with light of the predetermined wavelength from the light source 11 via the wavelength selecting means 12.
  • The different viewpoints are on different optical axes substantially parallel to the direction in which the light source 11 emits light, and those optical axes are parallel to each other.
  • Specifically, the pupil imaging device 200 of the second embodiment includes the light source 11, the wavelength selection unit 12, the single imaging device 13, the first viewpoint 15, the second viewpoint 16, and the imaging unit 20.
  • An example in which the imaging unit 20 includes a half mirror 20a, mirrors 20b to 20d, and a lens 20e will be described.
  • the operation of the pupil imaging apparatus 200 of the present embodiment will be described.
  • the process until the incident light from the light source 11 irradiates the detection subject 14 will be described.
  • the light emitted from the light source 11 enters the half mirror 20a.
  • Incident light at the half mirror 20a is split into light that passes through the half mirror 20a and travels in the direction of the mirror 20b, and light that reflects off the half mirror 20a and travels in the direction of the mirror 20c.
  • The light reflected by the half mirror 20a toward the mirror 20c is reflected by the mirror 20c onto the mirror 20d, travels parallel to the light that passed through the half mirror 20a toward the mirror 20b, and is incident on the detected person 14 at a different position.
  • The light from the detected person 14 at the first viewpoint 15 and the second viewpoint 16 includes, at each viewpoint, light from at least a part of the face including at least one pupil 14a of the detected person 14; that is, an image of at least a part of the face including one or more pupils 14a of the detected person 14 is formed on the image sensor 13. Since the optical axes A and B are substantially parallel to the light incident from the detected person 14, fundus reflection light is captured at the pupil portion of the formed image.
  • In the pupil imaging apparatus 200 according to the present embodiment, at least a part of the face including one or more pupils 14a of the detected person 14, seen from two different viewpoints, can be imaged on one imaging element 13 by a single optical system, so the apparatus has a simple configuration and can be reduced in size. Furthermore, with such an apparatus, an image of the pupil 14a can be captured from which the three-dimensional coordinates around the pupil and the pupil diameter can be accurately obtained using triangulation.
  • the pupil imaging apparatus 200 may be used to calculate and measure a human feature amount such as the intensity of the pupil image of the detected person and the reflected light distribution of the face in addition to the three-dimensional coordinates around the pupil.
  • The pupil diameter estimating means 21 includes a pupil extracting means 21a and a pupil diameter calculating means 21b.
  • The pupil extracting means 21a extracts at least two regions corresponding to one or more pupils 14a from the one image formed on the image sensor 13.
  • From the at least two extracted regions, the pupil diameter calculating means 21b calculates the three-dimensional coordinates and dimensions of the one or more pupils 14a. Since there are usually two pupils 14a, and each appears at both viewpoints, a total of four regions are extracted by the pupil extracting means 21a.
  • The flow in which the pupil extracting means 21a extracts regions from one image captured by the single image sensor 13 and the pupil diameter calculating means 21b calculates the three-dimensional coordinates and dimensions of one or more pupils 14a is described with reference to the flowchart of FIG. 8.
  • First, the pupil extracting means 21a captures the one image formed on the image sensor 13 (S2; image capture). The pupil extracting means 21a then binarizes the captured image, extracts the four regions corresponding to the pupils 14a in the image, and outputs them to the pupil diameter calculating means 21b (S3; pupil part extraction). From these four regions, the pupil diameter calculating means 21b acquires the coordinates (X and Y coordinates) in the plane of the image sensor 13 around each pupil 14a in the image (S4; coordinate acquisition).
  • Next, using the obtained X and Y coordinates together with the calculation methods shown in Equations 1 to 3 described later, the pupil diameter calculating means 21b obtains by triangulation the perpendicular distance (the Z coordinate) from the image sensor 13 to the real point that includes the pupil 14a (S5; acquisition of the three-dimensional coordinates of the pupil).
  • The pupil diameter calculating means 21b then calculates the pupil diameter from the obtained three-dimensional coordinates (S6; pupil diameter calculation) and records the calculated pupil diameter (S7; pupil diameter recording).
  • In the flowchart of FIG. 8, the process returns to step S1 after the process of S7 is completed.
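  • As a rough illustration of steps S2 to S4 (not code from the patent), the following Python sketch binarizes an image and records the min/max coordinates of each bright region; the threshold, the synthetic image, and the naive grouping-by-column-gap heuristic are all illustrative assumptions standing in for a real connected-component labelling step.

```python
import numpy as np

# Hypothetical sketch of S2-S4: binarize the captured image and record the
# min/max X and Y coordinates of each bright (fundus-reflection) region.

def extract_pupil_regions(image, threshold):
    """Return a list of (xmin, xmax, ymin, ymax) for each group of bright
    pixels, grouping by gaps in the sorted column indices (adequate only
    for regions that are well separated horizontally)."""
    mask = image >= threshold                     # S3: binarization
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return []
    order = np.argsort(xs)                        # sort pixels by column
    xs, ys = xs[order], ys[order]
    regions, start = [], 0
    for i in range(1, xs.size + 1):
        if i == xs.size or xs[i] - xs[i - 1] > 1:  # column gap ends a region
            gx, gy = xs[start:i], ys[start:i]
            regions.append((gx.min(), gx.max(), gy.min(), gy.max()))
            start = i
    return regions                                # S4: coordinate acquisition

img = np.zeros((8, 16))
img[3:5, 2:4] = 1.0    # synthetic left pupil blob
img[3:5, 10:13] = 1.0  # synthetic right pupil blob
print(extract_pupil_regions(img, 0.5))
```

In the apparatus described above, four such regions (two pupils at two viewpoints) would be extracted and their extreme coordinates passed on to the triangulation step.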
  • The image captured in S2 of FIG. 8, that is, the image captured by the imaging device 8, is a facial image of the detected person 14 as shown in the example of FIG. 9A.
  • This image is one in which the facial images of the detected person 14 viewed from the directions of the optical axis A and the optical axis B in FIG. 5 overlap.
  • FIG. 9B shows the intensity distribution of the image along the line segment PQ in FIG. 9A; the pupil portions 22 of the detected person 14 viewed from the directions of the optical axis A and the optical axis B are captured with high intensity values.
  • As shown in FIG. 9C, the pupil extracting means 21a acquires and records the coordinates of the circumference of each extracted pupil portion 21.
  • For the X component of the recorded coordinates (the direction of the line segment PQ in FIG. 9A), the following are recorded: the minimum coordinate XARmin and the maximum coordinate XARmax of the right-eye image in the optical axis A direction; the minimum coordinate XALmin and the maximum coordinate XALmax of the left-eye image in the optical axis A direction; the minimum coordinate XBRmin and the maximum coordinate XBRmax of the right-eye image in the optical axis B direction; and the minimum coordinate XBLmin and the maximum coordinate XBLmax of the left-eye image in the optical axis B direction.
  • The Y coordinates can be recorded in the same way.
  • FIG. 10 is a diagram for explaining the operation of the pupil diameter calculating means 21 of the pupil diameter measuring apparatus 300 of the present embodiment.
  • The position of the principal point of the lens 20e is taken as the origin, the optical axis A direction of the lens 20e as the Z direction, the horizontal direction in the plane of FIG. 10 (perpendicular to the optical axis A) as the X direction, and the depth direction of FIG. 10 as the Y direction.
  • the optical axes A and B in FIG. 10 and the optical axes A and B in FIG. 5 are the same.
  • FIG. 10 re-expresses the optical axes A and B of FIG. 5 linearly for each optical axis, together with the lens 20e and the image sensor 13.
  • The images viewed along the respective optical axes in FIG. 5 are formed at the viewpoints A and B (corresponding to the first viewpoint 15 and the second viewpoint 16) on the image sensor 13 and on the optically equivalent image sensor 13a in FIG. 10.
  • The minimum X coordinate XARmin of the right end of the right-eye pupil at the viewpoint A and the minimum X coordinate XBRmin of the right end of the right-eye pupil at the viewpoint B, recorded by the pupil extracting means 21a, are in the positional relationship shown in FIG. 10.
  • a method of measuring the three-dimensional coordinates E (XRmin, YRmin, ZRmin) will be described by taking the right end of the right eye pupil 14a shown in FIG. 10 as an example.
  • The distance d between the optical axes A and B is L + L/2 from FIG. 5, and f is the focal length of the lens 20e.
  • The lens 20e and the image sensor 13 with respect to the optical axis B in FIG. 5 are represented in FIG. 10 as an optically equivalent lens 20ea and an optically equivalent image sensor 13a.
  • The description below uses this optically equivalent diagram.
  • The detected person 14, the optical axes A and B, and the viewpoints A and B are in the same relationship as in triangulation (stereo measurement); therefore, the three-dimensional coordinates E (XRmin, YRmin, ZRmin) of the right end of the right-eye pupil 14a are expressed by Equations 1 to 3.
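  • Equations 1 to 3 themselves are not reproduced in this text. Under the parallel-axis stereo geometry just described, they presumably take the standard triangulation form below (a reconstruction, not the published equations), where d is the baseline between the optical axes, f is the focal length of the lens 20e, XARmin and XBRmin are the recorded image coordinates of the point at the viewpoints A and B, and YARmin is the corresponding image Y coordinate at the viewpoint A:

```latex
\begin{align}
  Z_{R\mathrm{min}} &= \frac{d\,f}{X_{AR\mathrm{min}} - X_{BR\mathrm{min}}}
    && \text{(Eq.\ 1: depth from the disparity between the viewpoints)} \\
  X_{R\mathrm{min}} &= \frac{Z_{R\mathrm{min}}}{f}\,X_{AR\mathrm{min}}
    && \text{(Eq.\ 2: lateral position scaled back from the image plane)} \\
  Y_{R\mathrm{min}} &= \frac{Z_{R\mathrm{min}}}{f}\,Y_{AR\mathrm{min}}
    && \text{(Eq.\ 3: vertical position, same scaling)}
\end{align}
```

Applying the same relations to the other recorded extreme coordinates yields the three-dimensional positions of both ends of each pupil, from which the pupil diameter follows as a Euclidean distance.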
  • The pupil diameter measuring device 300 has been described above; by further including a pupil state determining means 24, a pupil state detecting device 400 can be obtained.
  • The differences between the pupil state detection apparatus 400 having the pupil state determination means 24 and the pupil diameter measuring apparatus 300 are described with reference to the drawing.
  • The processing when the pupil state determination means 24 is provided is described with reference to the flowchart of FIG. 12.
  • the pupil state determination means 24 determines the psychological state and physiological state of the detected person 14 from the three-dimensional coordinates and dimensions of one or more pupils 14a output from the pupil diameter calculation means 21b.
  • FIG. 12 shows a flowchart when the pupil state determination means 24 performs frequency analysis on the three-dimensional coordinates and dimensions of one or more pupils 14a and determines the psychological state and physiological state of the detected person 14 from the result.
  • As before, the pupil diameter calculation means 21b calculates the pupil diameter at a predetermined cycle in S6.
  • The pupil diameter calculated in S6 is recorded in S7; in S8, the pupil state determination means 24 analyzes the pupil diameter data collected in time series and determines the psychological state and physiological state of the detected person 14 from the analysis result.
  • The process then returns to step S1. Step S8 and subsequent steps may be performed in parallel with the processing of S1 to S7, or separately.
  • In this example, the pupil diameter calculation means 21b calculates the pupil diameter data as time-series data every 1/60 second.
  • The calculated pupil diameter data can be recorded by the pupil state determination means 24.
  • The pupil state determination means 24 can then apply a wavelet transform to the recorded time-series data as frequency analysis.
  • The frequency range of the wavelet transform is divided at a boundary of 0.15 Hz into a low frequency band (0.04 to 0.15 Hz) and a high frequency band (0.16 to 0.5 Hz). The psychological state can then be determined from the ratio of the amplitude in the low frequency band to that in the high frequency band.
  • The parasympathetic nerve is dominant in the low frequency band, and the sympathetic nerve is dominant in the high frequency band. For example, if the value obtained by dividing the power in the low frequency band by the power in the high frequency band is 4 or more, the parasympathetic nerve is judged dominant and the detected person is judged to be relaxed; otherwise, the sympathetic nerve is judged dominant and the detected person is judged to be excited.
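  • The band-power decision above can be sketched as follows; this uses a plain FFT power spectrum in place of the wavelet transform described in the text, and while the 60 Hz sampling rate, band edges, and the threshold of 4 come from the description, the signal and function names are illustrative.

```python
import numpy as np

# Sketch of the relaxed/excited decision: compare power in the low band
# (0.04-0.15 Hz) against the high band (0.16-0.5 Hz) of the pupil-diameter
# time series. FFT band power is used here as a stand-in for the wavelet
# analysis described in the text.

def lf_hf_state(pupil_diam, fs=60.0):
    x = np.asarray(pupil_diam) - np.mean(pupil_diam)   # remove mean diameter
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2
    lf = power[(freqs >= 0.04) & (freqs <= 0.15)].sum()  # low frequency band
    hf = power[(freqs >= 0.16) & (freqs <= 0.5)].sum()   # high frequency band
    ratio = lf / hf if hf > 0 else float("inf")
    return "relaxed" if ratio >= 4 else "excited"        # threshold of 4

t = np.arange(0, 120, 1.0 / 60.0)                # two minutes at 60 Hz
slow = 4.0 + 0.5 * np.sin(2 * np.pi * 0.1 * t)   # LF-dominated diameter (mm)
fast = 4.0 + 0.5 * np.sin(2 * np.pi * 0.3 * t)   # HF-dominated diameter (mm)
print(lf_hf_state(slow), lf_hf_state(fast))
```

A wavelet transform would additionally localize the bands in time; the band-power ratio and the classification rule are the same.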
  • As described above, the pupil diameter measuring apparatus 300 has a simple configuration and can be reduced in size.
  • In addition, the pupil diameter can be obtained accurately using triangulation.
  • With the pupil state detection device 400, it is possible to provide a small, low-cost device that objectively and quantitatively evaluates the mental state of the detected person 14 in a free posture, without restraining the body.
  • the frequency band of the wavelet transform may be other than 0.04 to 0.5 Hz as long as the psychological state can be distinguished, and the frequency dividing the low frequency band and the high frequency band may be other than 0.15 Hz.
  • Fourier transform or other methods may be used.
  • In this example, the image pickup device 13 was a CMOS sensor with 1920 horizontal and 1080 vertical pixels, and the pixel pitch in the X and Y directions was 3 µm. Other pixel counts may be used depending on the required image definition.
  • The image update frequency of the image sensor 13 was 60 times per second; that is, the pupil diameter data are calculated by the pupil diameter calculation means 21b as time-series data every 1/60 second, although other time intervals may be used.
  • Under these conditions, the measurement resolution of the pupil 14a in the X and Y directions at an imaging distance (Z coordinate) of 500 mm is 0.1 mm. In the Z direction, the imaging distance could be obtained with a resolution of 1 mm.
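  • As a back-of-envelope check of the 0.1 mm figure: for a thin lens, the lateral distance on the object corresponding to one pixel is the pixel pitch scaled by Z/f. The focal length below is an assumed value chosen for illustration, since the text does not give one.

```python
# Back-of-envelope check of the stated X/Y resolution. The pixel pitch and
# imaging distance come from the text; the focal length is a hypothetical
# assumption, not a value given in the publication.

pixel_pitch_mm = 0.003   # 3 um pitch, from the text
z_mm = 500.0             # imaging distance (Z coordinate), from the text
f_mm = 15.0              # assumed focal length, for illustration only

# One pixel on the sensor spans pitch * Z / f on the object plane.
lateral_resolution_mm = pixel_pitch_mm * z_mm / f_mm
print(round(lateral_resolution_mm, 3))   # 0.1 mm, matching the stated value
```

With these assumed optics, one 3 µm pixel at 500 mm corresponds to 0.1 mm on the face, consistent with the resolution stated above.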
  • The light source used was an LED (light-emitting diode) with a wavelength of 850 nm, but light of a different wavelength may be used as long as the same effect is obtained.
  • A sensor for measuring ambient light may also be provided to correct for the pupillary light reflex.
  • FIG. 13 is a top view showing the configuration of the pupil imaging device 500 according to this embodiment.
  • the pupil imaging apparatus 500 of the present embodiment is different from the second embodiment in that it includes a second light source 42 and a second imaging means 49.
  • the pupil imaging device 500 of the fourth embodiment includes a first light source 41, a second light source 42, a wavelength selection unit 43, a first imaging element 44, a second imaging element 45, The first imaging means 48 and the second imaging means 49 are provided.
  • the first light source 41 emits light having a first wavelength.
  • the second light source 42 emits light of the second wavelength from a position different from that of the first light source 41 at a different angle.
  • the wavelength selection unit 43 selectively transmits the light having the first wavelength and the light having the second wavelength.
  • The first imaging device 44 receives, via the wavelength selection means 43, the light of the first wavelength at the first viewpoint 46 and the second viewpoint 47 from at least a part of the face including the one or more pupils 14a of the detected person 14.
  • The second imaging element 45 receives, via the wavelength selection means 43, the light of the second wavelength at the first viewpoint 46 from at least a part of the face including the one or more pupils 14a of the detected person 14.
  • The first image forming unit 48 receives, via the wavelength selection means 43, the light from the detected person 14 travelling substantially parallel to the optical axes A and B when the light from the first light source 41 irradiates the detected person 14. The first imaging means 48 then forms, on the first image sensor 44 and via the wavelength selection means 43, images of the light of the first wavelength at each of the first viewpoint 46 and the second viewpoint 47 from at least a part of the face including the one or more pupils 14a of the irradiated detected person 14.
  • The second imaging unit 49 forms, on the second image sensor 45 and via the wavelength selection unit 43, an image of the light of the second wavelength at the first viewpoint from at least a part of the face including the one or more pupils of the detected person.
  • The first viewpoint 46 and the second viewpoint 47 are on the optical axis A and the optical axis B, two optical axes that are substantially parallel to the direction in which the first light source 41 emits light and parallel to each other.
  • the first light source 41 emits light having a first wavelength.
  • the light of the first wavelength is irradiated to the detected person 14 via the wavelength selection means 43.
  • the second light source 42 emits light having the second wavelength at a different angle from a position different from that of the first light source 41.
  • the light of the second wavelength is irradiated to the person to be detected 14.
  • The light of the first wavelength from at least a part of the face including the one or more pupils 14a of the detected person 14, irradiated by the first light source 41, enters the first imaging means 48 via the wavelength selection means 43.
  • The incident light of the first wavelength is imaged on the first image sensor 44 by the first imaging means 48.
  • Similarly, the light of the second wavelength from at least a part of the face including one pupil 14a of the detected person 14 enters the second imaging unit 49 via the wavelength selection unit 43.
  • The incident light of the second wavelength is imaged on the second image sensor 45 by the second imaging means 49.
  • FIG. 14 is a diagram illustrating an example of a specific configuration of the pupil imaging device 500.
  • FIG. 15A is a diagram showing an optical system of the optical path of the first wavelength from the first light source 41.
  • FIG. 15B is a diagram showing an optical system of the optical path of the second wavelength from the second light source 42.
  • The configuration of FIG. 15A is an optical system having the same optical action as that of the third embodiment; as in FIG. 9A, the face images of the detected person 14 corresponding to the optical axis A and optical axis B directions are captured on the first image sensor 44 in an overlapping manner.
  • The first imaging means 48 includes a first wavelength selection mirror 48a, a second wavelength selection mirror 48b, a third wavelength selection mirror 48c, a first mirror 48d, a second mirror 48e, a half mirror 48f, and a lens 48g.
  • the second imaging means 49 includes a first wavelength selection mirror 48a, a mirror 49a, a second wavelength selection mirror 48b, a lens 48g, and a third wavelength selection mirror 48c.
  • The first wavelength selection mirror 48a, the second wavelength selection mirror 48b, and the third wavelength selection mirror 48c selectively reflect the light of the first wavelength emitted from the first light source 41 and selectively transmit the light of the second wavelength emitted from the second light source 42.
  • the light emitted from the first light source 41 enters the half mirror 48f.
  • Light incident on the half mirror 48f is split into light that passes through the half mirror 48f and travels toward the first wavelength selection mirror 48a, and light that is reflected by the half mirror 48f and travels toward the second mirror 48e.
  • the light reflected by the half mirror 48f is reflected by the second mirror 48e, reflected by the first mirror 48d, and travels in the direction of the person to be detected 14 via the wavelength selection means.
  • The light transmitted through the half mirror 48f is reflected by the first wavelength selection mirror 48a and travels toward the detected person 14 through the wavelength selection means. That is, the light transmitted through the half mirror 48f is parallel to the light reflected by the half mirror 48f and is incident on a different position of the detected person 14.
  • The second light source 42 emits light of the second wavelength toward the detected person 14 from a position different from that of the first light source 41.
  • Next, image formation on the first image sensor 44 and the second image sensor 45 will be described.
  • Light from at least a part of the face including one or more pupils 14a of the detected person 14 travels parallel to the optical axis A and enters the first wavelength selection mirror 48a via the wavelength selection means 43. Of the light that has entered via the wavelength selection means 43, the light of the first wavelength is reflected by the first wavelength selection mirror 48a.
  • The light of the first wavelength is then partially reflected by the half mirror 48f, reflected by the second wavelength selection mirror 48b, incident on the lens 48g, reflected by the third wavelength selection mirror 48c, and imaged on the first image sensor 44.
  • Similarly, light from at least a part of the face including one or more pupils 14a of the detected person 14 enters the first mirror 48d via the wavelength selection means 43 in parallel with the optical axis B, and of this light the light of the first wavelength is reflected by the first mirror 48d. It is further reflected by the second mirror 48e, partially transmitted through the half mirror 48f, and thereafter imaged on the first image sensor 44 through the same path as the light incident parallel to the optical axis A. That is, images of the detected person 14 from viewpoints in both the optical axis A and optical axis B directions are formed on the first image sensor 44.
  • For the pupil 14a, the optical axes (optical axis A and optical axis B) along which the light emitted from the first light source 41 irradiates the detected person 14 and the optical axes (optical axis A and optical axis B) along which the light reflected from the detected person 14 enters the first image sensor 44 are coaxial, so the fundus reflection light is imaged. Therefore, a so-called bright pupil image is captured.
  • Light from at least a part of the face including one or more pupils 14a of the detected person 14 enters the wavelength selection means 43. Of this light, the light of the second wavelength is transmitted through the first wavelength selection mirror 48a. The transmitted light is reflected by the mirror 49a, transmitted through the second wavelength selection mirror 48b, enters the lens 48g, passes through the third wavelength selection mirror 48c, and forms an image on the second image sensor 45. That is, unlike the first image sensor 44, only an image of at least a part of the face including one or more pupils 14a of the detected person 14 from the direction of the optical axis A is formed on the second image sensor 45.
  • For the pupil 14a, the optical axis along which the light emitted from the second light source 42 irradiates the detected person 14 and the optical axes (optical axis A and optical axis B) along which the light reflected from the detected person 14 enters the image sensors are non-coaxial, so the fundus reflection light is not imaged. For this reason, a so-called dark pupil image is captured.
  • Note that the light of the second wavelength from the direction of the optical axis B is transmitted through the second wavelength selection mirror 48b in FIG. 15B.
  • In the pupil imaging device 500 according to the present embodiment, one of the light sources of different wavelengths is arranged coaxially with the imaging optical axes and the other is arranged at a non-coaxial position and angle, and the image of the pupil 14a is formed on a different imaging element for each wavelength, so that both a bright pupil image and a dark pupil image can be captured.
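As an aside for implementers, the bright pupil/dark pupil pair lends itself to simple image differencing: only the fundus-reflection regions differ between the two images, so subtracting the dark pupil image from the bright pupil image isolates the pupils. This is also the principle behind the image difference step (S103) described in this specification. The following is a minimal illustrative sketch in Python/NumPy with synthetic data; the array sizes and the threshold value are assumptions for the example, not values from the patent.

```python
import numpy as np

def extract_pupil_mask(bright, dark, threshold=50):
    """Subtract the dark-pupil image from the bright-pupil image and
    binarize: only the fundus-reflection (pupil) regions remain."""
    diff = bright.astype(np.int32) - dark.astype(np.int32)
    return diff > threshold

# Synthetic example: a face patch that is identical in both images,
# except the pupil is bright (fundus reflection) only under coaxial light.
face = np.full((8, 8), 100, dtype=np.uint8)
bright = face.copy()
bright[3:5, 3:5] = 220   # bright pupil
dark = face.copy()
dark[3:5, 3:5] = 30      # dark pupil

mask = extract_pupil_mask(bright, dark)
print(mask.sum())  # 4 pixels flagged as pupil
```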
  • Since the light incident from both the optical axis A and the optical axis B is imaged on the first image sensor 44, it is preferable to use a half mirror 48f having a transmittance of 50% (a reflectance of 50%) at the first wavelength.
  • For the first wavelength selection mirror 48a, the second wavelength selection mirror 48b, and the third wavelength selection mirror 48c, an optically equivalent configuration such as the prism body 52 shown in FIG. 15C may be used. The prism body 52 includes a total reflection surface 50 and a wavelength selection film 51. In this case, the prism body 52 is arranged so that its total reflection surface 50 serves as the surface of the mirror described above. The other mirrors may likewise be replaced with optically equivalent configurations.
  • FIG. 16 is a top view showing a pupil diameter measuring apparatus 600 of this embodiment.
  • The pupil diameter measuring apparatus 600 includes pupil diameter estimating means 53 for estimating the pupil diameter by triangulation, using the region in which the pupil 14a is imaged in the single image formed on the first image sensor 44 and the region in which the same pupil 14a is imaged in the single image formed on the second image sensor 45.
  • the pupil diameter estimating means 53 includes a pupil extracting means 53a and a pupil diameter calculating means 53b as shown in FIG.
  • The pupil extracting means 53a extracts the regions in which the pupil 14a is imaged in the image formed on the first image sensor 44 and in which the same pupil 14a is imaged in the image formed on the second image sensor 45. Usually, since there are two pupils, a total of four regions are extracted.
  • the pupil diameter calculating means 53b calculates the three-dimensional coordinates and dimensions of the pupil 14a from the extracted four regions.
  • The flow from the pupil extracting means 53a extracting regions from the single image formed on the first image sensor 44 and the single image formed on the second image sensor 45, until the pupil diameter calculating means 53b calculates the three-dimensional coordinates and dimensions of one or more pupils, will be described with reference to the flowchart of FIG. 18.
  • the pupil extracting means 53a captures the respective images formed on the first image sensor 44 and the second image sensor 45 (S102; image capture).
  • the pupil extracting means 53a calculates the difference between the image formed on the first image sensor 44 and the image formed on the second image sensor 45 (S103; image difference).
  • the pupil extracting means 53a performs binarization processing on the image after the difference calculation, and extracts two regions in which the pupil 14a is imaged (S104; optical axis A image pupil partial extraction).
  • The pupil diameter calculating means 53b acquires the coordinates (X and Y coordinates) in the imaging element plane around the pupil 14a from the two extracted regions, and calculates at least one pupil diameter (S105; optical axis A image pupil diameter calculation).
  • the pupil diameter calculating means 53b sets a differential filter from the obtained pupil diameter (S106; differential filter setting). Further, the pupil diameter calculating means 53b applies the above-described differential filter to the image formed on the first image sensor 44 (S107; image differential processing).
  • the pupil extracting unit 53a performs binarization processing on the image to which the above-described image differentiation processing is applied, extracts four regions corresponding to the pupil portion 22 in the image, and outputs them to the pupil diameter calculating unit 53b (S108; Pupil part extraction).
  • the pupil diameter calculating means 53b calculates the X and Y coordinates in the first imaging element 44 plane of the four regions corresponding to the pupil portion 22 in the image (S109; coordinate acquisition).
  • The pupil diameter calculating means 53b obtains, by triangulation using the calculation methods shown in Equations 1 to 3, the distance (Z coordinate) in the direction perpendicular to the first image sensor 44 between the first image sensor 44 and the real position of the pupil portion 22 imaged on it (S110; acquisition of three-dimensional coordinates of the pupil).
  • the pupil diameter calculating means 53b calculates the pupil diameters of the four pupil portions 22 from the obtained three-dimensional coordinates (S111; pupil diameter calculation), and records the calculated pupil diameter (S112; pupil diameter recording).
  • In the flowchart of FIG. 18, the process returns to step S101 after the process of S112 is completed.
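Equations 1 to 3 are not reproduced in this excerpt; under the parallel-axis geometry described above, the step of acquiring the Z coordinate (S110) amounts to standard stereo triangulation from the disparity between the optical axis A and optical axis B views of the same pupil. The sketch below is a hedged illustration of that computation in Python: the baseline, focal length, and pixel coordinates are invented for the example and are not values from the patent.

```python
def triangulate_z(x_a, x_b, baseline_mm, focal_px):
    """Distance Z from the disparity between the pupil's x coordinates
    as seen along optical axis A and optical axis B (parallel axes)."""
    disparity = abs(x_a - x_b)                  # pixels
    return focal_px * baseline_mm / disparity   # mm

def pupil_diameter_mm(d_px, z_mm, focal_px):
    """Convert a pupil diameter measured in pixels to millimetres
    using the pinhole camera model at distance Z."""
    return d_px * z_mm / focal_px

# Illustrative numbers only (not from the patent):
z = triangulate_z(x_a=320.0, x_b=280.0, baseline_mm=60.0, focal_px=800.0)
d = pupil_diameter_mm(d_px=4.0, z_mm=z, focal_px=800.0)
print(round(z, 1), round(d, 2))  # 1200.0 6.0
```

The same conversion applied to all four extracted regions yields the three-dimensional coordinates and dimensions recorded in S111 and S112.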
  • FIG. 19A shows the intensity distribution of an image captured by the first image sensor 44, and FIG. 19B shows the intensity distribution of an image captured by the second image sensor 45.
  • By taking the difference between FIG. 19A, which is an image of a bright pupil, and FIG. 19B, which is an image of a dark pupil, only the pupil portion 22 as viewed in the direction of the optical axis A can be extracted, as shown in FIG. 19C.
  • In S105, the right and left pupil diameters DR and DL of the pupil portion 22 in the optical axis A direction can be obtained from the extracted pupil portion 22.
  • the minimum coordinate of the right eye image in the optical axis A direction is recorded as XARmin and the maximum coordinate is recorded as XARmax, as in FIG. 9C.
  • the minimum coordinate of the left eye image in the direction of the optical axis A is recorded as XALmin, and the maximum coordinate is recorded as XALmax.
  • the minimum coordinate of the right eye image in the optical axis B direction is recorded as XBRmin, and the maximum coordinate is recorded as XBRmax.
  • the minimum coordinate of the left eye image in the optical axis B direction is recorded as XBLmin, and the maximum coordinate as XBLmax.
  • the Y component can be recorded.
  • FIG. 20 is a diagram for explaining the operation of the feature extraction filter.
  • A second-order differential filter represented by Equation 4 is exemplified as the feature extraction filter.
  • the differential interval DR is the right pupil diameter of the pupil portion in the optical axis A direction described above.
  • any one of the right and left pupil diameters DR and DL of the pupil portion 22 in the direction of the optical axis A is applied to Equation 4.
  • In S107, the image of FIG. 20A is subjected to the second-order differential filter of FIG. 20B, and in S108 the binarization process is performed, so that features having the diameter DR can be selectively extracted, as shown in FIG. 20C.
  • the pupil portion 22 when viewed in the direction of the optical axis B can also be extracted.
  • the extracted pupil portion 22 is output to the pupil diameter calculating means 53b.
  • I′(x, y) = 2·I(x, y) − I(x − DR, y) − I(x + DR, y)    (Equation 4)
  • Even when the imaging distance varies and the pupil image becomes larger, as shown in FIG. 20E for example, the value of DR can always be obtained from the extracted image of the pupil 14a in the direction of the optical axis A. By using this DR value as the differential interval, the second-order differential filter can be optimized as shown in FIG. 20F, and the pupil 14a can be selectively extracted as shown in FIG. 20G.
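Equation 4 can be read as a one-dimensional second-derivative filter whose tap spacing equals the measured diameter DR, so it responds most strongly to features of exactly that width. A minimal sketch in Python/NumPy; the image content and DR value are illustrative assumptions, and the border handling here simply omits out-of-range taps.

```python
import numpy as np

def second_order_filter(img, dr):
    """I'(x, y) = 2*I(x, y) - I(x - DR, y) - I(x + DR, y) (Equation 4),
    applied along x (columns), with out-of-range taps treated as zero."""
    img = img.astype(np.int32)
    out = 2 * img.copy()
    out[:, dr:] -= img[:, :-dr]     # subtract I(x - DR, y)
    out[:, :-dr] -= img[:, dr:]     # subtract I(x + DR, y)
    return out

# A bright band of width DR = 3 on a dark background responds strongly
# at its centre, because both shifted taps land on the background.
img = np.full((1, 11), 20, dtype=np.uint8)
img[0, 4:7] = 100               # 3-pixel-wide bright "pupil"
out = second_order_filter(img, 3)
print(int(out[0, 5]))  # 160
```

Binarizing the filter output with a suitable threshold then yields the extracted pupil regions of S108.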
  • The pupil diameter measuring apparatus 600 may further include pupil state determining means 54 as shown in FIG. FIG. 21 shows a flowchart for the case where the pupil state determining means 54 is provided. The psychological state and physiological state of the detected person 14 are determined from the value of the pupil diameter calculated (S111; pupil diameter calculation) and recorded (S112; pupil diameter recording) in FIG. 18, and from the result of the frequency analysis (S8 in FIG. 12) shown in the third embodiment. As a method for determining the psychological state and physiological state of the detected person 14, as described in the third embodiment, the frequency band of the wavelet transform is first divided into a low frequency band of 0.04 to 0.15 Hz and a high frequency band of 0.16 to 0.5 Hz; the psychological state can then be determined from the amplitude ratio between the low-frequency-band amplitude and the high-frequency-band amplitude. When the pupil state determining means 54 is provided, the flowchart of FIG. 21 returns to step S201 after the processing of S211 is completed.
  • the steps after S212 may be performed in parallel with the processing of S201 to S211 or may be performed separately.
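The band-splitting step can be illustrated numerically. The patent specifies a wavelet transform; the sketch below instead uses a plain FFT to compare mean band amplitudes, which only approximates that analysis, and the sampling rate and synthetic pupil-diameter trace are assumptions for the example.

```python
import numpy as np

def band_amplitude(signal, fs, f_lo, f_hi):
    """Mean spectral amplitude of `signal` between f_lo and f_hi (Hz)."""
    spec = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return spec[band].mean()

fs = 4.0                          # assumed 4 Hz pupil-diameter sampling
t = np.arange(0, 200, 1.0 / fs)   # 200 s record
# Synthetic pupil-diameter trace: strong 0.1 Hz (low-band) component,
# weaker 0.3 Hz (high-band) component.
diam = 4.0 + 0.5 * np.sin(2 * np.pi * 0.1 * t) \
           + 0.1 * np.sin(2 * np.pi * 0.3 * t)

lo = band_amplitude(diam, fs, 0.04, 0.15)
hi = band_amplitude(diam, fs, 0.16, 0.5)
ratio = lo / hi
print(ratio > 1.0)  # low-frequency dominance -> True
```

The amplitude ratio computed this way plays the role of the low/high band comparison from which the psychological state is judged.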
  • In the pupil diameter measuring apparatus 600, even when the difference in intensity level between the face portion 23 and the pupil portion 22 is small, an accurate pupil diameter can be calculated by the binarization processing. Moreover, the pupil state detection apparatus 700 in this embodiment can provide a small, low-cost apparatus that objectively and quantitatively evaluates the mental state of the detected person 14 in a free posture, without restraining the body.
  • The feature extraction filter is not limited to the second-order differential filter; a filter having the two-dimensional distribution represented by Equation 5 may be used. In that case, the filtering process is as shown in FIGS. 20D and 20H.
  • I′(x, y) = 4·I(x, y) − I(x − D, y) − I(x + D, y) − I(x, y − D) − I(x, y + D)    (Equation 5)
  • the left pupil diameter DL may be used, or the right and left pupil diameters DR and DL may be used alternately.
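Equation 5 adds taps offset by D along both x and y, so the filter responds to roughly circular features of diameter D rather than only to horizontal extent. A minimal sketch in Python/NumPy; the image content and D value are illustrative assumptions, and out-of-range taps are simply omitted at the borders.

```python
import numpy as np

def cross_filter(img, d):
    """I'(x, y) = 4*I(x, y) - I(x-D, y) - I(x+D, y) - I(x, y-D) - I(x, y+D)
    (Equation 5), with out-of-range taps treated as zero."""
    img = img.astype(np.int32)
    out = 4 * img.copy()
    out[:, d:] -= img[:, :-d]   # subtract I(x - D, y)
    out[:, :-d] -= img[:, d:]   # subtract I(x + D, y)
    out[d:, :] -= img[:-d, :]   # subtract I(x, y - D)
    out[:-d, :] -= img[d:, :]   # subtract I(x, y + D)
    return out

# A bright square of size D = 3 responds strongly at its centre, since
# all four shifted taps fall on the darker background.
img = np.full((9, 9), 20, dtype=np.uint8)
img[3:6, 3:6] = 100             # bright 3x3 "pupil"
out = cross_filter(img, 3)
print(int(out[4, 4]))  # 320
```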
  • An LED having a wavelength of 850 nm was used as the first light source 41, and an LED having a wavelength of 940 nm was used as the second light source 42. However, the two light sources may have other wavelengths as long as the same effect can be obtained.
  • As the wavelength selection filter, a dichroic mirror using a dielectric multilayer film can be used, but other known wavelength selection filters may be used.
  • In the above description, the optical axis A and the optical axis B are parallel, but convergence imaging, in which the angle between the optical axis A and the optical axis B is changed by changing the angle of each mirror, may also be used.
  • FIG. 22 is a top view showing a configuration of a pupil state detection / display apparatus 800 according to the present embodiment.
  • The pupil state detection display device 800 of the present embodiment differs from the third and fifth embodiments in that a display device 61 is provided between the detected person 14 and the wavelength selection means 43. The other configurations and configuration variations are the same as those in the third and fifth embodiments.
  • FIG. 22A shows a case where the display device 61 is an optical filter 61a; FIG. 22B, a case where it comprises an optical filter 61a, a display unit 61b, and an information processing device 61c; and FIG. 22C, a case where it comprises an optical filter 61a, a display unit 61b, and a half mirror 61d.
  • the pupil state detection device 101 is the pupil state detection device 400 or the pupil state detection device 700 of the third and fifth embodiments of the present invention.
  • If the optical filter 61a in FIG. 22A totally reflects visible light and totally transmits infrared light, it looks like a mirror when viewed from the detected person 14. Further, if the optical filter 61a is made to absorb visible light and transmit infrared light, it appears as a black plane when viewed from the detected person 14.
  • FIG. 22B shows a configuration in which the result determined by the information processing device 61c is displayed on the display unit 61b based on the determination result of the pupil state detection device 101.
  • For example, the detected person 14 may be notified of the fatigue state and the mental state, or a visual stimulus for measuring a reaction of the detected person 14, such as the pupillary light reflex, may be given.
  • As described in the third embodiment, the frequency band of the wavelet transform is divided into a low frequency band of 0.04 to 0.15 Hz and a high frequency band of 0.16 to 0.5 Hz. The psychological state can then be determined from the amplitude ratio between the low-frequency-band amplitude and the high-frequency-band amplitude.
  • FIG. 22C shows a configuration in which a half mirror 61d is added to the configuration shown in FIG. 22B. When nothing is displayed on the display unit 61b, the half mirror 61d acts as a mirror; when something is displayed, the mirror image of the detected person 14 and the displayed image can be optically fused.
  • With these configurations, the psychological state of the detected person 14 can be evaluated more accurately, without the detected person feeling that a measurement is being made. Furthermore, in the configuration of FIG. 22B, a visual stimulus can be given to the detected person 14 according to the psychological and physiological states evaluated by the pupil state detection device 400 or the pupil state detection device 700, which can further contribute to stress management and the treatment of mental illness. In the case of the configuration of FIG. 22C, the effects of both FIG. 22A and FIG. 22B described above can be obtained, depending on the display content.
  • a liquid crystal display can be used as the display unit 61b.
  • an organic EL display or other display may be used.
  • EL is an abbreviation for “Electroluminescence”.
  • the display unit 61b may be a stereoscopic display using a lenticular lens or a parallax barrier, in addition to a normal two-dimensional display.
  • In that case, a parallax image appropriate for observing the stereoscopic display may be displayed using the left and right pupil positions obtained by the pupil state detection device 101, the convergence angle obtained from the pupil positions, and the focus adjustment state of the eyeball obtained from the pupil diameter.
  • An imaging apparatus comprising: one imaging element; and imaging means configured to form, on the one imaging element, a plurality of images at different viewpoints of a common portion of a subject irradiated with light from a light source.
  • The imaging apparatus according to appendix 1, wherein the at least two different viewpoints are a first viewpoint and a second viewpoint, and the imaging means reflects the light from the subject at the first viewpoint by a mirror into the lens and reflects the light from the subject at the second viewpoint by a mirror into the lens, so as to form images of the subject at the first viewpoint and the second viewpoint at the same time on the one imaging element.
  • The imaging apparatus according to appendix 1 or 2, wherein the arbitrary light source includes one light source, a dividing unit that divides the light emitted from the one light source, and mirrors that make the light divided by the dividing unit incident on the subject from two directions that are parallel or converge at an angle.
  • the dividing unit is a half mirror.
  • A pupil imaging device comprising: a light source that emits light of a predetermined wavelength; wavelength selection means for selectively transmitting light of the predetermined wavelength; one image sensor; and imaging means for forming, on the one image sensor via the wavelength selection means, respective images at different viewpoints of at least a part of the face including one or more pupils of a detected person irradiated with light of the predetermined wavelength from the light source via the wavelength selection means; wherein the different viewpoints are on different optical axes substantially parallel to the direction in which the light source emits light, and the different optical axes are arranged parallel to each other or converging.
  • A pupil diameter measuring device comprising: the pupil imaging device according to appendix 5; and pupil diameter estimating means for estimating the pupil diameter by triangulation using at least two regions corresponding to the one or more pupils in one image formed on the one image sensor.
  • Appendix 8: A pupil state detection device comprising: the pupil diameter measuring device according to appendix 6 or 7; and pupil state determining means for determining the psychological state and physiological state of the detected person from the three-dimensional coordinates and dimensions of the one or more pupils output from the pupil diameter calculating means.
  • A pupil imaging device comprising: a first light source that emits light of a first wavelength;
  • a second light source that emits light of a second wavelength at an angle different from that of the first light source;
  • wavelength selection means for selectively transmitting the light of the first wavelength and the light of the second wavelength;
  • a first image sensor; a second image sensor;
  • first imaging means, whose optical axes are coaxial, for forming, on the first image sensor via the wavelength selection means, images by the light of the first wavelength at a first viewpoint and a second viewpoint of at least a part of the face including one or more pupils of a detected person irradiated with the light of the first wavelength via the wavelength selection means; and
  • second imaging means for forming, on the second image sensor via the wavelength selection means, an image by the light of the second wavelength at the first viewpoint of at least a part of the face including at least one pupil of the detected person irradiated with the light of the second wavelength;
  • wherein the first viewpoint and the second viewpoint are on two optical axes substantially parallel to a direction in which the first light source emits light, and the two optical axes are arranged parallel to each other or converging.
  • Appendix 10: A pupil diameter measuring device comprising: the pupil imaging device according to appendix 9; and pupil diameter estimating means for estimating a pupil diameter by triangulation using at least two regions corresponding to the one or more pupils in one image formed on the first image sensor and one image formed on the second image sensor.
  • The pupil diameter measuring device according to appendix 10, wherein the pupil diameter estimating means comprises: pupil extracting means for extracting the regions of the one or more pupils from one image formed on the first image sensor and one image formed on the second image sensor; and pupil diameter calculating means for calculating and outputting the three-dimensional coordinates and dimensions of the one or more pupils.
  • The pupil imaging device according to appendix 9, comprising a first wavelength selection mirror that selectively reflects the light of the first wavelength from at least a part of the face including the one or more pupils at the first viewpoint, wherein the second imaging means reflects the light from at least a part of the face including the one or more pupils at the first viewpoint with a mirror and makes it incident on a lens so as to form an image on the second image sensor.
  • A pupil state detection display device comprising: the pupil imaging device according to appendix 15; and a display device further provided between the detected person and the wavelength selection means.
  • Appendix 17: A pupil imaging method comprising: emitting light of a predetermined wavelength from the light source to at least a part of the face including one or more pupils of the detected person; and forming, on one image sensor, images of the light from at least a part of the face including the one or more pupils, at least at two different viewpoints on respective ones of two optical axes substantially parallel to the direction in which the light is emitted.
  • A pupil imaging method comprising: emitting light of a first wavelength from a first light source to at least a part of the face including one or more pupils of a detected person;
  • emitting light of a second wavelength from a second light source, at a different angle from a position different from the first light source, to at least a part of the face including one or more pupils of the detected person;
  • causing the light of the first wavelength and the light of the second wavelength entering from at least a part of the face including one or more pupils of the detected person to enter first imaging means and second imaging means via wavelength selection means that selectively transmits the light of the first wavelength and the light of the second wavelength;
  • forming, by the first imaging means whose optical axes are coaxial, images of the light of the first wavelength of at least a part of the face including one or more pupils of the detected person at a first viewpoint and a second viewpoint on a first image sensor; and
  • forming, by the second imaging means, an image of the light of the second wavelength of at least a part of the face including at least one pupil of the detected person at the first viewpoint on a second image sensor.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Ophthalmology & Optometry (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • General Physics & Mathematics (AREA)
  • Eye Examination Apparatus (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

[Problem] To provide an imaging device which solves the problems of device size and cost increases. [Solution] This imaging device has one imaging element and image forming means for forming, on the one imaging element, a plurality of images from different perspectives of a shared site on an imaging subject irradiated by light from a light source.

Description

Imaging device, pupil imaging device, pupil diameter measuring device, pupil state detection device, and pupil imaging method

The present invention relates to an imaging device, a pupil imaging device, a pupil diameter measuring device, a pupil state detection device, and a pupil imaging method, and in particular to an imaging device, a pupil imaging device, a pupil diameter measuring device, a pupil state detection device, and a pupil imaging method having a single imaging element.

In order to measure accurate three-dimensional coordinates when an object moves back and forth, Patent Documents 1 to 3 describe methods of preparing two sets of imaging devices and measuring the position and the like based on triangulation.

JP 2007-144113 A
JP 2012-210394 A
JP H11-063927 A

However, the techniques described in Patent Documents 1 to 3 require at least two cameras, which increases the size and cost of the apparatus accordingly.

An object of the present invention is to provide an imaging apparatus that solves the above-described problems.

The imaging apparatus of the present invention includes one imaging element, and imaging means for forming, on the one imaging element, a plurality of images at different viewpoints of a common portion of a subject irradiated with light from a light source.

According to the present invention, since images suitable for triangulation can be captured with a single camera, an imaging apparatus that suppresses increases in size and cost can be provided.

A top view of the imaging device in the first embodiment.
A top view of the case where the imaging device in the first embodiment has an arbitrary light source.
A top view of the case where the imaging device in the first embodiment has arbitrary splitting means.
A top view of the case where the imaging device in the first embodiment has imaging means.
A top view of the pupil imaging device in the second embodiment.
A top view of the pupil diameter measurement device in the third embodiment.
A top view of the case where the pupil diameter measurement device in the third embodiment includes pupil extraction means and pupil diameter calculation means as the pupil diameter estimation means.
A flowchart showing, for the pupil diameter measurement device in the third embodiment, the flow from a single formed image to the calculation of the three-dimensional coordinates and dimensions of one or more pupils.
For the pupil diameter measurement device in the third embodiment: (a) an example of an image captured by the image sensor 13, (b) the intensity distribution of the image in the cross section along the line segment PQ in (a), and (c) an example of obtaining the circumference of the extracted pupil portion 22 and the coordinates of that circumference.
A diagram explaining the method of calculating the three-dimensional coordinates of a pupil in the third embodiment.
A top view of the pupil state detection device in the third embodiment.
A flowchart showing the processing when pupil state determination means is provided in the third embodiment.
A top view of the pupil imaging device in the third embodiment.
A diagram showing an example of a specific configuration of the pupil imaging device in the fourth embodiment.
A diagram showing the optical path of light of the first wavelength in the pupil imaging device in the fourth embodiment.
A diagram showing the optical path of light of the second wavelength in the pupil imaging device in the fourth embodiment.
A diagram showing the configuration of a prism body provided with a total reflection surface and a wavelength selection film.
A top view of the pupil diameter measurement device in the fifth embodiment.
A flowchart for the case where, in the fifth embodiment, the pupil diameter estimation means includes pupil extraction means and pupil diameter calculation means.
A flowchart showing the operation of the pupil diameter measurement device in the fifth embodiment.
For the pupil diameter measurement device in the fifth embodiment: (a) the intensity distribution of the image captured by the first image sensor 44 and (b) the intensity distribution of the image captured by the second image sensor 45.
A diagram explaining the action of the feature extraction filter of the pupil diameter measurement device in the fifth embodiment.
A diagram explaining the action of the feature extraction filter of the pupil diameter measurement device in the fifth embodiment.
A diagram explaining the action of the feature extraction filter of the pupil diameter measurement device in the fifth embodiment.
A diagram explaining the action of the feature extraction filter of the pupil diameter measurement device in the fifth embodiment.
A diagram explaining the action of the feature extraction filter of the pupil diameter measurement device in the fifth embodiment.
A diagram explaining the action of the feature extraction filter of the pupil diameter measurement device in the fifth embodiment.
A diagram explaining the action of the feature extraction filter of the pupil diameter measurement device in the fifth embodiment.
A diagram explaining the action of the feature extraction filter of the pupil diameter measurement device in the fifth embodiment.
A flowchart showing the operation of the pupil state detection device in the fifth embodiment.
A perspective view of the case where the pupil state detection display device in the sixth embodiment has an optical filter as the display device.
A perspective view of the case where the pupil state detection display device in the sixth embodiment has an optical filter 61a and a display unit 61b as the display device.
A perspective view of the case where the pupil state detection display device in the sixth embodiment has an optical filter 61a, a display unit 61b, and a half mirror 61d as the display device.

[First Embodiment] Preferred modes for carrying out the present invention are described below with reference to the drawings. The embodiments described below contain limitations that are technically preferable for carrying out the present invention, but the scope of the invention is not limited to them.

 [Description of Configuration] FIG. 1 is a top view of the imaging device 100 in this embodiment. As shown in FIG. 1, the imaging device 100 in this embodiment includes one light source 2a, a half mirror 2ba, mirrors 2bb to 2bd, one image sensor 5, and a lens 6b.

 The light source 2a emits light toward the half mirror 2ba. The half mirror 2ba splits the light from the light source 2a into light that passes through the half mirror 2ba and travels toward the mirror 2bb, and light that is reflected by the half mirror 2ba and travels toward the mirror 2bc. The mirror 2bc reflects the light from the half mirror 2ba toward the mirror 2bd. The mirrors 2bb and 2bd direct the light onto the subject 1 from mutually different positions. FIG. 1 shows the optical axis A and the optical axis B as the two optical axes along which light travels from the mirrors 2bb and 2bd toward the subject 1. In this embodiment the optical axes A and B are parallel, but they need not be parallel as long as they do not intersect and can illuminate different positions on the subject 1. Light from the subject 1 along the optical axis A is reflected by the mirror 2bb and the half mirror 2ba, enters the lens 6b, and forms an image on the image sensor 5 of the view on the optical axis A (for example, the first viewpoint 3). The mirrors 2bd and 2bc reflect light from the subject 1 along the optical axis B and direct it to the half mirror 2ba. The half mirror 2ba passes the light from the mirror 2bc to the lens 6b, which forms an image on the image sensor 5 of the view on the optical axis B (for example, the second viewpoint 4). Thus, light from the subject at at least two different viewpoints, the first viewpoint 3 and the second viewpoint 4, is imaged simultaneously on the image sensor 5 by the lens 6b.

 [Description of Operation] Next, the operation of the imaging device 100 in this embodiment is described. First, the path of the light from the light source 2a until it reaches the subject 1 is described. The light emitted from the light source 2a enters the half mirror 2ba, where it is split into light that passes through the half mirror 2ba and travels toward the mirror 2bb, and light that is reflected by the half mirror 2ba and travels toward the mirror 2bc. The light reflected toward the mirror 2bc is reflected by the mirror 2bc onto the mirror 2bd and strikes the subject 1 at a position different from, and parallel to, the light that passed through the half mirror 2ba toward the mirror 2bb.

 Next, the path of the light from the subject 1 until it forms an image on the image sensor 5 is described. Light from the subject 1 along the optical axis A is reflected by the mirror 2bb, then by the half mirror 2ba, and enters the lens 6b, so that the view on the optical axis A (for example, the first viewpoint 3) is imaged on the image sensor 5. Meanwhile, light from the subject 1 in the direction of the optical axis B is reflected by the mirror 2bd, then by the mirror 2bc, passes through the half mirror 2ba, and enters the lens 6b, so that the view on the optical axis B (for example, the second viewpoint 4) is imaged on the image sensor 5 together with the light from the first viewpoint 3. The images at the first viewpoint 3 and the second viewpoint 4 each include at least an image of the feature point 1a, a point common to both views of the subject 1. That is, at least the image of the feature point 1a, a common part of the subject 1, is formed on the image sensor 5.

 [Description of Effects] With the configuration of the imaging device 100 in this embodiment, light from the subject 1 at different viewpoints is imaged by the optical system onto a single image sensor 5, so an imaging device 100 with a simple configuration and a small size can be obtained. Moreover, with this configuration, the exact position of the feature point 1a can be obtained from the captured image by triangulation.

 In the above description, an example using one light source 2a, the half mirror 2ba, and the mirrors 2bb, 2bc, and 2bd was shown, but this embodiment is not necessarily limited to these components. As shown in FIG. 2, the light source 2a of FIG. 1 may be replaced by an arbitrary light source 2 of any position and angle, such as natural light or room lighting. As shown in FIG. 3, the half mirror 2ba of FIG. 1 may be replaced by arbitrary splitting means 2b.

 In the above description, light from the subject 1 at the first viewpoint 3 and the second viewpoint 4 is imaged on the image sensor 5, but it suffices to image light from the subject 1 at at least two different viewpoints. If light from two or more viewpoints is imaged, three-dimensional coordinates can be calculated by triangulation.

 A known device such as a CMOS (complementary metal oxide semiconductor) or CCD (charge-coupled device) sensor can be used as the image sensor 5.

 In FIG. 1, the optical path length between the half mirror 2ba and the mirror 2bb is L, and the optical path lengths between the half mirror 2ba and the mirror 2bc and between the mirror 2bc and the mirror 2bd are each L/2. That is, the sum of the optical path length between the half mirror 2ba and the mirror 2bc and that between the mirror 2bc and the mirror 2bd is L. The components are thus arranged so that the light travels equal optical path lengths along the two routes: from the light source 2a to the mirror 2bb and out along the optical axis A toward the subject 1, and from the light source 2a to the mirror 2bc, from the mirror 2bc to the mirror 2bd, and out along the optical axis B toward the subject 1. As a result, light from the subject 1 arriving from the direction of the optical axis A and light arriving from the direction of the optical axis B are imaged on the image sensor 5 with equal optical path lengths. Because each beam is imaged over the same optical path length, accurate three-dimensional coordinates can be calculated from the formed image by triangulation. The imaging device 100 in this embodiment does not itself perform the processing of obtaining three-dimensional coordinates from the formed image. Triangulation is described in detail in the description of the configuration and operation of the third embodiment below. As long as their sum is L, the optical path lengths between the half mirror 2ba and the mirror 2bc and between the mirror 2bc and the mirror 2bd may be set to arbitrary values.
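The equal-path-length condition above can be sketched numerically. The following is an illustrative check only, not part of the patented device; the function name and the values of L and s are placeholders.

```python
def path_lengths(L, s):
    """Optical path lengths of the two branches in FIG. 1, where s is the
    2ba-to-2bc length (0 <= s <= L) and the 2bc-to-2bd length is L - s.
    L and s are illustrative placeholders in arbitrary units."""
    path_a = L            # half mirror 2ba -> mirror 2bb
    path_b = s + (L - s)  # half mirror 2ba -> mirror 2bc -> mirror 2bd
    return path_a, path_b

# Any split s keeps the two branches equal, matching the condition that
# only the sum of the two partial path lengths must equal L.
a, b = path_lengths(100.0, 30.0)
assert a == b
print(a, b)
```

This makes concrete why the text allows the two partial lengths to take arbitrary values: only their sum enters the imaging condition.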

 The configuration for imaging the light from the subject 1 at the first viewpoint 3 and the second viewpoint 4 onto the image sensor 5 need not use exactly the mirrors, half mirror, and lens in the numbers described above. As shown in FIG. 4, any imaging means 6 consisting of an optical system of mirrors, half mirrors, lenses, and the like that can image the light from the subject 1 at the first viewpoint 3 and the second viewpoint 4 onto the image sensor 5 can be substituted. In the configuration shown in FIG. 3, the arbitrary light source 2 of FIG. 2 is built from an optical system including the light source 2a, the half mirror 2ba, and the mirror 2aa. If these optical components are arranged so that their optical paths are shared with the optical system constituting the imaging means 6, the imaging device 100 can be made smaller. Furthermore, since the internal configuration of the device becomes simpler, the imaging device 100 can be manufactured at lower cost.

 [Second Embodiment] Next, a second embodiment is described. FIG. 5 is a top view showing the configuration of the pupil imaging device 200 according to this embodiment.

 [Description of Configuration] As shown in FIG. 5, the pupil imaging device 200 of this embodiment has a light source 11, wavelength selection means 12, and imaging means 20. The light source 11 emits light of a predetermined wavelength. The wavelength selection means 12 selectively transmits the light of the predetermined wavelength emitted from the light source 11. The imaging means 20 forms, on a single image sensor 13 via the wavelength selection means 12, images from different viewpoints of at least a part of the face, including one or more pupils 14a, of the detection subject 14, who is irradiated through the wavelength selection means 12 with the light of the predetermined wavelength from the light source 11. Here, the different viewpoints lie on different optical axes substantially parallel to the direction in which the light source 11 emits light, and these optical axes are parallel to each other.

 The rest of the configuration and its variations are the same as in the first embodiment. That is, the pupil imaging device 200 of the second embodiment has the light source 11, the wavelength selection means 12, one image sensor 13, the first viewpoint 15, the second viewpoint 16, and the imaging means 20. In this embodiment, a configuration in which the imaging means 20 includes a half mirror 20a, mirrors 20b to 20d, and a lens 20e is described as an example.

 [Description of Operation] Next, the operation of the pupil imaging device 200 of this embodiment is described. First, the path of the light from the light source 11 until it irradiates the detection subject 14 is described. As shown in FIG. 5, the light emitted from the light source 11 enters the half mirror 20a, where it is split into light that passes through the half mirror 20a and travels toward the mirror 20b, and light that is reflected by the half mirror 20a and travels toward the mirror 20c. The light reflected toward the mirror 20c is reflected by the mirror 20c onto the mirror 20d and strikes the detection subject 14 at a position different from, and parallel to, the light that passed through the half mirror 20a toward the mirror 20b.

 Next, the path of the light from the detection subject 14 until it forms an image on the image sensor 13 is described. First, light from the detection subject 14 in the direction of the optical axis A is reflected by the mirror 20b, then by the half mirror 20a, and enters the lens 20e, so that the image at the first viewpoint 15 is formed on the image sensor 13. Meanwhile, light from the detection subject 14 arriving from the direction of the optical axis B is reflected by the mirror 20d, then by the mirror 20c, passes through the half mirror 20a, and enters the lens 20e, so that the image at the second viewpoint 16 is formed on the image sensor 13 together with the light from the first viewpoint 15. The light from the detection subject 14 at the first viewpoint 15 and the second viewpoint 16 includes, at each viewpoint, light from at least a part of the face including one or more pupils 14a. That is, an image of at least a part of the face including one or more pupils 14a of the detection subject 14 is formed on the image sensor 13. Since the optical axes A and B are substantially parallel to the light illuminating the detection subject 14, the pupil portions of the formed image capture the light reflected from the fundus of the eye.

 [Description of Effects] In the pupil imaging device 200 of this embodiment, light from at least a part of the face, including one or more pupils 14a, of the detection subject 14 at two different viewpoints is imaged by the optical system onto a single image sensor 13, so an imaging device 200 with a simple configuration and a small size can be obtained. Furthermore, such a device can capture an image of the pupil 14a from which the three-dimensional coordinates around the pupil and the pupil diameter can be obtained accurately by triangulation.

 Besides obtaining the three-dimensional coordinates around the pupil, the pupil imaging device 200 of this embodiment may also be used to calculate and measure characteristic quantities of a person, such as the intensity of the pupil image of the detection subject and the distribution of light reflected from the face.

 [Third Embodiment] By adding to the configuration of the second embodiment pupil diameter estimation means 21 that estimates the pupil diameter by triangulation using at least two regions corresponding to one or more pupils 14a in the single image formed on the image sensor 13, a pupil diameter measurement device 300 as shown in FIG. 6 can be obtained.

 As shown in FIG. 7, the pupil diameter estimation means 21 comprises pupil extraction means 21a and pupil diameter calculation means 21b. The pupil extraction means 21a extracts at least two regions corresponding to one or more pupils 14a from the single image formed on the image sensor 13. The pupil diameter calculation means 21b calculates the three-dimensional coordinates and dimensions of the one or more pupils 14a from those regions. Since there are normally two pupils 14a, a total of four regions are extracted by the pupil extraction means 21a.

 [Description of Configuration and Operation] Next, the configuration and operation that differ from the second embodiment are described, taking as an example the case where the pupil diameter estimation means 21 includes the pupil extraction means 21a and the pupil diameter calculation means 21b. From the single image formed on the one image sensor 13, the pupil extraction means 21a extracts at least two regions corresponding to one or more pupils 14a. From the extracted regions, the pupil diameter calculation means 21b calculates the three-dimensional coordinates and dimensions of the one or more pupils 14a. As noted above, since there are normally two pupils 14a, a total of four regions are extracted by the pupil extraction means 21a.

 The flow from the single image formed on the one image sensor 13, through extraction of the regions by the pupil extraction means 21a, to calculation of the three-dimensional coordinates and dimensions of the one or more pupils 14a by the pupil diameter calculation means 21b is described below with reference to the flowchart of FIG. 8.

 As shown in FIG. 8, the pupil extraction means 21a first captures the single image formed on the image sensor 13 (S2; image capture). The pupil extraction means 21a then binarizes the captured image, extracts the four regions corresponding to the pupils 14a in the image, and outputs them to the pupil diameter calculation means 21b (S3; pupil part extraction). From the four extracted regions, the pupil diameter calculation means 21b obtains the in-plane coordinates on the image sensor 13 (denoted X and Y coordinates) of the periphery of each pupil 14a in the image (S4; coordinate acquisition). In addition to the obtained X and Y coordinates, the pupil diameter calculation means 21b obtains by triangulation, using the calculation method given in Equations 1 to 3 below, the distance in the direction perpendicular to the image sensor 13 (denoted the Z coordinate) to the real object, including the pupil 14a, whose image is formed on the image sensor 13 (S5; acquisition of the three-dimensional coordinates of the pupil). The pupil diameter calculation means 21b then calculates the pupil diameter from the obtained three-dimensional coordinates (S6; pupil diameter calculation) and records the calculated pupil diameter (S7; pupil diameter recording). When the pupil diameter measurement device 300 captures images multiple times, for example in time series, the processing returns to step S1 after step S7 in the flowchart of FIG. 8 is completed.

 Next, the details of each step are described. The image captured in S2 of FIG. 8, that is, the image captured by the image sensor 13, is, for example, a facial image of the detection subject 14 as in FIG. 9(a). This image is the optical superposition of the facial images of the detection subject 14 seen from the directions of the optical axes A and B in FIG. 5. FIG. 9(b) shows the intensity distribution of the image in the cross section along the line segment PQ in FIG. 9(a); the pupil portions 22 of the detection subject 14 seen from the directions of the optical axes A and B are each captured with a high intensity value.

 Next, when the pupil extraction means 21a binarizes the image in S3 with the threshold shown in FIG. 9(b), only the pupil portions 22 of the detection subject 14 seen from the directions of the optical axes A and B are extracted, as shown in FIG. 9(c). In S4, the pupil extraction means 21a then obtains and records the coordinates of the circumferences of the extracted pupil portions 22 in FIG. 9(c).

 The recorded coordinates are, for example, as shown in FIG. 9(c). Of the X components of the coordinates recorded by the pupil extraction means 21a (along the direction of the line segment PQ in FIG. 9(a)), the following are recorded: the minimum coordinate XARmin and maximum coordinate XARmax of the right-eye image in the optical-axis-A view; the minimum coordinate XALmin and maximum coordinate XALmax of the left-eye image in the optical-axis-A view; the minimum coordinate XBRmin and maximum coordinate XBRmax of the right-eye image in the optical-axis-B view; and the minimum coordinate XBLmin and maximum coordinate XBLmax of the left-eye image in the optical-axis-B view. The Y coordinates can be recorded in the same way.
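The thresholding of S3 and the min/max coordinate recording of S4 can be sketched as follows. This is an illustrative one-dimensional sketch along a single scan line such as PQ; a real implementation would binarize the full two-dimensional image, and the profile values, threshold, and run positions below are all invented for illustration.

```python
def extract_pupil_runs(profile, threshold):
    """Binarize a 1-D intensity profile (like the cross section along the
    line segment PQ in FIG. 9(b)) and return (min_x, max_x) for each run of
    values above the threshold; each run is one pupil image seen from one
    optical axis, yielding coordinates such as XARmin and XARmax."""
    runs, start = [], None
    for x, value in enumerate(profile):
        if value > threshold and start is None:
            start = x
        elif value <= threshold and start is not None:
            runs.append((start, x - 1))
            start = None
    if start is not None:
        runs.append((start, len(profile) - 1))
    return runs

# Synthetic profile: four bright spots (right/left pupil seen along optical
# axes A and B) on a darker face background; all values are illustrative.
profile = [10] * 100
for lo, hi in [(20, 24), (30, 34), (60, 64), (70, 74)]:
    for x in range(lo, hi + 1):
        profile[x] = 200

runs = extract_pupil_runs(profile, threshold=100)
print(runs)  # -> [(20, 24), (30, 34), (60, 64), (70, 74)]
```

The four returned (min, max) pairs correspond to the four recorded coordinate pairs per scan line; the eye-level bright pupils arise because the fundus reflection is captured along optical axes nearly parallel to the illumination.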

 Next, the method of calculating the three-dimensional coordinates of the pupil 14a is described with a concrete example. FIG. 10 is a diagram explaining the operation of the pupil diameter calculation means 21b of the pupil diameter measurement device 300 of this embodiment. In FIG. 10, the position of the principal point of the lens 20e is taken as the origin, and the direction of the optical axis A of the lens 20e as the Z direction. The horizontal direction in the plane of FIG. 10, perpendicular to the optical axis A, is the X direction, and the depth direction of FIG. 10 is the Y direction. The optical axes A and B in FIG. 10 are the same as the optical axes A and B in FIG. 5; in other words, FIG. 10 re-draws the optical axes of FIG. 5 as straight lines, with the lens 20e and the image sensor 13 shown for each axis. The images seen from the respective optical axes in FIG. 5 become the images at the positions of the viewpoints A and B (corresponding to the first viewpoint 15 and the second viewpoint 16) on the image sensor 13 and the optically equivalent image sensor 13a in FIG. 10. Accordingly, the minimum X coordinate XARmin of the right end of the right-eye pupil at viewpoint A and the minimum X coordinate XBRmin of the right end of the right-eye pupil at viewpoint B, recorded by the pupil extraction means 21a, have the positional relationship shown in FIG. 10 with respect to the optical axes and the distance between them. The method of measuring the three-dimensional coordinates E(XRmin, YRmin, ZRmin) is described below, taking the right end of the right-eye pupil 14a shown in FIG. 10 as an example.

 In FIG. 10, the distance d between the optical axes A and B is L + L/2, from FIG. 5, and f is the focal length of the lens 20e. The lens 20e and the image sensor 13 for the optical axis B in FIG. 5 are represented in FIG. 10 as the optically equivalent lens 20ea and the optically equivalent image sensor 13a. The description below uses this optically equivalent diagram.

 In FIG. 10, the detected person 14, the optical axes A and B, and the viewpoints A and B are in the same relationship as in triangulation (stereo measurement), so the three-dimensional coordinates E(XRmin, YRmin, ZRmin) of the right end of the right-eye pupil 14a are given by Formulas 1 to 3.
[Formula 1]
ZRmin = f·d/(XBRmin − XARmin)
[Formula 2]
XRmin = ZRmin·XARmin/f
[Formula 3]
YRmin = ZRmin·YARmin/f
Similarly, the three-dimensional coordinates of the left end of the right-eye pupil, the right end of the left-eye pupil, and the left end of the left-eye pupil can be obtained. Furthermore, the diameters of the left and right pupils can be obtained from the distance between the left and right ends of the right-eye pupil and the distance between the left and right ends of the left-eye pupil. This makes it possible to detect the pupil diameter of the pupil 14a accurately even when the detected person 14 moves in the optical axis direction relative to the lens 20e (even when ZRmin varies).
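The calculation of Formulas 1 to 3 and the subsequent diameter computation can be sketched in code as follows (a minimal illustration; the function names and the sample sensor coordinates are hypothetical, and all lengths are assumed to be in consistent units, here millimeters):

```python
def triangulate(x_a, y_a, x_b, f, d):
    """Formulas 1-3: recover a 3D point from the sensor coordinates
    seen at viewpoint A (x_a, y_a) and viewpoint B (x_b)."""
    z = f * d / (x_b - x_a)   # Formula 1: depth from disparity
    x = z * x_a / f           # Formula 2
    y = z * y_a / f           # Formula 3
    return x, y, z

def pupil_diameter(p_right, p_left):
    """Euclidean distance between the two reconstructed pupil edges."""
    return sum((a - b) ** 2 for a, b in zip(p_right, p_left)) ** 0.5

# Hypothetical sensor-plane coordinates (mm), with f = 15 mm and d = 50 mm
# as in the embodiment described later in this section.
f, d = 15.0, 50.0
right_edge = triangulate(x_a=0.60, y_a=0.30, x_b=2.10, f=f, d=d)
left_edge = triangulate(x_a=0.48, y_a=0.30, x_b=1.98, f=f, d=d)
print(right_edge)  # depth Z = 15 * 50 / 1.5 = 500 mm
print(pupil_diameter(right_edge, left_edge))  # 4.0 mm
```

Both pupil edges reconstruct to the same depth Z, so the diameter reduces to the lateral separation of the two reconstructed points.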

 In the present embodiment, the pupil diameter measuring apparatus 300 has been described; by further including pupil state determining means 24, a pupil state detecting apparatus 400 can be obtained. Below, of the configuration of the pupil state detecting apparatus 400 having the pupil state determining means 24, the differences from the pupil diameter measuring apparatus 300 are described with reference to FIG. 11. The processing performed when the pupil state determining means 24 is provided is described with reference to the flowchart of FIG. 12.

 As shown in FIG. 11, the pupil state determining means 24 determines the psychological state and physiological state of the detected person 14 from the three-dimensional coordinates and dimensions of one or more pupils 14a output by the pupil diameter calculating means 21b.

 FIG. 12 shows a flowchart for the case where the pupil state determining means 24 performs frequency analysis on the three-dimensional coordinates and dimensions of one or more pupils 14a and determines the psychological state and physiological state of the detected person 14 from the result. As shown in FIG. 12, in S6 the pupil diameter calculating means 21b calculates the pupil diameter at a predetermined period. The calculated pupil diameter is recorded in S7, and in S8 the pupil state determining means 24 analyzes the pupil diameter data collected in time series and determines the psychological state and physiological state of the detected person 14 from the analysis result. In the pupil state detecting apparatus 400 described above, as shown in the flowchart of FIG. 12, the process returns to step S1 after S7 is completed. Step S8 and subsequent steps may be performed in parallel with the processing of S1 to S7, or may be performed separately.

 Hereinafter, the steps that differ from those of the pupil diameter measuring apparatus 300 are described with a specific example. In S7, the pupil diameter data are calculated by the pupil diameter calculating means 21b as time-series data every 1/60 second. The calculated pupil diameter data can be recorded by the pupil state determining means 24. The time-series data recorded in the pupil state determining means 24 can then be subjected by the pupil state determining means 24 to a wavelet transform as the frequency analysis.

 As a method of determining the psychological state and physiological state of the detected person 14 from the result of the wavelet transform, the frequency range of the wavelet transform is first divided at 0.15 Hz into a low-frequency band of 0.04 to 0.15 Hz and a high-frequency band of 0.16 to 0.5 Hz. The psychological state can then be determined from the ratio between the amplitude of the low-frequency band and the amplitude of the high-frequency band. In general, the amplitude of the low-frequency band is taken to indicate a state in which the parasympathetic nervous system is dominant, and the amplitude of the high-frequency band a state in which the sympathetic nervous system is dominant. For example, if the value obtained by dividing the power of the low-frequency band by the power of the high-frequency band is 4 or greater, the parasympathetic nervous system is dominant and the detected person is judged to be in a relaxed state; if it is below 4, the sympathetic nervous system is dominant and the detected person is judged to be in an excited state.
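The band-power comparison described above can be sketched as follows (a minimal illustration; a Fourier power spectrum is used in place of the wavelet transform, which the embodiment also permits, and the synthetic signal and function names are hypothetical, while the 60 Hz sampling rate and threshold of 4 follow the text):

```python
import numpy as np

def lf_hf_ratio(pupil_diameters, fs=60.0):
    """Power in the 0.04-0.15 Hz band divided by power in 0.16-0.5 Hz."""
    x = np.asarray(pupil_diameters) - np.mean(pupil_diameters)
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    lf = spectrum[(freqs >= 0.04) & (freqs <= 0.15)].sum()
    hf = spectrum[(freqs >= 0.16) & (freqs <= 0.5)].sum()
    return lf / hf

def judge_state(ratio, threshold=4.0):
    return ("relaxed (parasympathetic dominant)" if ratio >= threshold
            else "excited (sympathetic dominant)")

# Synthetic 60 s record sampled at 60 Hz: a strong 0.1 Hz fluctuation
# (low band) plus a weak 0.3 Hz fluctuation (high band).
t = np.arange(0, 60, 1 / 60.0)
diameters = (4.0 + 0.5 * np.sin(2 * np.pi * 0.1 * t)
             + 0.1 * np.sin(2 * np.pi * 0.3 * t))
ratio = lf_hf_ratio(diameters)
print(judge_state(ratio))  # low-band power dominates, so "relaxed ..."
```

With both test frequencies falling on exact FFT bins, the ratio here is about (0.5/0.1)² = 25, well above the threshold of 4.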

 [Description of Effects] As described above, the pupil diameter measuring apparatus 300 of the third embodiment has a simple configuration, can be made small, and can moreover determine the pupil diameter accurately using triangulation. The pupil state detecting apparatus 400 makes it possible to provide, at small size and low cost, an apparatus that objectively and quantitatively evaluates the mental state of the detected person 14 in an unconstrained posture, without restraining the body.

 The frequency range of the wavelet transform may differ from 0.04 to 0.5 Hz as long as psychological states can be distinguished, and the frequency separating the low-frequency band from the high-frequency band may differ from 0.15 Hz. For the frequency analysis, a Fourier transform or other methods may be used instead of the wavelet transform.

 In the experiment, the image sensor 13 was a CMOS sensor with 1920 horizontal pixels and 1080 vertical pixels; in this case the pixel pitch in the X and Y directions was 3 μm. Other pixel counts may be applied for the image definition.

 The image update frequency of the image sensor 13 was 60 times per second. That is, the pupil diameter data were calculated by the pupil diameter calculating means 21b as time-series data every 1/60 second, but other time intervals may be used.

 When the CMOS sensor with the above pixel count is used, with the focal length f of the lens 20e set to 15 mm and d set to 50 mm, the measurement resolution of the pupil 14a in the X and Y directions at an imaging distance (Z coordinate) of 500 mm is 0.1 mm. In the Z direction, the imaging distance could be corrected with a resolution of 1 mm.
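The stated resolutions are consistent with the standard stereo-measurement estimates, which can be checked as follows (an illustrative calculation; the formulas themselves are an assumption, since the text states only the resulting values):

```python
# Standard stereo-resolution estimates (assumed formulas; the text gives
# only the resulting 0.1 mm and 1 mm figures).
pitch = 0.003   # pixel pitch in mm (3 um)
f = 15.0        # focal length of lens 20e in mm
d = 50.0        # distance between optical axes A and B in mm
z = 500.0       # imaging distance (Z coordinate) in mm

lateral = pitch * z / f            # X,Y resolution per pixel
depth = pitch * z * z / (f * d)    # Z resolution per pixel of disparity
print(lateral)  # 0.1 mm, matching the text
print(depth)    # 1.0 mm, matching the text
```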

 An LED with a wavelength of 850 nm was used as the light source 1, but one of another wavelength may be used if a similar effect is obtained. A sensor for measuring ambient light may also be provided to correct the pupil diameter for the pupillary light reflex. "LED" is an abbreviation for "Light Emitting Diode".

 In the present embodiment, the case where the optical axis A and the optical axis B are parallel has been described, but a convergent imaging condition, in which the angle of each mirror is changed to give an angle between the optical axis A and the optical axis B, may also be used.

 [Fourth Embodiment] Next, a fourth embodiment will be described. FIG. 13 is a top view showing the configuration of a pupil imaging apparatus 500 according to the present embodiment. As shown in FIG. 13, the pupil imaging apparatus 500 of the present embodiment differs from the second embodiment in that it includes a second light source 42 and second imaging means 49. The other components and their variations are the same as in the second embodiment. That is, the pupil imaging apparatus 500 of the fourth embodiment includes a first light source 41, the second light source 42, wavelength selecting means 43, a first image sensor 44, a second image sensor 45, first imaging means 48, and the second imaging means 49.

 [Description of Configuration] Next, the configuration of the pupil imaging apparatus 500 of the present embodiment will be described with reference to FIG. 13. The first light source 41 emits light of a first wavelength. The second light source 42 emits light of a second wavelength from a position different from that of the first light source 41 and at a different angle. The wavelength selecting means 43 selectively transmits the light of the first wavelength and the light of the second wavelength. The first image sensor 44 receives, via the wavelength selecting means 43, the light of the first wavelength from at least a part of the face, including one or more pupils 14a, of the detected person 14 at a first viewpoint 46 and a second viewpoint 47. The second image sensor 45 receives, via the wavelength selecting means 43, the light of the second wavelength from at least a part of the face, including one or more pupils 14a, of the detected person 14 at the first viewpoint 46. The first imaging means 48 receives the light from the detected person 14 via the wavelength selecting means 43 substantially in parallel with the optical axes A and B along which the light from the first light source 41 is emitted toward the detected person 14. Furthermore, the first imaging means 48 forms, on the first image sensor 44 via the wavelength selecting means 43, the images by the light of the first wavelength at the first viewpoint 46 and the second viewpoint 47 from at least the part of the face, including one or more pupils 14a, of the illuminated detected person 14. The second imaging means 49 forms, on the second image sensor 45 via the wavelength selecting means 43, the image by the light of the second wavelength at the first viewpoint of at least a part of the face, including at least one pupil, of the detected person.
The first viewpoint 46 and the second viewpoint 47 lie on the optical axes A and B, two optical axes substantially parallel to the direction in which the first light source 41 emits light, and these two optical axes are parallel to each other.

 [Description of Operation] Next, the operation of the pupil imaging apparatus 500 of the present embodiment will be described with reference to FIG. 13. First, the path of the incident light from the light sources until it illuminates the detected person 14 is described. The first light source 41 emits light of the first wavelength, which illuminates the detected person 14 via the wavelength selecting means 43. Meanwhile, the second light source 42 emits light of the second wavelength from a position different from that of the first light source 41 and at a different angle; this light illuminates the detected person 14.

 Next, the path of the light from the detected person 14 until it forms images on the first image sensor 44 and the second image sensor 45 is described. The light of the first wavelength from at least a part of the face, including one or more pupils 14a, of the detected person 14 passes through the wavelength selecting means 43 and enters the first imaging means 48, which lies on a first optical axis (optical axis A in FIG. 13) and a second optical axis (optical axis B in FIG. 13) coaxial with the optical axes along which the light from the first light source 41 is emitted toward the detected person 14. The incident light of the first wavelength is imaged on the first image sensor 44 by the first imaging means 48. Meanwhile, the light of the second wavelength from at least a part of the face, including one pupil 14a, of the detected person 14 enters the second imaging means 49 via the wavelength selecting means 43 and is imaged on the second image sensor 45 by the second imaging means 49.

 Hereinafter, details are described using specific configurations of the first imaging means 48 and the second imaging means 49. FIG. 14 shows an example of a specific configuration of the pupil imaging apparatus 500. FIG. 15A shows the optical system of the optical path of the first wavelength from the first light source 41, and FIG. 15B shows the optical system of the optical path of the second wavelength from the second light source 42. Here, the configuration of FIG. 15A is optically a system with the same action as that of the third embodiment, and on the first image sensor 44, as in FIG. 9(a), the facial images of the detected person 14 corresponding to the directions of the optical axes A and B are captured in superposition.

 As shown in FIG. 14, the first imaging means 48 includes a first wavelength selection mirror 48a, a second wavelength selection mirror 48b, a third wavelength selection mirror 48c, a first mirror 48d, a second mirror 48e, a half mirror 48f, and a lens 48g. The second imaging means 49 includes the first wavelength selection mirror 48a, a mirror 49a, the second wavelength selection mirror 48b, the lens 48g, and the third wavelength selection mirror 48c. Here, the first wavelength selection mirror 48a, the second wavelength selection mirror 48b, and the third wavelength selection mirror 48c selectively reflect the light of the first wavelength emitted by the first light source 41 and selectively transmit the light of the second wavelength emitted by the second light source 42.

 Next, the path of the incident light emitted from the first light source 41 until it illuminates the detected person 14 is described with reference to FIG. 15A. The light emitted from the first light source 41 enters the half mirror 48f, where it is split into light that passes through the half mirror 48f and travels toward the first wavelength selection mirror 48a, and light that is reflected by the half mirror 48f and travels toward the second mirror 48e. The light reflected by the half mirror 48f is reflected by the second mirror 48e, then by the first mirror 48d, and travels toward the detected person 14 via the wavelength selecting means. Meanwhile, the light transmitted through the half mirror 48f is reflected by the first wavelength selection mirror 48a and travels toward the detected person 14 via the wavelength selecting means. That is, the light transmitted through the half mirror 48f travels parallel to the light reflected by the half mirror 48f and is incident on a different position of the detected person 14.

 Meanwhile, as shown in FIG. 15B, the second light source 42 emits the light of the second wavelength toward the detected person 14 from a position different from that of the first light source 41 and at a different angle.

 Next, the path of the light from the detected person 14 until it forms images on the first image sensor 44 and the second image sensor 45 is described. As shown in FIG. 15A, first, light from at least a part of the face, including one or more pupils 14a, of the detected person 14 travels parallel to the optical axis A and enters the first wavelength selection mirror 48a via the wavelength selecting means 43. Of the light incident via the wavelength selecting means 43, the light of the first wavelength is reflected by the first wavelength selection mirror 48a. This light is then partly reflected by the half mirror 48f, reflected by the second wavelength selection mirror 48b, enters the lens 48g, is reflected by the third wavelength selection mirror 48c, and is imaged on the first image sensor 44.

 Similarly, light from at least a part of the face, including one or more pupils 14a, of the detected person 14 travels parallel to the optical axis B and enters the first mirror 48d via the wavelength selecting means 43. Of the light incident via the wavelength selecting means 43, the light of the first wavelength is reflected by the first mirror 48d, then reflected by the second mirror 48e, partly transmitted through the half mirror 48f, and thereafter imaged on the first image sensor 44 along the same path as the light incident parallel to the optical axis A. That is, images of the detected person 14 from both the optical axis A and optical axis B viewpoints are formed on the first image sensor 44. In the formed images, for the pupil 14a portion, the optical axes (optical axes A and B) along which the light emitted from the light source 41 illuminates the detected person 14 are coaxial with the optical axes (optical axes A and B) along which the light reflected from the detected person 14 enters the image sensor 44, so fundus-reflected light is captured. A so-called bright pupil image is therefore captured.

 Note that light from the optical axis A direction that is transmitted through the half mirror 48f, and light from the optical axis B direction that is reflected by the half mirror 48f, return toward the light source 41. Since this returning light consists only of the portion of the incident light from the light source 41 that was reflected by the face of the detected person 14, its intensity is far smaller than that of the incident light emitted from the light source 41. Therefore, even if it is reflected again at the surface of the light source 41 or the like, it poses no particular problem in the present embodiment.

 As shown in FIG. 15B, light from at least a part of the face, including one or more pupils 14a, of the detected person 14 also enters the wavelength selecting means 43. Of the light incident via the wavelength selecting means 43, the light of the second wavelength is transmitted through the first wavelength selection mirror 48a. The transmitted light is reflected by the mirror 49a, transmitted through the second wavelength selection mirror 48b, enters the lens 48g, passes through the third wavelength selection mirror 48c, and is imaged on the second image sensor 45. That is, unlike on the first image sensor 44, only an image of at least a part of the face, including one or more pupils 14a, of the detected person 14 from the optical axis A direction is formed on the image sensor 45. In the formed image, for the pupil 14a portion, the optical axis along which the light emitted from the light source 42 illuminates the detected person 14 is not coaxial with the optical axes (optical axes A and B) along which the light reflected from the detected person 14 enters the image sensor, so fundus-reflected light is not captured. A so-called dark pupil image is therefore captured. The light of the second wavelength from the optical axis B direction is transmitted through the second wavelength selection mirror 48b in FIG. 15A and therefore does not form an image on the image sensor 45.

 [Description of Effects] In the pupil imaging apparatus 500 of the present embodiment, light sources of different wavelengths are further arranged, one coaxial with the image sensor and the other at a non-coaxial position and angle, and images of the pupil 14a are formed on a different image sensor for each wavelength, so both a bright pupil image and a dark pupil image can be captured.

 In order to form the light incident from both the optical axis A and optical axis B directions into images on the image sensor 44, it is preferable to use, as the half mirror 48f, one whose transmittance at the first wavelength is 50% (reflectance 50%).

 In order to simplify the optical alignment process when manufacturing the pupil imaging apparatus 500, for example, the first wavelength selection mirror 48a, the second wavelength selection mirror 48b, and the third wavelength selection mirror 48c may each be replaced by an optically equivalent configuration such as the prism body 52 shown in FIG. 15C. This prism body 52 has a total reflection surface 50 and a wavelength selection film 51. In this case, the prism body is arranged at the same angle as the wavelength selection mirror it replaces, with the wavelength selection film 51 facing the direction from which the light reflected by the face of the detected person 14 enters that mirror. The other mirrors may likewise be replaced by optically equivalent configurations; when a totally reflecting mirror is replaced by the prism body 52, the total reflection surface 50 of the prism body 52 is used as the mirror surface.

 [Fifth Embodiment] FIG. 16 is a top view showing a pupil diameter measuring apparatus 600 of the present embodiment. In addition to the configuration of the fourth embodiment, the present embodiment includes pupil diameter estimating means 53 that estimates the pupil diameter by triangulation, using the region in which the pupil 14a is captured in one image formed on the first image sensor 44 and the region in which the same pupil 14a is captured in one image formed on the second image sensor 45.

 As shown in FIG. 17, the pupil diameter estimating means 53 includes pupil extracting means 53a and pupil diameter calculating means 53b. The pupil extracting means 53a extracts the region in which the pupil 14a is captured in the image formed on the first image sensor 44 and the region in which the same pupil 14a is captured in the image formed on the second image sensor 45. Since there are normally two pupils, a total of four regions are extracted. The pupil diameter calculating means 53b calculates the three-dimensional coordinates and dimensions of the pupil 14a from the four extracted regions.

 [Description of Configuration and Operation] The flow from the pupil extracting means 53a extracting the regions from one image formed on the first image sensor 44 and one image formed on the second image sensor 45, to the pupil diameter calculating means 53b calculating the three-dimensional coordinates and dimensions of one or more pupils 14a, is described below with reference to the flowchart of FIG. 18.

 As shown in FIG. 18, the pupil extracting means 53a first captures the images formed on the first image sensor 44 and the second image sensor 45 (S102: image capture). Next, the pupil extracting means 53a calculates the difference between the image formed on the first image sensor 44 and the image formed on the second image sensor 45 (S103: image difference). The pupil extracting means 53a then binarizes the image after the difference calculation and extracts the two regions in which the pupils 14a are captured (S104: optical axis A image pupil portion extraction). The pupil diameter calculating means 53b obtains, from the two extracted regions, the coordinates (taken as X and Y coordinates) around the pupils 14a in the image sensor plane and calculates at least one pupil diameter (S105: viewpoint A image pupil diameter calculation). The pupil diameter calculating means 53b sets a differential filter from the obtained pupil diameter (S106: differential filter setting).
Further, the pupil diameter calculating means 53b applies this differential filter to the image formed on the first image sensor 44 (S107: image differentiation). The pupil extracting means 53a binarizes the image to which the image differentiation has been applied, extracts the four regions corresponding to the pupil portions 22 in the image, and outputs them to the pupil diameter calculating means 53b (S108: pupil portion extraction). The pupil diameter calculating means 53b calculates the X and Y coordinates, in the plane of the first image sensor 44, of the four regions corresponding to the pupil portions 22 in the image (S109: coordinate acquisition). Using the obtained X and Y coordinates and the calculation method shown in Formulas 1 to 3, the pupil diameter calculating means 53b then determines by triangulation the distance in the direction perpendicular to the first image sensor 44 (taken as the Z coordinate) to the real object containing the pupil portions 22 imaged on the first image sensor 44 (S110: pupil three-dimensional coordinate acquisition). The pupil diameter calculating means 53b calculates the pupil diameters of the four pupil portions 22 from the obtained three-dimensional coordinates (S111: pupil diameter calculation) and records the calculated pupil diameters (S112: pupil diameter recording). In the pupil diameter measuring apparatus 600 described above, when imaging is performed multiple times, such as when imaging in time series, the process returns to step S101 after the processing of S112 in the flowchart of FIG. 18 is completed.

 Next, among the steps described above, S104 to S108, which differ from the flowchart of FIG. 8 in the third embodiment, will be described in detail. First, S104 will be described. FIG. 19A shows the intensity distribution of the image captured by the first image sensor 44, and FIG. 19B shows the intensity distribution of the image captured by the second image sensor 45.

 In FIG. 19A, as in FIG. 9B, the intensity of the pupil portion 22 is higher than that of the face portion 23; however, when the intensity difference between the face portion 23 and the pupil portion 22 is small, binarization becomes difficult.

 In such a case, taking the difference between FIG. 19A, the bright-pupil image, and FIG. 19B, the dark-pupil image (subtracting FIG. 19B from FIG. 19A), cancels the component reflected from the face portion 23. As a result, only the pupil portions 22 as seen along the optical axis A can be extracted, as shown in FIG. 19C. After the pupil portions 22 have been extracted, in S105 the right and left pupil diameters DR and DL of the pupil portions 22 in the optical axis A direction can each be obtained from the extracted regions, as shown in FIG. 19C.
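The subtraction and binarization of S103 and S104 can be sketched as follows. This is a minimal illustration in NumPy; the array values and the threshold are invented for the example and are not part of the embodiment.

```python
import numpy as np

def extract_pupils(bright, dark, threshold):
    """Subtract the dark-pupil image from the bright-pupil image (S103)
    and binarize the difference (S104).

    bright, dark: 2-D arrays of equal shape, e.g. the images on the
    first and second image sensors. Returns a boolean mask that is
    True only where the pupils appear.
    """
    diff = bright.astype(np.int32) - dark.astype(np.int32)  # face reflection cancels
    return diff > threshold

# Toy example: the face reflects equally in both images and is cancelled,
# while the bright pupil (column 2) survives the subtraction.
bright = np.array([[50, 50, 200, 50],
                   [50, 50, 200, 50]])
dark   = np.array([[50, 50,  60, 50],
                   [50, 50,  60, 50]])
mask = extract_pupils(bright, dark, threshold=100)
print(mask.sum())  # 2 pupil pixels remain
```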

 From the binarized image, as in FIG. 9C, the minimum and maximum coordinates of the right-eye image in the optical axis A direction are recorded as XARmin and XARmax, and those of the left-eye image as XALmin and XALmax. Likewise, the minimum and maximum coordinates of the right-eye image in the optical axis B direction are recorded as XBRmin and XBRmax, and those of the left-eye image as XBLmin and XBLmax. The Y components can be recorded in the same way.
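Given the recorded minimum and maximum coordinates, the apparent diameter in pixels follows directly; a small sketch (the function name and the inclusive-pixel convention are assumptions, not from the embodiment):

```python
def pupil_diameter_px(x_min, x_max):
    """Apparent pupil diameter, in pixels, from the recorded minimum and
    maximum X coordinates of a binarized region (e.g. XARmin, XARmax).
    The +1 counts both end pixels (an assumed inclusive convention)."""
    return x_max - x_min + 1

# Hypothetical right-eye region of the optical-axis-A image:
DR = pupil_diameter_px(120, 155)
print(DR)  # 36
```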

 Next, S106 will be described. FIG. 20 illustrates the operation of the feature extraction filter. In this embodiment, the second-order differential filter given by Equation 4 is used as the feature extraction filter, where the differentiation interval DR is the right pupil diameter of the pupil portion in the optical axis A direction described above. In S106, either of the right and left pupil diameters DR and DL of the pupil portion 22 in the optical axis A direction is substituted into Equation 4. Then, in S107, the second-order differential filter of FIG. 20B is applied to the image of FIG. 20A, and in S108 the result is binarized. Features of size DR are thereby extracted selectively, so that the pupil portions 22 as seen along the optical axis B can also be extracted, as shown in FIG. 20C. The extracted pupil portions 22 are output to the pupil diameter calculation means 53b.
[Equation 4]
I(x, y) = 2·I(x, y) − I(x − DR, y) − I(x + DR, y)
 Here, since the value of DR can always be obtained from the extracted image of the pupil 14a in the optical axis A direction, the current value of DR is used as the differentiation interval even when the imaging distance varies and DR increases as in FIG. 20E. The second-order differential filter is therefore kept optimal, as shown in FIG. 20F, and the pupil 14a can be selectively extracted, as shown in FIG. 20G.
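A sketch of applying Equation 4 along the x direction is shown below. The handling of the image border (treating shifted-in pixels as zero) is an assumption; the embodiment does not specify it.

```python
import numpy as np

def second_derivative_filter(img, dr):
    """Apply I'(x,y) = 2*I(x,y) - I(x-DR,y) - I(x+DR,y) along x (Equation 4).

    The response is largest for bright features whose width matches the
    differentiation interval dr, so setting dr to the pupil diameter
    measured along optical axis A selects pupil-sized features.
    Pixels shifted in from outside the image are taken as zero.
    """
    img = img.astype(np.int32)
    left = np.zeros_like(img)
    right = np.zeros_like(img)
    left[:, dr:] = img[:, :-dr]    # I(x - DR, y)
    right[:, :-dr] = img[:, dr:]   # I(x + DR, y)
    return 2 * img - left - right

# A bright plateau exactly dr pixels wide gives the peak response at its centre:
row = np.array([[0, 0, 0, 100, 100, 100, 0, 0, 0]])
out = second_derivative_filter(row, dr=3)
print(out[0, 4])  # 200
```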

 The pupil diameter measurement apparatus 600 has been described above; as shown in FIG. 17, the apparatus 600 may further include pupil state determination means 54. FIG. 21 shows the flowchart for the case in which the pupil state determination means 54 is provided. The psychological and physiological state of the person 14 to be detected is determined from the pupil diameter values calculated (S111; pupil diameter calculation) and recorded (S112; pupil diameter recording) in FIG. 18, and from the result of the frequency analysis described in the third embodiment (S8 in FIG. 12). As the method of determining the psychological and physiological state of the person 14 to be detected, as described in the third embodiment, the wavelet-transform frequency range is first divided at 0.15 Hz into a low-frequency band of 0.04 to 0.15 Hz and a high-frequency band of 0.16 to 0.5 Hz. The psychological state can then be determined from the ratio of the low-frequency-band amplitude to the high-frequency-band amplitude. When the pupil state determination means 54 is provided, the flowchart of FIG. 21 returns to step S201 after the processing of S211 is completed. The steps from S212 onward may be performed in parallel with S201 to S211, or separately.
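The band split and amplitude ratio described above can be sketched as follows. A Fourier amplitude spectrum is used here as a simple stand-in for the wavelet transform of the third embodiment, so the transform choice, the sampling rate, and the signal are only illustrative.

```python
import numpy as np

def lf_hf_amplitude_ratio(pupil_diameters, fs):
    """Split the spectrum of a pupil-diameter time series at 0.15 Hz into a
    low band (0.04-0.15 Hz) and a high band (0.16-0.5 Hz) and return the
    ratio of their mean amplitudes.

    pupil_diameters: diameters sampled at fs Hz (the series recorded in S112).
    A Fourier spectrum replaces the wavelet transform for this sketch.
    """
    x = np.asarray(pupil_diameters, dtype=float)
    amp = np.abs(np.fft.rfft(x - x.mean()))
    f = np.fft.rfftfreq(len(x), d=1.0 / fs)
    lf = amp[(f >= 0.04) & (f <= 0.15)].mean()
    hf = amp[(f >= 0.16) & (f <= 0.5)].mean()
    return lf / hf

# A 0.1 Hz oscillation (low band) riding on a weaker 0.3 Hz one (high band):
t = np.arange(0, 120, 0.5)          # 2 Hz sampling for 120 s
d = 4 + 0.5 * np.sin(2 * np.pi * 0.1 * t) + 0.1 * np.sin(2 * np.pi * 0.3 * t)
ratio = lf_hf_amplitude_ratio(d, fs=2.0)
print(ratio > 1.0)  # the low band dominates for this signal
```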

 [Description of Effects] The pupil diameter measurement apparatus 600 of this embodiment can calculate an accurate pupil diameter by binarization even when the intensity difference between the face portion 23 and the pupil portion 22 is small. The pupil state detection apparatus 700 of this embodiment also makes it possible to provide, at small size and low cost, an apparatus that objectively and quantitatively evaluates the mental state of the person 14 to be detected in an unconstrained posture, without restraining the body.

 As the feature extraction filter, a filter with the two-dimensional distribution given by Equation 5 may be used instead of the one-dimensional second-order differential filter. In this case the filtering results are as shown in FIG. 20D and FIG. 20H.
[Equation 5]
I(x, y) = 4·I(x, y) − I(x − D, y) − I(x + D, y) − I(x, y − D) − I(x, y + D)
 For the extraction of the pupil 14a, the left pupil diameter DL may be used instead, and the right and left pupil diameters DR and DL may be used alternately.
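A corresponding sketch of Equation 5, again with an assumed zero border (the embodiment does not specify border handling):

```python
import numpy as np

def second_derivative_filter_2d(img, d):
    """Apply I'(x,y) = 4*I(x,y) - I(x-D,y) - I(x+D,y) - I(x,y-D) - I(x,y+D)
    (Equation 5). Pixels shifted in from outside the image are taken as zero."""
    img = img.astype(np.int32)
    pad = np.pad(img, d)                           # zero border of width d
    return (4 * img
            - pad[d:-d, :-2*d] - pad[d:-d, 2*d:]   # I(x-D, y), I(x+D, y)
            - pad[:-2*d, d:-d] - pad[2*d:, d:-d])  # I(x, y-D), I(x, y+D)

# An isolated bright spot of size ~D gives the peak response at its centre:
img = np.zeros((5, 5), dtype=int)
img[2, 2] = 10
out = second_derivative_filter_2d(img, d=1)
print(out[2, 2])  # 40
```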

 Any known image processing filter may be used as the feature extraction filter, provided it is based on the values DR and DL.

 An LED with a wavelength of 850 nm was used for the light source 1, and an LED with a wavelength of 940 nm for the light source 67. Light sources of other wavelengths may be used for the light sources 1 and 2 as long as the same effect is obtained.

 A dichroic mirror using a dielectric multilayer film can be used as the wavelength selection filter, but other known wavelength selection filters may also be used.

 In this embodiment the optical axis A and the optical axis B are parallel, but the angle of each mirror may be changed so that the axes A and B form an angle, giving convergent imaging.

 [Sixth Embodiment] Next, the sixth embodiment will be described. FIG. 22 is a top view showing the configuration of a pupil state detection and display apparatus 800 according to this embodiment. As shown in FIG. 22, the pupil state detection and display apparatus 800 of this embodiment differs from the third and fifth embodiments in that a display device 61 is provided between the person 14 to be detected and the wavelength selection means 43. The other configurations and their variations are the same as in the third and fifth embodiments. FIG. 22A shows the case in which the display device 61 is an optical filter 61a; FIG. 22B, an optical filter 61a, a display unit 61b, and an information processing device 61c; and FIG. 22C, an optical filter 61a, a display unit 61b, and a half mirror 61d.

 [Description of Configuration] In FIG. 22A, the pupil state detection device 101 is the pupil state detection device 400 or the pupil state detection device 700 of the third and fifth embodiments of the present invention. When the optical filter 61a in FIG. 22A totally reflects visible light and totally transmits infrared light, it appears as a mirror to the person 14 to be detected. If the optical filter 61a absorbs visible light and transmits infrared light, it appears as a black surface to the person 14 to be detected.

 FIG. 22B shows a configuration in which the result determined by the information processing device 61c, based on the determination of the pupil state detection device 101, is displayed on the display unit 61b. In this configuration, the display unit 61b can inform the person 14 to be detected of a fatigue or mental state, or present a visual stimulus for measuring reactions of the person 14 to be detected such as the pupillary light reflex. As the method of determining the psychological and physiological state of the person 14 to be detected, as described in the third embodiment, the wavelet-transform frequency range is first divided at 0.15 Hz into a low-frequency band of 0.04 to 0.15 Hz and a high-frequency band of 0.16 to 0.5 Hz; the psychological state can then be determined from the ratio of the low-frequency-band amplitude to the high-frequency-band amplitude.

 FIG. 22C shows the configuration of FIG. 22B with a half mirror 61d added. When nothing is displayed on the display unit 61b, the device acts as a mirror; when the display unit 61b shows an image, the mirror image of the person 14 to be detected and the displayed image can be optically fused.

 [Description of Effects] With the pupil state detection and display apparatus 800 of the sixth embodiment, in the configuration of FIG. 22A, the person 14 to be detected does not feel that a measurement is being made, so the psychological state can be evaluated with higher accuracy. In the configuration of FIG. 22B, giving the person 14 to be detected a visual stimulus according to the evaluation of the psychological and physiological state by the pupil state detection device 400 or 700 can bring further benefits to mental stress management and the treatment of mental illness. In the configuration of FIG. 22C, the effects of the configurations of FIG. 22A and FIG. 22B described above can be obtained according to what is displayed.

 A liquid crystal display can be used as the display unit 61b; an organic EL display or another type of display may also be used ("EL" is an abbreviation of "electroluminescence"). Besides an ordinary two-dimensional display, the display unit 61b may be a stereoscopic display using a lenticular lens or a parallax barrier. When a stereoscopic display is used, a parallax image appropriate for observing the stereoscopic display may be displayed using the left and right pupil positions obtained by the pupil state detection device 101, the convergence angle determined from the pupil positions, and the accommodation state of the eyes determined from the pupil diameters.

 Some or all of the above embodiments can also be described as in the following appendices, but are not limited to them.
[Appendix 1]
An imaging apparatus comprising:
one image sensor; and
imaging means for forming, on the one image sensor, a plurality of images, from different viewpoints, of a common portion of a subject irradiated with light from a light source.
[Appendix 2]
The imaging apparatus according to Appendix 1, wherein the at least two different viewpoints are a first viewpoint and a second viewpoint, and
the imaging means forms an image on the one image sensor by reflecting light from the subject at the first viewpoint with a mirror and making it enter a lens, and, by reflecting light from the subject at the second viewpoint with a mirror and making it enter the lens, simultaneously forms the images of the subject at the first viewpoint and the second viewpoint on the one image sensor.
[Appendix 3]
The imaging apparatus according to Appendix 1 or 2, wherein the arbitrary light source comprises one light source, dividing means for dividing the light emitted from the one light source, and mirrors that direct the light divided by the dividing means toward the subject from two different directions at a parallel or converging angle.
[Appendix 4]
The imaging apparatus according to Appendix 3, wherein the dividing means is a half mirror.
[Appendix 5]
A pupil imaging apparatus comprising:
a light source that emits light of a predetermined wavelength;
wavelength selection means that selectively transmits the light of the predetermined wavelength;
one image sensor; and
imaging means for forming, on the one image sensor via the wavelength selection means, images from different viewpoints of at least a part of the face, including one or more pupils, of a person to be detected who is irradiated via the wavelength selection means with the light of the predetermined wavelength from the light source,
wherein the different viewpoints lie on different optical axes substantially parallel to the direction in which the light source emits light, and
the different optical axes are arranged parallel to each other or converging.
[Appendix 6]
A pupil diameter measurement apparatus comprising the pupil imaging apparatus of Appendix 5 and pupil diameter estimation means for estimating a pupil diameter by triangulation using at least two regions, corresponding to the one or more pupils, in one image formed on the one image sensor.
[Appendix 7]
The pupil diameter measurement apparatus according to Appendix 6, wherein the pupil diameter estimation means comprises:
pupil extraction means for extracting the at least two regions from the one image formed on the one image sensor; and
pupil diameter calculation means for calculating and outputting the three-dimensional coordinates and dimensions of the one or more pupils from the at least two regions.
[Appendix 8]
A pupil state detection apparatus comprising the pupil diameter measurement apparatus according to Appendix 6 or 7 and pupil state determination means for determining the psychological and physiological state of the person to be detected from the three-dimensional coordinates and dimensions of the one or more pupils output by the pupil diameter calculation means.
[Appendix 9]
A pupil imaging apparatus comprising:
a first light source that emits light of a first wavelength;
a second light source that emits light of a second wavelength at an angle different from that of the first light source;
wavelength selection means that selectively transmits the light of the first wavelength and the light of the second wavelength;
a first image sensor;
a second image sensor;
first imaging means, whose optical axis is coaxial with the light emitted from the first light source toward the person to be detected, for forming, on the first image sensor via the wavelength selection means, images by the light of the first wavelength, at a first viewpoint and a second viewpoint, of at least a part of the face, including one or more pupils, of the person to be detected who is irradiated via the wavelength selection means with the light of the first wavelength; and
second imaging means for forming, on the second image sensor via the wavelength selection means, an image by the light of the second wavelength, at the first viewpoint, of at least a part of the face, including at least one pupil, of the person to be detected who is irradiated with the light of the second wavelength,
wherein the first viewpoint and the second viewpoint lie on two optical axes substantially parallel to the direction in which the first light source emits light, and
the two optical axes are arranged parallel to each other or converging.
[Appendix 10]
A pupil diameter measurement apparatus comprising the pupil imaging apparatus of Appendix 9 and pupil diameter estimation means for estimating a pupil diameter by triangulation using at least two regions, corresponding to the one or more pupils, in one image formed on the first image sensor and one image formed on the second image sensor.
[Appendix 11]
The pupil diameter measurement apparatus according to Appendix 10, wherein the pupil diameter estimation means comprises:
pupil extraction means for extracting the regions of the one or more pupils from the image formed on the first image sensor and the image formed on the second image sensor; and
pupil diameter calculation means for calculating and outputting the three-dimensional coordinates and dimensions of the one or more pupils.
[Appendix 12]
The pupil diameter measurement apparatus according to Appendix 11, wherein the images formed on the first image sensor and the second image sensor are, respectively, an image including a bright pupil and an image including a dark pupil.
[Appendix 13]
The pupil diameter measurement apparatus according to Appendix 11, wherein the pupil extraction means performs a subtraction between the respective images.
[Appendix 14]
The pupil imaging apparatus according to Appendix 9, wherein the first imaging means forms an image on the first image sensor by reflecting light from at least a part of the face, including the one or more pupils, at the first viewpoint with a first wavelength selection mirror that selectively reflects the first wavelength, then with a mirror, then with a second wavelength selection mirror that selectively reflects the first wavelength, and making it enter a lens, and simultaneously forms an image on the first image sensor by reflecting light from at least a part of the face, including the one or more pupils, at the second viewpoint with a mirror and making it enter the lens.
[Appendix 15]
The pupil imaging apparatus according to Appendix 9, wherein the second imaging means forms an image on the second image sensor by reflecting light from at least a part of the face, including the one or more pupils, at the first viewpoint with a mirror and making it enter a lens.
[Appendix 16]
A pupil state detection and display apparatus comprising the pupil imaging apparatus according to Appendix 15 and a display device further provided between the person to be detected and the wavelength selection means.
[Appendix 17]
A pupil imaging method comprising:
emitting light of a predetermined wavelength from a light source onto at least a part of the face, including one or more pupils, of a person to be detected; and
forming, on one image sensor, images of light from at least the part of the face including the one or more pupils, at at least two different viewpoints, each lying on one of two optical axes substantially parallel to the direction in which the light is emitted.
[Appendix 18]
A pupil imaging method comprising:
a first light source emitting light of a first wavelength onto at least a part of the face, including one pupil, of a person to be detected;
a second light source emitting light of a second wavelength, from a position different from that of the first light source and at a different angle, onto at least a part of the face including one pupil of the person to be detected;
the light of the first wavelength and the light of the second wavelength arriving from at least the part of the face including one pupil of the person to be detected entering first imaging means and second imaging means via wavelength selection means that selectively transmits the light of the first wavelength and the light of the second wavelength;
the first imaging means, whose optical axis is coaxial with the light emitted from the first light source toward the person to be detected, forming images of the light of the first wavelength, at a first viewpoint and a second viewpoint, of at least the part of the face including one pupil of the person to be detected on a first image sensor; and
the second imaging means forming an image of the light of the second wavelength, at the first viewpoint, of at least a part of the face, including at least one pupil, of the person to be detected on a second image sensor.
 While the present invention has been described with reference to the embodiments (and examples), the present invention is not limited to the above embodiments (and examples). Various changes that those skilled in the art can understand may be made to the configuration and details of the present invention within the scope of the present invention.
 This application claims priority based on Japanese Patent Application No. 2013-238493, filed on November 19, 2013, the entire disclosure of which is incorporated herein.

DESCRIPTION OF SYMBOLS
 1a  feature point
 2  arbitrary light source
 2a  light source
 2aa  mirror
 2b  dividing means
 2ba  half mirror
 2bb to 2bd  mirrors
 2b  lens
 3  first viewpoint
 4  second viewpoint
 5  image sensor
 6  imaging means
 6a  mirror
 6b  lens
 100  imaging apparatus
 11  light source
 12  wavelength selection means
 13  image sensor
 13a  optically equivalent image sensor
 14  person to be detected
 14a  pupil
 15  first viewpoint
 16  second viewpoint
 20  imaging means
 20a  half mirror
 20b to 20d  mirrors
 20e  lens
 20ea  optically equivalent lens
 21  pupil diameter estimation means
 21a  pupil extraction means
 21b  pupil diameter calculation means
 22  pupil portion
 23  face portion
 24  pupil state determination means
 200  pupil imaging apparatus
 300  pupil diameter measurement apparatus
 400  pupil state detection apparatus
 41  first light source
 42  second light source
 43  wavelength selection means
 44  first image sensor
 45  second image sensor
 46  first viewpoint
 47  second viewpoint
 48  first imaging means
 48a  first wavelength selection mirror
 48b  second wavelength selection mirror
 48c  third wavelength selection mirror
 48d  first mirror
 48e  second mirror
 48f  half mirror
 48g  lens
 49  second imaging means
 49a  mirror
 50  total reflection surface
 51  wavelength selection film
 52  prism body
 53  pupil diameter estimation means
 53a  pupil extraction means
 53b  pupil diameter calculation means
 54  pupil state determination means
 500  pupil imaging apparatus
 600  pupil diameter measurement apparatus
 700  pupil state detection apparatus
 61  display device
 61b  display unit
 101  pupil state detection device
 103  optical filter
 104  half mirror
 800  pupil state detection and display apparatus

Claims (10)

一つの撮像素子と、
光源から光を照射される被写体の共通する箇所の異なる視点での複数の像を、前記一つの撮像素子上に結像させる結像手段と、を有する、撮像装置。
One image sensor,
An imaging apparatus, comprising: an imaging unit configured to form a plurality of images at different viewpoints of a common portion of a subject irradiated with light from a light source on the one imaging element.
前記少なくとも異なる2視点が第一の視点と第二の視点であり、
前記結像手段は、前記第一の視点における被写体からの光をミラーで反射させてレンズに入射させることで前記一つの撮像素子上に結像させ、前記第二の視点における前記被写体からの光をミラーで反射させて前記レンズに入射させることで前記一つの撮像素子上に前記第一の視点と前記第二の視点とにおける被写体の像と同時に結像させる、請求項1に記載の撮像装置。
The at least two different viewpoints are a first viewpoint and a second viewpoint,
The imaging means reflects the light from the subject at the first viewpoint by a mirror and enters the lens to form an image on the one image sensor, and the light from the subject at the second viewpoint. The imaging apparatus according to claim 1, wherein an image of the subject at the first viewpoint and the second viewpoint is simultaneously formed on the one imaging element by reflecting the light from a mirror and entering the lens. .
所定の波長の光を出射する光源と、
前記所定の波長の光を選択的に透過する波長選択手段と、
一つの撮像素子と、
前記波長選択手段を介して前記光源から前記所定の波長の光を照射される被検出者の一つ以上の瞳孔を含む少なくとも顔面の一部分の異なる視点でのおのおのの像を、前記波長選択手段を介して前記一つの撮像素子上に結像させる結像手段と、を有し、
前記異なる視点は前記光源が光を出射する方向と略平行な異なる光軸上にあり、
前記異なる光軸は、互いに平行または輻輳する配置である、
瞳孔撮像装置。
A light source that emits light of a predetermined wavelength;
Wavelength selection means for selectively transmitting light of the predetermined wavelength;
One image sensor,
Respective images at different viewpoints of at least a part of the face including one or more pupils of the detection subject irradiated with light of the predetermined wavelength from the light source via the wavelength selection unit, the wavelength selection unit Imaging means for forming an image on the one image sensor via
The different viewpoints are on different optical axes substantially parallel to the direction in which the light source emits light;
The different optical axes are arranged parallel or converging to each other,
Pupil imaging device.
A pupil diameter measuring device comprising: the pupil imaging device according to claim 3; and pupil diameter estimating means that estimates a pupil diameter by triangulation using at least two regions, corresponding to the one or more pupils, in the one image formed on the one image sensor.
The pupil diameter measuring device according to claim 4, wherein the pupil diameter estimating means comprises: pupil extracting means that extracts the at least two regions from the one image formed on the one image sensor; and pupil diameter calculating means that calculates and outputs the three-dimensional coordinates and dimensions of the one or more pupils from the at least two regions.
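The triangulation recited in claims 4 and 5 can be sketched under a pinhole-camera model with two parallel optical axes: the horizontal disparity of the pupil center between the two viewpoints gives the depth, and the pupil's pixel diameter is then scaled to physical units at that depth. The focal length, baseline, and pixel coordinates below are illustrative assumptions, not values from the application.

```python
def estimate_pupil(x_left_px, x_right_px, diam_px, focal_px, baseline_mm):
    """Estimate pupil depth and physical diameter by triangulation.

    Pinhole-camera sketch with parallel optical axes; all numeric
    parameters are hypothetical, chosen only to illustrate the geometry.
    """
    disparity = x_left_px - x_right_px           # pixels between viewpoints
    z_mm = focal_px * baseline_mm / disparity    # depth from disparity
    diameter_mm = diam_px * z_mm / focal_px      # pixel size scaled to mm at depth z
    return z_mm, diameter_mm

# Illustrative values: 2000 px focal length, 60 mm baseline,
# 100 px disparity -> 1200 mm depth; a 12 px pupil -> 7.2 mm diameter.
z, d = estimate_pupil(x_left_px=420.0, x_right_px=320.0,
                      diam_px=12.0, focal_px=2000.0, baseline_mm=60.0)
```

Measuring the diameter this way is the reason the claim needs two viewpoints at all: without the depth recovered from disparity, a pixel diameter alone cannot be converted into a physical pupil diameter.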
A pupil imaging device comprising:
a first light source that emits light of a first wavelength;
a second light source that emits light of a second wavelength at an angle different from that of the first light source;
wavelength selection means that selectively transmits the light of the first wavelength and the light of the second wavelength;
a first image sensor;
a second image sensor;
first imaging means whose optical axis is coaxial with the direction in which the light from the first light source is emitted toward the subject, and which forms, on the first image sensor via the wavelength selection means, images by the light of the first wavelength at each of a first viewpoint and a second viewpoint of at least a part of the face, including one or more pupils, of the subject irradiated with the light of the first wavelength via the wavelength selection means; and
second imaging means that forms, on the second image sensor via the wavelength selection means, an image by the light of the second wavelength at the first viewpoint of at least a part of the face, including at least one pupil, of the subject irradiated with the light of the second wavelength,
wherein the first viewpoint and the second viewpoint lie on two optical axes substantially parallel to the direction in which the first light source emits light, and
the two optical axes are arranged so as to be parallel to each other or to converge.
A pupil diameter measuring device comprising: the pupil imaging device according to claim 6; and pupil diameter estimating means that estimates a pupil diameter by triangulation using at least two regions, corresponding to the one or more pupils, in the one image formed on the first image sensor and the one image formed on the second image sensor.
The pupil diameter measuring device according to claim 7, wherein the pupil diameter estimating means comprises: pupil extracting means that extracts the regions of the one or more pupils from the one image formed on the first image sensor and the one image formed on the second image sensor; and pupil diameter calculating means that calculates and outputs the three-dimensional coordinates and dimensions of the one or more pupils.
The pupil diameter measuring device according to claim 8, wherein the respective images formed on the first image sensor and the second image sensor are an image including a bright pupil and an image including a dark pupil, and the pupil extracting means performs a subtraction process on the respective images.
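The subtraction process recited in claim 9 exploits the fact that the pupil retro-reflects illumination that is coaxial with the imaging axis (appearing bright) but not off-axis illumination (appearing dark), while the rest of the face looks nearly the same under both. Subtracting the dark-pupil image from the bright-pupil image therefore cancels the face and isolates the pupil. A minimal sketch, assuming 8-bit grayscale images and an illustrative threshold:

```python
import numpy as np

def extract_pupil_mask(bright, dark, thresh=50):
    """Bright-pupil minus dark-pupil subtraction.

    Only pixels that are much brighter under coaxial illumination than
    under off-axis illumination survive the threshold; for a pupil that
    retro-reflects the coaxial light, that is the pupil region. The
    threshold value is an illustrative assumption.
    """
    # Cast to a signed type so the subtraction cannot wrap around.
    diff = bright.astype(np.int16) - dark.astype(np.int16)
    return diff > thresh

bright = np.full((5, 5), 30, dtype=np.uint8)
bright[2, 2] = 200                       # retro-reflecting pupil pixel
dark = np.full((5, 5), 30, dtype=np.uint8)
mask = extract_pupil_mask(bright, dark)  # True only at the pupil pixel
```

The signed cast is the one practical subtlety: subtracting two unsigned 8-bit images directly would wrap negative differences around to large positive values and produce false pupil pixels.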
A pupil imaging method comprising:
emitting light of a predetermined wavelength from a light source toward at least a part of the face, including one or more pupils, of a subject; and
forming, on one image sensor, images of the light from the at least part of the face including the one or more pupils as seen from at least two different viewpoints, each lying on one of two optical axes substantially parallel to the direction in which the light is emitted.
PCT/JP2014/005642 2013-11-19 2014-11-10 Imaging device, pupil imaging device, pupil-diameter measurement device, pupil-state detection device, and pupil imaging method Ceased WO2015075894A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2015548974A JPWO2015075894A1 (en) 2013-11-19 2014-11-10 Imaging device, pupil imaging device, pupil diameter measuring device, pupil state detection device, and pupil imaging method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013238493 2013-11-19
JP2013-238493 2013-11-19

Publications (1)

Publication Number Publication Date
WO2015075894A1 true WO2015075894A1 (en) 2015-05-28

Family

ID=53179185

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/005642 Ceased WO2015075894A1 (en) 2013-11-19 2014-11-10 Imaging device, pupil imaging device, pupil-diameter measurement device, pupil-state detection device, and pupil imaging method

Country Status (2)

Country Link
JP (1) JPWO2015075894A1 (en)
WO (1) WO2015075894A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0870474A (en) * 1994-08-29 1996-03-12 Sanyo Electric Co Ltd Stereoscopic image pickup unit and stereoscopic image recording and reproducing device
JP2004261598A (en) * 2003-02-28 2004-09-24 Agilent Technol Inc Pupil detection apparatus and method
WO2007125794A1 (en) * 2006-04-27 2007-11-08 Konica Minolta Holdings, Inc. Data measuring device and data measuring method
JP2008246013A (en) * 2007-03-30 2008-10-16 National Univ Corp Shizuoka Univ Sleepiness detection device
JP2012068937A (en) * 2010-09-24 2012-04-05 Panasonic Corp Pupil detection device and pupil detection method

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017179938A1 (en) * 2016-04-15 2017-10-19 이문기 Device for photographing eye
KR20170118618A (en) * 2016-04-15 2017-10-25 이문기 Eye capturing device

Also Published As

Publication number Publication date
JPWO2015075894A1 (en) 2017-03-16

Similar Documents

Publication Publication Date Title
EP2731490B1 (en) System and method for remote measurement of optical focus
CN106062665B (en) User interface for optical sensing and tracking based on user's eye movement and position
CN100586364C (en) fundus observation device
US10204262B2 (en) Infrared imaging recognition enhanced by 3D verification
WO2010143377A1 (en) Fixation-object determination device and method
JP6346525B2 (en) Gaze detection device
CN104154898B (en) A kind of initiative range measurement method and system
CN100998495A (en) Fundus observation device
JP6631951B2 (en) Eye gaze detection device and eye gaze detection method
CN108259887B (en) Gaze point calibration method and device, gaze point calibration method and device
JP4971532B1 (en) Stereoscopic image capturing apparatus and endoscope
CN108040243A (en) Multispectral 3-D visual endoscope device and image interfusion method
WO2018031249A1 (en) Hybrid imaging sensor for structured light object capture
JP2013228557A (en) Display device and control method thereof
CN108073016A (en) Image forming apparatus
CN106767526A (en) A kind of colored multi-thread 3-d laser measurement method based on the projection of laser MEMS galvanometers
JP2015231498A (en) Endoscope device
JP2020515346A (en) Ophthalmic imaging device and system
CN109964230A (en) Method and apparatus for eye metric acquisition
CN103654699A (en) Fluorescence excitation binocular endoscopic system
JP6210483B2 (en) 3D shape acquisition device from stereoscopic endoscope image
JP2019052857A (en) Imaging apparatus
JP5587756B2 (en) Optical distance measuring device, distance measuring method of optical distance measuring device, and distance measuring program
JP2015197358A (en) Road surface detection system
WO2015075894A1 (en) Imaging device, pupil imaging device, pupil-diameter measurement device, pupil-state detection device, and pupil imaging method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14863686

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2015548974

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14863686

Country of ref document: EP

Kind code of ref document: A1