
WO2020137908A1 - Vehicle lighting apparatus and vehicle - Google Patents


Info

Publication number
WO2020137908A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
pixels
photodetector
image
intensity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2019/050170
Other languages
English (en)
Japanese (ja)
Inventor
真太郎 杉本
祐介 笠羽
安男 中村
正人 五味
修己 山本
健人 新田
修 廣田
祐太 春瀬
輝明 鳥居
健佑 荒井
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koito Manufacturing Co Ltd
Original Assignee
Koito Manufacturing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koito Manufacturing Co Ltd filed Critical Koito Manufacturing Co Ltd
Priority to JP2020563214A priority Critical patent/JP7408572B2/ja
Priority to CN201980085931.7A priority patent/CN113227838B/zh
Publication of WO2020137908A1 publication Critical patent/WO2020137908A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/02Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
    • B60Q1/04Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights
    • FMECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21LIGHTING
    • F21SNON-PORTABLE LIGHTING DEVICES; SYSTEMS THEREOF; VEHICLE LIGHTING DEVICES SPECIALLY ADAPTED FOR VEHICLE EXTERIORS
    • F21S41/00Illuminating devices specially adapted for vehicle exteriors, e.g. headlamps
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging

Definitions

  • the present invention relates to a vehicle lamp.
  • An object identification system that senses the position and type of objects existing around the vehicle is used for automatic driving and automatic control of headlamp light distribution.
  • the object identification system includes a sensor and an arithmetic processing unit that analyzes the output of the sensor.
  • the sensor is selected from cameras, LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), millimeter wave radar, ultrasonic sonar, etc., in consideration of application, required accuracy and cost.
  • Ghost imaging irradiates an object while randomly switching the intensity distribution (pattern) of reference light, and measures the light detection intensity of reflected light for each pattern.
  • the photodetection intensity is an integrated value of energy or intensity over a certain plane, not an intensity distribution. Then, the image of the object is reconstructed by taking the correlation between the corresponding pattern and the light detection intensity.
  • One of the exemplary purposes of the first aspect of the present invention is to accurately irradiate a distant object with reference light.
  • the amount of correlation calculation increases explosively with the number of pixels of the restored image. Specifically, when the number of random reference-light irradiations is M and the number of pixels is (X × Y), the number of calculations becomes M × (X × Y)².
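To make this scaling concrete, here is a small arithmetic sketch; M and the resolution are hypothetical values chosen for illustration, not figures from the specification:

```python
# Naive correlation-calculation count in ghost imaging, as stated above:
# M irradiations with an X x Y restored image -> M * (X * Y)**2 operations.
M = 1000       # number of random reference-light irradiations (assumed)
X, Y = 64, 64  # restored-image resolution (assumed)

operations = M * (X * Y) ** 2
print(operations)  # over 1.6e10 operations even for a modest 64x64 image
```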
  • One of the exemplary purposes of the second aspect of the present invention is to provide an imaging apparatus or an imaging method with a reduced amount of calculation.
  • Patterning devices such as DMD (Digital Micromirror Device) and liquid crystal are used for patterning the reference light.
  • the DMD and the liquid crystal have a plurality of pixels arranged in a matrix, and the reflectance and the transmittance can be controlled for each pixel.
  • the in-vehicle imaging device can detect various objects such as cars, people, motorcycles and bicycles, structures, and plants and animals.
  • the situation in which the imaging device is used also changes greatly depending on the traveling environment such as weather, time of day, traveling road, traveling speed, and the like.
  • the imaging device itself moves, the objects also move, and their relative movement directions are various.
  • One of the exemplary objects of the third aspect of the present invention is to provide an illuminating device suitable for imaging for a specific application.
  • One of the exemplary objects of the fourth aspect of the present invention is to provide an illumination device suitable for a distant imaging device.
  • One of the exemplary purposes of the fifth aspect of the present invention is to provide an imaging device capable of reducing the number of irradiations.
  • the first aspect of the present invention relates to a vehicle lamp.
  • the vehicular lamp includes a headlight and a pseudo heat light source.
  • the pseudo heat light source can irradiate the object while randomly changing the intensity distribution of the reference light.
  • the pseudo thermal light source constitutes an imaging device together with a photodetector that measures reflected light from an object, and an arithmetic processing device that reconstructs a restored image of the object based on the output of the photodetector and the intensity distribution of the reference light. At least some components of the headlamp are shared with the pseudo heat light source.
  • the headlight is designed with a light source and an optical system for the purpose of irradiating light up to several tens of meters ahead. Therefore, by incorporating the pseudo heat light source of the imaging device into the vehicle lamp and diverting some of the components of the headlight to the pseudo heat light source, a distant object can be accurately irradiated with the reference light. The overall cost can also be reduced.
  • the pseudo heat light source may share the optical system with the headlight.
  • the headlamp optics may include a patterning device that controls the light distribution.
  • the pseudo heat source and the headlight may share a patterning device.
  • the headlight optical system may include a reflector that reflects the light emitted from the light source toward the front of the vehicle.
  • the pseudo heat source and the headlight may share a reflector.
  • the reference light may be infrared or ultraviolet.
  • the pseudo heat light source may share the light source with the headlight.
  • the reference light may be white light.
  • the entire headlamp may be operable as a pseudo heat source for the imaging device.
  • the second aspect of the present invention relates to an in-vehicle imaging device.
  • the in-vehicle imaging device includes an illumination device that divides the measurement range into a plurality of sections and irradiates reference light having a random intensity distribution while switching among the sections, a photodetector that measures reflected light from an object, and an arithmetic processing unit that, for each of the plurality of sections, reconstructs a restored image of the portion of the object included in that section, based on the detected intensity obtained from the output of the photodetector and the intensity distribution of the reference light.
  • the calculation time can be reduced.
  • the number of multiple partitions may be set so that the amount of decrease in the computation time due to the division is larger than the amount of increase in the measurement time. This enables high-speed sensing.
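The trade-off can be sketched numerically. Under the naive cost model M × (number of pixels)² described for the second aspect, splitting an X × Y measurement range into K sections reduces the correlation cost K-fold while multiplying the number of irradiations by K (K, M, and the resolution below are hypothetical):

```python
def correlation_cost(m: int, pixels: int) -> int:
    """Naive ghost-imaging reconstruction cost: m * pixels**2."""
    return m * pixels ** 2

M = 1000       # irradiations per section (assumed)
X, Y = 64, 64  # full restored-image resolution (assumed)
K = 4          # number of sections (assumed)

undivided = correlation_cost(M, X * Y)
# Each section reconstructs only (X*Y)/K pixels; all K sections together:
divided = K * correlation_cost(M, (X * Y) // K)

print(undivided // divided)  # computation shrinks by a factor of K
print(K * M)                 # but total irradiations grow from M to K*M
```

The number of sections is then chosen so that the K-fold computation saving outweighs the growth in measurement time caused by the K × M total irradiations.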
  • a third aspect of the present invention relates to an illumination device used for an imaging device based on ghost imaging.
  • the lighting device has a plurality of pixels arranged in a matrix, and is configured to be able to modulate the light intensity distribution based on a combination of ON and OFF of the plurality of pixels.
  • the intensity distribution is controlled in units of pixel blocks including at least one pixel, and the pixel blocks are variable.
  • the illumination device has a plurality of pixels arranged in a matrix, and is configured to be able to modulate the light intensity distribution based on a combination of ON and OFF of the plurality of pixels.
  • the intensity distribution is controlled by a combination of predetermined patterns including two or more ON pixels and OFF pixels.
  • a fourth aspect of the present invention is a lighting device.
  • the lighting device has a plurality of pixels arranged in a matrix, and is configured to be able to modulate the light intensity distribution based on a combination of ON and OFF of the plurality of pixels. On/off of a plurality of pixels is controlled under a predetermined constraint condition.
  • a fifth aspect of the invention relates to an imaging device.
  • the imaging device includes an illumination device that illuminates an object while changing the intensity distribution of the reference light in a plurality of M ways, a photodetector that measures the reflected light from the object for each of the plurality of intensity distributions I 1 to I M , and an arithmetic processing unit that reconstructs a restored image of the object by correlating the plurality of intensity distributions I 1 to I M with a plurality of detected intensities b 1 to b M based on the output of the photodetector.
  • the plurality of intensity distributions I 1 to I M are determined by (i) modeling the transfer characteristics of the path from the illumination device through the object to the photodetector, and (ii) defining a reference object and its corresponding reference image.
  • According to the first aspect of the present invention, a distant object can be accurately irradiated with the reference light. According to the second aspect, the amount of calculation can be reduced while obtaining high resolution. According to the third aspect, an illumination device suitable for imaging for a specific application can be provided. According to the fourth aspect, an illumination device suitable for an imaging device for a distant object can be provided. According to the fifth aspect, the number of irradiations can be reduced.
  • FIG. 1 is a diagram showing a vehicle lighting device according to a first embodiment.
  • FIG. 6 is a diagram showing a vehicle lamp according to a second embodiment.
  • FIG. 10 is a diagram illustrating first pattern control in the second embodiment.
  • FIG. 9 is a diagram illustrating second pattern control in the second embodiment.
  • FIG. 7 is a diagram showing a vehicle lamp according to a third embodiment.
  • FIG. 9 is a diagram illustrating a first control in the third embodiment.
  • FIG. 10 is a diagram illustrating third control in the third embodiment. Further figures show a block diagram of the object identification system and a diagram of an automobile.
  • FIG. 6 is a diagram showing an imaging device according to a second embodiment.
  • FIG. 7 is a diagram illustrating the intensity distribution of reference light according to the second embodiment. A further figure illustrates the trade-off between calculation time and measurement time.
  • 16(a) and 16(b) are diagrams showing a modified example of a section.
  • FIG. 7 is a diagram showing an imaging device according to a third embodiment.
  • FIGS. 18A to 18C are views for explaining the pixels of the DMD that is the patterning device.
  • 19A to 19D are diagrams showing pixel blocks B having different sizes.
  • 20A and 20B are diagrams showing an example of the pattern signal PTN based on the pixel blocks B having different sizes.
  • 22A and 22B are diagrams illustrating the layout of the pixel blocks B having different sizes according to the running scene. A further figure illustrates the dynamic layout of pixel blocks B having different sizes.
  • 24A to 24C are diagrams showing a pixel block B according to Modification 3.1.
  • 25A to 25D are diagrams showing a pixel block B according to Modification 3.2.
  • 26A to 26D are diagrams showing pixel blocks B having different shapes.
  • 27A to 27C are diagrams showing an example of the pattern signal PTN based on pixel blocks having different shapes.
  • FIGS. 28A and 28B are views for explaining sensing based on the pattern signal PTN having the pixel block B having a characteristic shape.
  • 29A to 29D are diagrams for explaining the pattern block PB according to the embodiment 3.5.
  • 30A and 30B are diagrams showing examples of pattern signals based on the combination of pattern blocks.
  • FIGS. 31A and 31B are diagrams for explaining the improvement of the spatial incoherence of the reference light.
  • 32A and 32B are diagrams showing examples of intensity distributions that can improve spatial incoherence.
  • FIGS. 33A to 33D are diagrams for explaining the pattern control with the lighting rate as a constraint condition.
  • 34(a) and 34(b) are diagrams for explaining the control of the lighting rate according to the modification. A further figure shows the imaging device according to Embodiment 4.
  • FIG. 6 is a flowchart showing a method of determining a set of a plurality of intensity distributions I 1 to I M . A further figure illustrates the relationship between a reference object and the reference image T(x, y).
  • A further figure shows a set of 100 intensity distributions I 1 to I 100 obtained for M = 100. Other figures show the restored image obtained using the optimized set of intensity distributions and the restored image obtained using a set of random intensity distributions.
  • the statement that the intensity distribution is random in the present specification does not mean that it is completely random; it suffices that the distribution is random enough for an image to be reconstructed by ghost imaging. Therefore, “random” in the present specification can include a certain degree of regularity. Also, “random” does not need to be completely unpredictable; it may be predictable and reproducible.
  • FIG. 1 is a diagram showing a vehicle lamp 400 according to the first embodiment.
  • the vehicle lamp (or lamp system) 400 includes a headlamp 410 and a pseudo heat light source 420.
  • the headlamp 410 and the pseudo heat light source 420 are housed in the housing 402.
  • the front surface of the housing 402 is covered with a transparent cover 404.
  • the headlight 410 includes a low beam, a high beam, or both, and emits a beam Sb for forming a light distribution in front of the vehicle.
  • the pseudo heat light source 420 constitutes the imaging device 100 together with the photodetector 120 and the arithmetic processing device 130.
  • the photodetector 120 and the arithmetic processing unit 130 may be built in the housing 402 or may be provided outside the housing 402.
  • the imaging apparatus 100 is a correlation function image sensor (also referred to as single-pixel imaging) that uses the principle of ghost imaging, and includes a pseudo thermal light source 110 (the pseudo heat light source 420 in FIG. 1), a photodetector 120, and an arithmetic processing unit 130.
  • the imaging device 100 is also called a quantum radar camera.
  • the pseudo heat light source 110 generates the reference light S1 having the intensity distribution I(x, y) that can be regarded as substantially random, and irradiates the object OBJ. Irradiation of the reference light S1 onto the object OBJ is performed while changing its intensity distribution according to a plurality of M patterns.
  • the pseudo-thermal light source 110 may include, for example, a light source 112 that generates a light S0 having a uniform intensity distribution, and a patterning device 114 that can spatially modulate the intensity distribution I of the light S0.
  • the light source 112 may use a laser, a light emitting diode, or the like.
  • the wavelength and spectrum of the reference light S1 are not particularly limited, and may be white light having a plurality of or continuous spectra, or monochromatic light having a predetermined wavelength.
  • a DMD (Digital Micromirror Device) or a liquid crystal device can be used as the patterning device 114.
  • the pattern signal PTN (image data) designating the intensity distribution I is given to the patterning device 114 from the arithmetic processing unit 130. Therefore, the arithmetic processing unit 130 knows the intensity distribution I r of the reference light S1 currently irradiating the object OBJ.
  • the photodetector 120 measures the reflected light from the object OBJ and outputs a detection signal D r .
  • the detection signal D r is a spatial integral value of the light energy (or intensity) incident on the photodetector 120 when the object OBJ is irradiated with reference light having the intensity distribution I r . Therefore, a single-pixel photodetector can be used as the photodetector 120.
  • the photodetector 120 outputs a plurality of detection signals D 1 to D M corresponding to a plurality of M intensity distributions I 1 to I M, respectively.
  • the arithmetic processing unit 130 includes a pattern generator 132 and a reconstruction processing unit 134.
  • the arithmetic processing unit 130 can be implemented by a combination of a processor (hardware), such as a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or a microcomputer, and a software program executed by that processor.
  • the arithmetic processing unit 130 may be a combination of a plurality of processors. Alternatively, the arithmetic processing unit 130 may be composed of only hardware.
  • the intensity distribution of the reference light S1 generated by the pseudo heat light source 110 may be randomly generated each time.
  • a set of a plurality of intensity distributions I 1 to I M may be defined in advance.
  • a set of the plurality of pattern signals PTN 1 to PTN M defining the plurality of intensity distributions I 1 to I M may be stored in advance in a memory (pattern memory) inside the pattern generator 132.
  • the reconstruction processing unit 134 reconstructs the restored image G(x, y) of the object OBJ by correlating the plurality of intensity distributions I 1 to I M and the plurality of detected intensities b 1 to b M.
  • the detection intensities b 1 to b M are based on the detection signals D 1 to D M. The relationship between the detection intensity and the detection signal may be determined in consideration of the type and method of the photodetector 120.
  • the detection signal D r is assumed to represent the amount of light received at a certain time (or a minute time), that is, an instantaneous value.
  • the detection signal D r may be sampled multiple times during the irradiation period, and the detection intensity b r may be an integrated value, an average value, or a maximum value of all sampling values of the detection signal D r .
  • some of all the sampling values may be selected, and the integrated value, average value, or maximum value of the selected sampling values may be used.
  • for example, the x-th to y-th sampling values counting down from the maximum may be extracted, sampling values lower than an arbitrary threshold may be excluded, or sampling values may be extracted from a range in which the magnitude of the signal fluctuation is small.
  • the output D r of the photodetector 120 can be directly used as the detection intensity b r .
  • the conversion from the detection signal D r to the detection intensity b r may be performed by the arithmetic processing device 130 or may be performed outside the arithmetic processing device 130.
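The conversion options above can be sketched as follows; the function name, mode names, and sample values are illustrative, not from the specification:

```python
import statistics

def detection_intensity(samples, mode="average", threshold=None):
    """Convert sampled detection signals D_r into one detected intensity b_r.

    Sketches the options described above: integrate, average, or take the
    maximum of the sampling values, optionally discarding values below a
    threshold first.
    """
    if threshold is not None:
        samples = [s for s in samples if s >= threshold]
    if mode == "integral":
        return sum(samples)
    if mode == "average":
        return statistics.mean(samples)
    if mode == "maximum":
        return max(samples)
    raise ValueError(f"unknown mode: {mode}")

samples = [0.2, 0.9, 1.1, 0.8, 0.1]  # hypothetical samples of D_r
print(detection_intensity(samples, "average"))        # mean of all samples
print(detection_intensity(samples, "integral", 0.5))  # sum of samples >= 0.5
```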
  • the restored image G(x, y) is obtained by the correlation calculation G(x, y) = (1/M) Σ_{r=1}^{M} b_r · I_r(x, y), where I r is the r-th intensity distribution and b r is the r-th detected intensity value.
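The whole measurement-and-reconstruction loop can be sketched as a toy simulation. All values are hypothetical, and the mean detected intensity is subtracted from each b_r here, a common variant of the correlation that zeroes the background:

```python
import random

random.seed(0)
X, Y, M = 8, 8, 2000  # toy resolution and irradiation count (assumed)

# Hypothetical object OBJ: reflectivity 1 inside a small square, 0 outside.
obj = [[1.0 if 2 <= x < 5 and 2 <= y < 5 else 0.0 for x in range(X)]
       for y in range(Y)]

patterns, intensities = [], []
for _ in range(M):
    # Random intensity distribution I_r (the reference-light pattern).
    I_r = [[random.randint(0, 1) for _ in range(X)] for _ in range(Y)]
    # Single-pixel detector: spatial integral of the reflected light.
    b_r = sum(I_r[y][x] * obj[y][x] for y in range(Y) for x in range(X))
    patterns.append(I_r)
    intensities.append(b_r)

b_mean = sum(intensities) / M
# Correlation: G(x, y) = (1/M) * sum_r (b_r - <b>) * I_r(x, y)
G = [[sum((intensities[r] - b_mean) * patterns[r][y][x] for r in range(M)) / M
      for x in range(X)] for y in range(Y)]

# Object pixels correlate positively with bright patterns; background stays
# near zero, so the square emerges in G.
print(G[3][3], G[0][0])
```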
  • the pseudo heat light source 420 is built in the vehicle lamp 400. At least a part of the components (common member) 430 of the headlamp 410 is shared with the pseudo heat light source 420.
  • the above is the configuration of the vehicle lamp 400.
  • the headlight 410 is designed with a light source and an optical system for the purpose of irradiating light up to several tens of meters ahead. Therefore, by incorporating the pseudo heat light source 420 of the imaging device into the vehicle lamp 400 and diverting some of the components of the headlight 410 to the pseudo heat light source 420, a distant object can be accurately irradiated with the reference light S1. Moreover, since the number of duplicated members can be reduced, the overall cost can be reduced.
  • the present invention extends to various devices and methods understood as the block diagram of FIG. 1 or derived from the above description, and is not limited to a specific configuration.
  • more specific configuration examples and examples will be described in order to help understanding of the essence and operation of the invention and to clarify them, not to narrow the scope of the invention.
  • FIG. 3 is a diagram showing a vehicular lamp 400A according to the first embodiment.
  • the pseudo heat light source 420 shares an optical system with the headlight 410.
  • the headlamp 410 includes a light source 412 and a reflector 414.
  • the light source 412 includes a white light emitting diode (or a semiconductor laser) and its lighting circuit.
  • the reflector 414 reflects the light emitted from the light source 412 toward the front of the vehicle.
  • the reflector 414 of the headlight 410 is the common member 430 of FIG. 1 and is shared with the pseudo heat light source 420.
  • Pseudo thermal light source 420 includes light source 422 and patterning device 424. The beam whose intensity distribution is randomized by the patterning device 424 is reflected by the reflector 414 toward the front of the vehicle.
  • the light S0 generated by the light source 422 may be infrared light or ultraviolet light.
  • the photodetector 120 may be configured to be insensitive in the visible wavelength band and to have sensitivity only to the wavelength of the light S0 (reference light S1). Thereby, sensing by the imaging device 100 is not affected by the headlight.
  • the light S0 generated by the light source 422 may include a single wavelength in the visible range or may be white light.
  • the photodetector 120 is sensitive to both the beam Sb of the headlight 410 and the reference light S1.
  • the influence of the beam Sb on the output of the photodetector 120 may be reduced by arithmetic processing.
  • since the beam Sb can be regarded as a direct-current component and the reference light S1 as an alternating-current component, the influence of the beam Sb may be reduced by a high-pass filter.
  • the offset process of subtracting the estimated value of the component caused by the beam Sb may be performed.
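A minimal sketch of that offset processing, assuming the steady beam Sb adds a near-constant component to every detection while the patterned reference light S1 fluctuates from pattern to pattern (all values are hypothetical):

```python
def remove_beam_offset(detections):
    """Subtract the estimated DC component contributed by the beam Sb.

    Illustrative only: the mean over many random-pattern detections is used
    as the estimate of the beam-induced offset, leaving the fluctuating
    reference-light (AC) component.
    """
    offset = sum(detections) / len(detections)
    return [d - offset for d in detections]

# Hypothetical detector outputs: constant beam contribution (5.0) plus a
# pattern-dependent reference-light contribution.
raw = [5.0 + s for s in (0.1, 0.4, 0.2, 0.3)]
print(remove_beam_offset(raw))  # values now fluctuate around zero
```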
  • FIG. 4 is a diagram illustrating a vehicular lamp 400B according to the second embodiment. Also in the second embodiment, the pseudo heat light source 420 shares the optical system with the headlight 410.
  • the headlight 410 is a variable light distribution lamp (ADB: Adaptive Driving Beam), and includes a light source 412, a patterning device 416, and a reflector 414.
  • the light source 412 includes a white LED or LD and its lighting circuit.
  • the patterning device 416 is, for example, a DMD, and spatially modulates the intensity distribution of the light emitted from the light source 412 so as to obtain a desired light distribution pattern.
  • the reflector 414 reflects, toward the front of the vehicle, the light flux corresponding to the ON pixels in the light reflected by the patterning device 416.
  • the patterning device 416 and the reflector 414 are the common member 430 shared with the pseudo heat light source 420.
  • the emitted light S0 of the light source 422 is incident on the patterning device 416 and is randomly modulated, and the reference light S1 is generated.
  • since the patterning device 416 is shared, it is necessary to suppress the mutual influence between the pattern (intensity distribution) of the beam Sb and the pattern of the reference light S1.
  • the light source 422 and the light source 412 may be complementarily turned on.
  • the light source 412 and the light source 422 are alternately turned on in a time-sharing manner; during the lighting period of the light source 412, image data (light distribution image data) PTNb corresponding to the light distribution pattern is given to the patterning device 416, and during the lighting period of the light source 422, random image data PTN 1 to PTN M are given to the patterning device 416 (first pattern control).
  • FIG. 5 is a diagram illustrating the first pattern control in the second embodiment.
  • FIG. 6 is a diagram illustrating the second pattern control in the second embodiment. ON indicates an irradiation area defined by the light distribution pattern, and OFF indicates a light shielding area defined by the light distribution pattern.
  • the random image data PTN i has pixel values in which 1 and 0 are randomly distributed.
  • the pattern of FIG. 6 can be generated by calculating the logical product of the random image data PTN i and the light distribution image data PTNb.
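That logical product can be sketched directly, with 1 for an ON pixel and 0 for an OFF pixel; the tiny 2 × 3 patterns are hypothetical:

```python
def combine(ptn_i, ptn_b):
    """Pixel-wise logical product of random image PTN_i and PTNb."""
    return [[r & b for r, b in zip(row_r, row_b)]
            for row_r, row_b in zip(ptn_i, ptn_b)]

PTN_i = [[1, 0, 1],
         [0, 1, 1]]  # random image data
PTNb  = [[1, 1, 0],
         [1, 1, 0]]  # light distribution: right column is a shielded area

print(combine(PTN_i, PTNb))  # random pattern only inside the irradiation area
```

Pixels in the light-shielding area of PTNb stay OFF regardless of the random pattern, so the reference light never enters the shielded region.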
  • FIG. 7 is a diagram illustrating a vehicle lamp 400C according to the third embodiment.
  • in the third embodiment, all the components of the headlight 410 are shared with the pseudo heat light source 420. That is, the headlight 410 itself has the function of the pseudo heat light source 420.
  • the reference light S1 naturally becomes white light.
  • FIG. 8 is a diagram illustrating the first control in the third embodiment. As shown in FIG. 8, the pattern of the reference light S1 may be switched a plurality of times during one sensing period Ts.
  • the second pattern control of the second embodiment may be performed.
  • FIG. 9 is a diagram illustrating the third control in the third embodiment.
  • Some patterning devices such as DMDs are capable of gradation control.
  • corresponding pixel values of the random image data PTN i and the light distribution image data PTNb may be added together and given to the patterning device 416.
  • the intensity distribution of the outgoing beam of the vehicular lamp 400 then fluctuates randomly over time around the base level determined by the light distribution image data PTNb.
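With a gradation-capable patterning device, the addition can be sketched as a clipped per-pixel sum; the 8-bit depth and the sample values below are hypothetical:

```python
MAX_LEVEL = 255  # assumed gradation depth of the patterning device

def add_patterns(ptn_i, ptn_b):
    """Per-pixel sum of random image PTN_i and base image PTNb, clipped
    to the device's maximum gradation level."""
    return [[min(r + b, MAX_LEVEL) for r, b in zip(row_r, row_b)]
            for row_r, row_b in zip(ptn_i, ptn_b)]

PTN_i = [[40, 0], [0, 40]]    # random modulation on top of the base level
PTNb  = [[200, 200], [0, 0]]  # base light distribution image
print(add_patterns(PTN_i, PTNb))
```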
  • in the above description, the pseudo heat light source 420 is composed of the combination of the light source 422 and the patterning device 424, but it is not limited thereto. The pseudo heat light source 420 may instead be composed of an array of a plurality of semiconductor light sources (LEDs (light emitting diodes) or LDs (laser diodes)) arranged in a matrix, configured so that the ON/OFF state (or brightness) of each semiconductor light source can be controlled individually.
  • likewise, the illumination device 110 is composed of the combination of the light source 112 and the patterning device 114, but it is not limited thereto. The illumination device 110 may instead be composed of an array of a plurality of semiconductor light sources (LEDs or LDs) arranged in a matrix, configured so that the ON/OFF state (or brightness) of each semiconductor light source can be controlled individually.
  • FIG. 10 is a block diagram of the object identification system 10.
  • the object identification system 10 is mounted on a vehicle such as an automobile or a motorcycle and determines the type (category) of an object OBJ existing around the vehicle.
  • the object identification system 10 includes an imaging device 100 and an arithmetic processing device 40. As described above, the imaging apparatus 100 irradiates the object OBJ with the reference light S1 and measures the reflected light S2 to generate the restored image G of the object OBJ.
  • the arithmetic processing device 40 processes the output image G of the imaging device 100 and determines the position and type (category) of the object OBJ.
  • the classifier 42 of the arithmetic processing device 40 receives the image G as an input and determines the position and type of the object OBJ included in the image G.
  • the classifier 42 is implemented based on the model generated by machine learning.
  • the algorithm of the classifier 42 is not particularly limited; YOLO (You Only Look Once), SSD (Single Shot MultiBox Detector), R-CNN (Region-based Convolutional Neural Network), SPPnet (Spatial Pyramid Pooling), Faster R-CNN, DSSD (Deconvolutional SSD), Mask R-CNN, etc. can be adopted, or an algorithm developed in the future can be adopted.
  • Information about the object OBJ detected by the arithmetic processing unit 40 may be used for light distribution control of the vehicular lamp 200. Specifically, an appropriate light distribution pattern can be generated based on the information on the type and position of the object OBJ generated by the arithmetic processing device 40.
  • Information regarding the object OBJ detected by the arithmetic processing unit 40 may be transmitted to the vehicle-side ECU.
  • the vehicle-side ECU may perform automatic driving based on this information.
  • the above is the configuration of the object identification system 10.
  • noise resistance is significantly improved. For example, it is difficult to recognize the object OBJ with the naked eye when it is raining, snowing, or traveling in fog, but even in such a situation, a restored image G of the object OBJ can be obtained without being affected by the rain, snow, or fog.
  • FIG. 11 is a diagram showing an automobile.
  • the automobile 300 includes vehicle lamps 302L and 302R.
  • the pseudo heat light source 420 is built in at least one of the vehicular lamps 302L and 302R in a mode in which a part of the hardware is shared with the headlight.
  • FIG. 12 is a block diagram showing a vehicle lamp 200 including an object detection system 210.
  • the vehicle lamp 200 constitutes a lamp system 310 together with the vehicle-side ECU 304.
  • the vehicular lamp 200 includes a light source 202, a lighting circuit 204, and an optical system 206. Further, the vehicle lighting device 200 is provided with an object detection system 210.
  • the object detection system 210 corresponds to the object identification system 10 described above, and includes the imaging device 100 and the arithmetic processing device 40.
  • Information about the object OBJ detected by the arithmetic processing unit 40 may be used for light distribution control of the vehicular lamp 200.
  • the lamp-side ECU 208 generates an appropriate light distribution pattern based on the information regarding the type and the position of the object OBJ generated by the arithmetic processing device 40.
  • the lighting circuit 204 and the optical system 206 operate so that the light distribution pattern generated by the lamp-side ECU 208 is obtained.
  • Information regarding the object OBJ detected by the arithmetic processing unit 40 may be transmitted to the vehicle-side ECU 304.
  • the vehicle-side ECU may perform automatic driving based on this information.
  • FIG. 13 is a diagram showing the imaging apparatus 100 according to the second embodiment.
  • the imaging device 100 is a correlation function image sensor that uses the principle of ghost imaging, and includes an illumination device 110, a photodetector 120, and an arithmetic processing device 130.
  • the imaging device 100 is also called a quantum radar camera.
  • The illumination device 110 is a pseudo-thermal light source; it generates reference light S1 having an intensity distribution I that can be regarded as substantially random, and irradiates the object OBJ with it.
  • FIG. 14 is a diagram illustrating the intensity distribution of the reference light S1 according to the second embodiment. In the figure, the part where the intensity is zero is shown in white, and the part where the intensity is not zero is shown in black.
  • The illumination device 110 emits the reference light S1, whose intensity distribution I(x, y) within the illuminated section is substantially random, while switching the section 602_i being illuminated (referred to as the irradiation section).
  • the intensity in the sections other than the irradiation section (called non-irradiation section) is zero.
  • Each of the plurality of sections 602_1 to 602_N is irradiated with the reference light S1 having M random intensity distributions. Therefore, the total number of irradiation times per sensing is M ⁇ N.
  • When the i-th section (1 ≤ i ≤ N) is selected and the j-th intensity distribution is I i,j , the reference light at that time is denoted S1 i,j .
  • the photodetector 120 measures the reflected light from the object OBJ and outputs a detection signal D.
  • The detection signal D i,j is the spatial integral of the light energy (or intensity) incident on the photodetector 120 when the object OBJ is irradiated with the reference light having the intensity distribution I i,j . A single-pixel photodetector can therefore be used as the photodetector 120.
  • In response to the reference light S1 i,j (1 ≤ i ≤ N, 1 ≤ j ≤ M) having the M × N intensity distributions I 1,1 to I 1,M , I 2,1 to I 2,M , …, I N,1 to I N,M , the photodetector 120 outputs M × N detection signals D i,j .
  • the order of irradiation is not particularly limited.
  • After the M irradiations for one section are completed, the next irradiation section may be selected.
  • the order of selecting the irradiation sections is not particularly limited, and the irradiation sections can be selected according to a predetermined rule.
  • For example, the sections may be selected row by row, from left to right within each row, moving to the next row after reaching the rightmost section.
  • Alternatively, the sections may be selected column by column, from top to bottom within each column, moving to the next column after reaching the bottom.
  • The illuminator 110 may include, for example, a light source 112 that generates light S0 having a uniform intensity distribution, and a patterning device 114 that can spatially modulate the intensity distribution of the light S0.
  • the light source 112 may use a laser, a light emitting diode, or the like.
  • the wavelength and spectrum of the reference light S1 are not particularly limited, and may be white light having a plurality of or continuous spectra, or monochromatic light having a predetermined wavelength.
  • As the patterning device 114, a DMD (Digital Micromirror Device) or a liquid crystal device can be used. In this embodiment, the patterning device 114 covers the entire measurement range 600 and is capable of illuminating the entire measurement range 600 at once; by turning off the pixels of the patterning device 114 corresponding to the non-irradiation sections, a random pattern can be given only to the irradiation section.
  • a pattern signal PTN i,j (image data) designating the intensity distribution I i,j is given to the patterning device 114 from the arithmetic processing unit 130. Therefore, the arithmetic processing unit 130 knows the current position of the irradiation section and the intensity distribution I i,j of the reference light S1.
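  • The pattern signal described above can be sketched as image data that is dark everywhere except the current irradiation section. The following is a minimal illustration; the function name, frame sizes, and the binary-valued random pattern are assumptions for illustration, not the patent's implementation:

```python
import numpy as np

def make_pattern(X, Y, section, rng):
    """Build one pattern signal PTN_ij: a (Y, X) frame that is zero (OFF)
    everywhere except the current irradiation section, which is filled
    with a random binary intensity distribution I_ij."""
    x0, y0, w, h = section  # top-left corner and size of section 602_i
    ptn = np.zeros((Y, X), dtype=np.uint8)
    ptn[y0:y0 + h, x0:x0 + w] = rng.integers(0, 2, size=(h, w))
    return ptn

rng = np.random.default_rng(0)
ptn = make_pattern(8, 8, (2, 2, 4, 4), rng)  # section at (2, 2), 4x4 pixels
```

Only the pixels inside the irradiation section carry a random pattern; all other pixels stay off, matching the sectioned illumination of FIG. 14.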
  • the arithmetic processing unit 130 includes a pattern generator 132 and a reconstruction processing unit 134.
  • the pattern generator 132 may randomly generate the intensity distribution I i,j of the reference light S1 each time.
  • the pattern generator 132 may include a pseudo random signal generator.
  • a set of a plurality of intensity distributions I i,j may be defined in advance.
  • a plurality (for example, M) of sets of intensity distributions I 1 to I M having the same size as the partition 602 may be defined in advance.
  • I 1 to I M may be assigned to the irradiation section in order or randomly.
  • a set of a plurality of pattern signals defining a plurality of intensity distributions I 1 to I M may be held in advance in a memory (pattern memory) inside the pattern generator 132.
  • the arithmetic processing unit 130 can be implemented by a combination of a processor (hardware) such as a CPU (Central Processing Unit), an MPU (Micro Processing Unit), and a microcomputer, and a software program executed by the processor (hardware).
  • the arithmetic processing unit 130 may be a combination of a plurality of processors. Alternatively, the arithmetic processing unit 130 may be composed of only hardware.
  • For each of the plurality of sections 602_1 to 602_N, the reconstruction processing unit 134 reconstructs a restored image G i of the section 602_i by correlating the plurality of detection intensities b i,1 to b i,M with the intensity distributions I i,1 to I i,M of the reference light S1 i,1 to S1 i,M . The detection intensities b i,1 to b i,M are based on the detection signals D i,1 to D i,M .
  • the relationship between the detection intensity b i,j and the detection signal D i,j may be determined in consideration of the type and method of the photodetector 120.
  • Here, the detection signal D i,j is assumed to represent the amount of light received at a certain instant (or over a minute interval), that is, an instantaneous value.
  • the detection signal D i,j may be sampled a plurality of times during the irradiation period, and the detection intensity b i,j may be an integral value, an average value, or a maximum value of all sampling values of the detection signal D i,j .
  • some of all the sampling values may be selected, and the integrated value, average value, or maximum value of the selected sampling values may be used.
  • For example, the sampling values ranked x-th to y-th counting from the maximum may be extracted, sampling values below an arbitrary threshold may be excluded, or sampling values in a range where the signal fluctuation is small may be extracted.
  • the output D i,j of the photodetector 120 can be directly used as the detection intensity b i,j .
  • the conversion from the detection signal D i,j to the detection intensity b i,j may be performed by the arithmetic processing device 130 or may be performed outside the arithmetic processing device 130.
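  • As a concrete sketch, the conversion from the sampled detection signal D i,j to a detection intensity b i,j might look as follows; the function name, mode names, and top-k option are illustrative assumptions covering the alternatives listed above:

```python
def detection_intensity(samples, mode="average", top_k=None):
    """Reduce the sampling values of a detection signal D_ij taken during
    one irradiation period to a single detection intensity b_ij.
    Reductions follow the alternatives in the text: integral (sum),
    average, or maximum, optionally over only the top-k largest samples."""
    values = sorted(samples, reverse=True)[:top_k] if top_k else list(samples)
    if mode == "integral":
        return sum(values)
    if mode == "average":
        return sum(values) / len(values)
    if mode == "max":
        return max(values)
    raise ValueError(f"unknown mode: {mode}")

d = [0.2, 0.5, 0.1, 0.8, 0.4]
b_sum = detection_intensity(d, mode="integral")          # 2.0
b_avg = detection_intensity(d, mode="average")           # 0.4
b_top = detection_intensity(d, mode="average", top_k=2)  # (0.8 + 0.5) / 2
```

The top-k variant corresponds to selecting only some of the sampling values before integrating or averaging.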
  • The correlation function of Expression (2) is used to restore the image G i of the i-th (1 ≤ i ≤ N) section 602_i.
  • Here, I i,j is the j-th (1 ≤ j ≤ M) intensity distribution, and b i,j is the j-th detection intensity value.
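  • The per-section correlation can be sketched as follows. The patent's exact Expression (2) is not reproduced in this excerpt, so the sketch uses the standard ghost-imaging correlation G(x, y) = ⟨(b − ⟨b⟩)·I(x, y)⟩, of which such expressions are commonly a form; the simulated single-pixel detector and the bright-pixel "object" are assumptions for illustration:

```python
import numpy as np

def reconstruct(intensities, b):
    """Restore an image by the ghost-imaging correlation
    G(x, y) = <(b_j - <b>) * I_j(x, y)> averaged over the M patterns."""
    I = np.asarray(intensities, dtype=float)  # shape (M, y, x)
    b = np.asarray(b, dtype=float)            # shape (M,)
    return np.tensordot(b - b.mean(), I, axes=1) / len(b)

# Simulate one section: a single bright pixel plays the role of the object.
rng = np.random.default_rng(1)
y, x, M = 8, 8, 500
truth = np.zeros((y, x))
truth[3, 5] = 1.0
I = rng.integers(0, 2, size=(M, y, x)).astype(float)  # random patterns I_ij
b = (I * truth).sum(axis=(1, 2))  # single-pixel detector output per pattern
G = reconstruct(I, b)
peak = np.unravel_index(G.argmax(), G.shape)
```

With enough patterns, the correlation peaks at the bright pixel, illustrating how a single-pixel detector plus known patterns recovers a spatial image.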
  • The number of calculations in this imaging apparatus 100 is as follows. Let X × Y be the number of pixels in the entire measurement range, and let x and y be the numbers of pixels in the horizontal and vertical directions of one section. Each correlation is then computed over only x × y pixels per section, for each of the N sections, instead of over the full X × Y pixels; since the number of patterns required to restore an image grows with its pixel count, dividing the range reduces the total number of calculations.
  • When the intensity distributions of the reference light are stored in the memory of the pattern generator 132, only the intensity distribution of one section (x × y) needs to be held, instead of that of the entire irradiation area (X × Y), so the memory capacity can be reduced.
  • A reduction in the number of calculations means that the calculation time can be shortened with a processor of the same speed.
  • a slower (and thus cheaper) processor can be employed to finish the process in the same amount of time.
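  • The reduction can be illustrated numerically. Under the common assumption that the number of patterns needed to restore an image scales linearly with its pixel count (the proportionality constant k below is arbitrary), the correlation work for the divided case is 1/N of the undivided case; the specific sizes are illustrative:

```python
# Correlation work ~ (number of patterns) x (pixels correlated per pattern).
X, Y = 128, 64           # pixels of the entire measurement range
x, y = 32, 32            # pixels of one section
N = (X * Y) // (x * y)   # number of sections tiling the range (here 8)
k = 1                    # assumed patterns-per-pixel proportionality constant

undivided = (k * X * Y) * (X * Y)    # k*X*Y patterns, each over X*Y pixels
divided = N * (k * x * y) * (x * y)  # per section: k*x*y patterns over x*y pixels
ratio = undivided // divided         # reduction factor
```

Under this assumption the divided scheme needs 1/N of the multiply-accumulate operations, which is what allows a higher frame rate or a slower processor.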
  • FIG. 15 is a diagram for explaining the trade-off between calculation time and measurement time.
  • The number N of sections may be set so that the decrease Δ1 in calculation time due to the division is larger than the increase Δ2 in measurement time. As a result, the frame rate required for in-vehicle use can be realized.
  • the number of irradiations M is the same for each section, but the number of irradiations M may be different for each section.
  • The larger the irradiation number M i , the more accurately the image can be restored; depending on the position of the section, however, such accuracy may not be required. By optimizing the irradiation number for each section, therefore, the number of calculations (calculation time) and the measurement time can be adjusted per section.
  • FIGS. 16(a) and 16(b) are diagrams showing modified examples of the sections. As shown in FIG. 16(a), the range may be divided into horizontally long sections; alternatively, it may be divided into vertically long sections.
  • In the above description, the sizes (numbers of pixels) of the plurality of sections are the same, but this need not be the case. As shown in FIG. 16(b), the number of pixels may differ for each section.
  • the patterning device 114 that covers the entire measurement range 600 is used, but it is not limited thereto.
  • Alternatively, an illumination device 110 with the irradiation capability for a single section may be provided, and its emitted light scanned in the horizontal or vertical direction using a movable mirror.
  • the illumination device 110 is configured by the combination of the light source 112 and the patterning device 114 in the second embodiment, the present invention is not limited to this.
  • The illumination device 110 may instead be configured as a matrix array of semiconductor light sources (LEDs (light emitting diodes) or LDs (laser diodes)), with the ON/OFF state (or brightness) of each semiconductor light source individually controllable.
  • the imaging device 100 can be used for the object identification system 10 of FIG.
  • By using the imaging device 100 described in the second embodiment as the sensor of the object identification system 10, the following advantages can be obtained.
  • The use of the imaging device 100, that is, the quantum radar camera, significantly improves noise resistance. For example, when it is raining or snowing, or when traveling in fog, it is difficult for the naked eye to recognize the object OBJ; by using the imaging device 100, however, a restored image G of the object OBJ that is unaffected by rain, snow, or fog can be obtained.
  • the calculation range can be reduced by dividing the measurement range into multiple sections and restoring the image for each section. This makes it possible to increase the frame rate or select an inexpensive processor as the arithmetic processing unit.
  • the number N of sections may be adaptively changed according to the traveling environment.
  • the imaging device 100 described in the second embodiment can be mounted on the vehicle shown in FIG. 11 and may be incorporated in the vehicle lighting device shown in FIG.
  • FIG. 17 is a diagram showing the imaging apparatus 100 according to the third embodiment.
  • the imaging device 100 is a correlation function image sensor that uses the principle of ghost imaging, and includes an illumination device 110, a photodetector 120, and an arithmetic processing device 130.
  • the imaging device 100 is also called a quantum radar camera.
  • The illumination device 110 is a pseudo-thermal light source; it generates reference light S1 having an intensity distribution I(x, y) that can be regarded as substantially random, and irradiates the object OBJ with it. The object OBJ is irradiated with the reference light S1 while its intensity distribution is changed according to M patterns.
  • the lighting device 110 includes a light source 112 and a patterning device 114.
  • the light source 112 generates the light S0 having a uniform intensity distribution.
  • the light source 112 may use a laser, a light emitting diode, or the like.
  • the wavelength and spectrum of the reference light S1 are not particularly limited, and may be white light having a plurality of or continuous spectra, or monochromatic light having a predetermined wavelength.
  • the wavelength of the reference light S1 may be infrared or ultraviolet.
  • the patterning device 114 has a plurality of pixels arranged in a matrix, and the intensity distribution I of light can be spatially modulated based on the combination of ON and OFF of the plurality of pixels.
  • A pixel in the ON state is referred to as an ON pixel, and a pixel in the OFF state as an OFF pixel. In the following description, for ease of understanding, each pixel takes only the two values ON and OFF (1, 0), but the present invention is not limited to this; intermediate gradations may also be used.
  • As the patterning device 114, a reflective DMD (Digital Micromirror Device) or a transmissive liquid crystal device can be used.
  • a pattern signal PTN (image data) generated by the pattern generator 116 is applied to the patterning device 114.
  • the patterning device 114 is assumed to be a DMD.
  • the photodetector 120 measures the reflected light from the object OBJ and outputs a detection signal D r .
  • The detection signal D r is the spatial integral of the light energy (or intensity) incident on the photodetector 120 when the object OBJ is irradiated with the reference light having the intensity distribution I r . A single-pixel photodetector can therefore be used as the photodetector 120.
  • The photodetector 120 outputs a plurality of detection signals D 1 to D M corresponding respectively to the M intensity distributions I 1 to I M .
  • the arithmetic processing unit 130 includes a reconstruction processing unit 134.
  • the reconstruction processing unit 134 reconstructs the restored image G(x, y) of the object OBJ by correlating the plurality of intensity distributions I 1 to I M and the plurality of detected intensities b 1 to b M.
  • the detection intensities b 1 to b M are based on the detection signals D 1 to D M.
  • the relationship between the detection intensity and the detection signal may be determined in consideration of the type and method of the photodetector 120.
  • Here, the detection signal D r is assumed to represent the amount of light received at a certain instant (or over a minute interval), that is, an instantaneous value.
  • the detection signal D r may be sampled multiple times during the irradiation period, and the detection intensity b r may be an integrated value, an average value, or a maximum value of all sampling values of the detection signal D r .
  • some of all the sampling values may be selected, and the integrated value, average value, or maximum value of the selected sampling values may be used.
  • For example, the sampling values ranked x-th to y-th counting from the maximum may be extracted, sampling values below an arbitrary threshold may be excluded, or sampling values in a range where the signal fluctuation is small may be extracted.
  • the output D r of the photodetector 120 can be directly used as the detection intensity b r .
  • the conversion from the detection signal D r to the detection intensity b r may be performed by the arithmetic processing device 130 or may be performed outside the arithmetic processing device 130.
  • the arithmetic processing unit 130 can be implemented by a combination of a processor (hardware) such as a CPU (Central Processing Unit), an MPU (Micro Processing Unit), and a microcomputer, and a software program executed by the processor (hardware).
  • the arithmetic processing unit 130 may be a combination of a plurality of processors. Alternatively, the arithmetic processing unit 130 may be composed of only hardware.
  • the pattern generator 116 may be mounted inside the arithmetic processing unit 130.
  • FIGS. 18A to 18C are views for explaining the pixels of the DMD that is the patterning device 114.
  • the DMD is an array of a plurality of pixels PIX arranged in a matrix of m rows and n columns.
  • each pixel PIX is a square mirror, and can be tilted in the ON direction and the OFF direction about a hinge provided diagonally as an axis.
  • the patterning device 114 is configured so that all pixels can be independently turned on and off.
  • In the following drawings, the matrix of pixels is shown in a simplified form.
  • the pattern generator 116 controls the intensity distribution of the reference light (that is, the pattern signal PTN) in units of the pixel block B including at least one pixel, and the pixel block B is variable.
  • the pixel block B can be understood as a set of continuous (adjacent) ON pixels (or a set of OFF pixels, or a set of ON pixels and OFF pixels). In Example 3.1, the size of the pixel block B is variable.
  • FIGS. 19A to 19D are diagrams showing pixel blocks B having different sizes. The size can be grasped as the number of pixels (that is, the area) included in the pixel block B.
  • FIGS. 19A to 19D show pixel blocks B 1×1 to B 4×4 of 1 × 1, 2 × 2, 3 × 3, and 4 × 4 pixels, respectively. Pixels included in the same pixel block B are all in the same state (ON or OFF).
  • FIGS. 20A and 20B are diagrams showing examples of pattern signals (image data) PTN based on pixel blocks B having different sizes.
  • the pixel block B 2 ⁇ 2 of 2 ⁇ 2 pixels in FIG. 19B is applied, and ON/OFF is controlled for each pixel block B 2 ⁇ 2 .
  • the pattern signal PTN changes in M ways per sensing.
  • the hatched pixel PIX is an ON pixel, and the pixel PIX that is not hatched is an OFF pixel.
  • the pixel block B 4 ⁇ 4 of 4 ⁇ 4 pixels in FIG. 19D is applied, and ON/OFF is controlled for each pixel block B 4 ⁇ 4 .
  • The number M of pattern signals PTN may differ according to the size of the pixel block. Generally, increasing the size of the pixel block reduces the number of pixel blocks, which makes it possible to reduce the number M of patterns per sensing.
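  • Pattern control in units of pixel blocks, as in Example 3.1, can be sketched by generating one random ON/OFF value per block and expanding it to pixel resolution. This is a hypothetical sketch; the function name and block sizes are illustrative:

```python
import numpy as np

def block_pattern(rows, cols, block, rng):
    """Generate a pattern signal controlled per pixel block: one random
    ON/OFF value per (block x block) pixel block, expanded so that every
    pixel inside a block shares the same state."""
    base = rng.integers(0, 2, size=(rows // block, cols // block))
    return np.kron(base, np.ones((block, block), dtype=np.uint8)).astype(np.uint8)

rng = np.random.default_rng(2)
p = block_pattern(8, 8, 4, rng)  # a 2x2 grid of 4x4-pixel blocks
```

Changing the `block` argument between frames corresponds to dynamically changing the effective resolution of the patterning device.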
  • FIG. 21 is a diagram showing a modification of the pattern control.
  • the arrangement of 4 ⁇ 4 pixel pixel blocks B 4 ⁇ 4 is not perfectly aligned in the horizontal direction, but is offset by 2 pixels in the horizontal direction in some rows.
  • The above is the pattern control according to Example 3.1. This pattern control can be understood as dynamically changing the effective resolution of the patterning device 114.
  • the calculation amount in the reconstruction processing unit 134 increases according to the resolution. However, in a situation where spatial resolution is not so required, the calculation amount can be reduced by increasing the size of the pixel block B.
  • Example 3.2: In Example 3.1, the pixel blocks B within one pattern all had the same size, but this need not be the case.
  • the layout of pixel blocks having different sizes is defined in advance and selected according to the running scene.
  • FIGS. 22A and 22B are diagrams illustrating layouts of pixel blocks B of different sizes according to the running scene.
  • In the layout of FIG. 22A, pixel blocks B S of smaller size are arranged in the lower region, and pixel blocks B L of larger size in the upper region. The upper side corresponds to the sky, a space in which no vehicle or pedestrian exists, so the size of the pixel block B is increased there to reduce the resolution. The lower side corresponds to the road surface, where important objects (or road surface markings) are likely to be present, so the size of the pixel block B is reduced to improve the resolution.
  • In the layout of FIG. 22B, pixel blocks B S of smaller size are arranged closer to the center, and pixel blocks B L of larger size closer to the outer periphery.
  • This reflects the fact that the vanishing point is located near the center of the screen: a distant oncoming vehicle appears from the vanishing point, is small at first, and grows larger as it approaches.
  • In Example 3.2, a plurality of layouts are defined in advance, and one suited to the running scene is adaptively selected from among them. In Example 3.3, the layout is changed dynamically.
  • FIG. 23 is a diagram illustrating a dynamic layout of pixel blocks B having different sizes.
  • In frame 1, a pattern of pixel blocks B 2×2 of uniform size (2 × 2 pixels) is used, and the position of the object OBJ is estimated from the image restored in frame 1. In subsequent frames, pixel blocks B of smaller size may be arranged in the area where the object OBJ exists, and the size of the pixel block B may be increased with distance from that area.
  • Modifications related to Examples 3.1 to 3.3. Modification 1:
  • the pixel block B has a square shape in the above description, the shape is not limited thereto.
  • FIGS. 24A to 24C are diagrams showing a pixel block B according to Modification 1. The pixel block B is a horizontally long rectangle, and its size changes dynamically.
  • Modification 2: In Examples 3.1 to 3.3, the numbers of pixels in the vertical and horizontal directions are changed on the same scale, but only the number of pixels in the vertical direction, or only that in the horizontal direction, may be changed.
  • FIGS. 25A to 25D are diagrams showing a pixel block B according to Modification 2. In this example, the pixel block B has a variable number of pixels in the horizontal direction.
  • Example 3.4: In Examples 3.1 to 3.3, the case where the size of the pixel block B is changed has been described. In Example 3.4, the shape of the pixel block B is adaptively changed.
  • FIGS. 26(a) to 26(d) are diagrams showing pixel blocks B having different shapes. FIG. 26(a) shows a pixel block B X long in the horizontal direction (X direction), FIG. 26(b) a pixel block B Y long in the vertical direction (Y direction), and FIG. 26(c) a diagonally long pixel block B XY . FIG. 26(d) shows a basic square pixel block B S . The pixel blocks B in FIGS. 26(a) to 26(d) all have the same size.
  • FIGS. 27A to 27C are diagrams showing examples of the pattern signal PTN based on pixel blocks of different shapes; at least one of the length and the position of each pixel block B is randomly determined. FIG. 27A shows an example of the pattern signal PTN X using the horizontally long pixel block B X , FIG. 27B an example of the pattern signal PTN Y using the vertically long pixel block B Y , and FIG. 27C an example of the pattern signal PTN XY using the diagonal pixel block B XY .
  • When the horizontally long pixel block B X is used, compared with the case where the square pixel block B S is used, the effective resolution in the horizontal direction decreases, so the sharpness of the image in the horizontal direction or the detection accuracy of the horizontal position decreases; on the other hand, the capture time (in other words, the exposure time) increases and the detection intensity D increases, so the S/N ratio rises and detection becomes easier (that is, the sensitivity increases).
  • In the vertical direction, conversely, the capture time is shorter than when the square pixel block B S is used, so the detection sensitivity decreases, while the vertical resolution improves and the accuracy of vertical position detection increases.
  • In short, the detection sensitivity can be increased for an object moving in the direction in which the pixel block B extends, while a sharper image or higher position-detection accuracy can be obtained in the direction perpendicular to it.
  • FIGS. 28A and 28B are views for explaining sensing based on the pattern signal PTN having the pixel block B having a characteristic shape.
  • In FIG. 28A, the pattern signal PTN S composed of square pixel blocks B S of 5 × 5 pixels is used. In this case, the position of the tip of the object OBJ is determined to be X′, and there is an error with respect to the actual tip position X of the object OBJ.
  • In the above description, the shape of the pixel block B is switched in order to positively capture a certain object OBJ, but this is not the only option; the shape of the pixel block B may instead be chosen so as to erase a specific object OBJ. For example, rain and snow are noise for an in-vehicle sensing device, and there is no need to restore their image. Since the moving directions of rain and snow are roughly constant, optimizing the shape of the pixel block B makes it easier to eliminate their influence.
  • For rain and snow, a pixel block B that is short in the vertical direction (the falling direction), in other words long in the horizontal direction, is suitable.
  • Consider the case where the light of a specific pixel block passes through the rain, reaches the object OBJ behind it, and returns through the rain to the photodetector 120. With such a block shape, the probability that the light of any specific pixel block is significantly affected (blocked) by raindrops is reduced; that is, the influence of rain is made uniform across the pixels, which makes the process of removing it (noise canceling) easier.
  • Example 3.5 In Examples 3.1 to 3.4 described above, all the pixels included in the same pixel block B are on (or off). On the other hand, in Example 3.5, the same pixel block B includes both ON pixels and OFF pixels. That is, the pixel block B includes two or more ON pixels and OFF pixels which are arranged in a predetermined manner. Such a pixel block is called a pattern block.
  • the illumination device 110 defines the intensity distribution by the combination of pattern blocks.
  • FIGS. 29A to 29D are diagrams for explaining the pattern block PB according to Example 3.5, showing distributions (patterns) of ON pixels and OFF pixels within the pattern block PB.
  • In FIG. 29A, the OFF pixels are arranged along the four sides of the pattern block PB, and in FIG. 29B, along two adjacent sides of the pattern block PB. In FIG. 29C, the ON pixels are arranged so as to cross diagonally, and in FIG. 29D, so as to cross vertically and horizontally.
  • FIGS. 30A and 30B are diagrams showing examples of pattern signals based on combinations of pattern blocks. FIG. 30A is an example of the pattern signal formed from the pattern block of FIG. 29A, and FIG. 30B an example formed from the pattern block of FIG. 29D; the two have the same arrangement of ON pattern blocks.
  • By using such pattern blocks, the spatial incoherence of the reference light can be improved, as described below.
  • FIGS. 31A and 31B are diagrams for explaining the improvement of the spatial incoherence of the reference light.
  • A light flux emitted from a light source travels with a certain divergence angle. When the object is near, the spread of the light flux of each pixel does not matter; however, since the vehicle-mounted imaging apparatus 100 needs to detect objects in the far field, the spread of the light flux becomes a problem. As shown in FIG. 31A, two light fluxes emitted from two adjacent ON regions (or pixels) A and B of the illumination device 110 overlap at the position of an object OBJ far from the illumination device 110. Such overlapping of light fluxes reduces the spatial incoherence.
  • With the pattern blocks described above, the number of consecutive ON pixels can be limited to two, which means that an OFF pixel is inserted between any two adjacent ON regions. As a result, as shown in FIG. 31B, the two adjacent ON regions A and B are spatially separated, so the overlapping of light fluxes is reduced even in the reference light that reaches the object OBJ, and the spatial incoherence is improved. When the pattern block B of FIG. 29C is used, the number of consecutive ON pixels can likewise be limited to one or two in the horizontal and vertical directions.
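  • A constraint of the kind shown in FIG. 32A, where no two ON pixels may be adjacent, can be sketched as follows. Pixels are visited in random order and turned on only if no neighbor is already on; the function name and 50% candidate rate are illustrative assumptions, not the patent's generator:

```python
import numpy as np

def sparse_pattern(rows, cols, rng):
    """Random binary pattern in which no two ON pixels are adjacent
    vertically, horizontally, or diagonally, improving the spatial
    separation of the emitted light fluxes."""
    p = np.zeros((rows, cols), dtype=np.uint8)
    for idx in rng.permutation(rows * cols):
        r, c = divmod(int(idx), cols)
        if rng.random() < 0.5:  # candidate ON pixel
            # 3x3 neighborhood (the pixel itself is still 0 here)
            if p[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2].sum() == 0:
                p[r, c] = 1
    return p

rng = np.random.default_rng(3)
p = sparse_pattern(16, 16, rng)
```

Relaxing the neighborhood check to only the vertical and horizontal neighbors would give the FIG. 32B variant, where diagonal adjacency is allowed.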
  • For example, when the object OBJ is distant, the pattern blocks B shown in FIGS. 29A to 29C may be used, and when the object OBJ is close, a normal pattern block (or pixel block) may be used.
  • the intensity distribution that can improve the spatial incoherence may be generated based on a predetermined constraint condition without using the pattern block PB.
  • FIGS. 32(a) and 32(b) are diagrams showing examples of intensity distributions that can improve spatial incoherence.
  • on/off of a plurality of pixels is randomly determined under the constraint that adjacent pixels are not turned on.
  • In FIG. 32A, adjacency of ON pixels in the vertical, horizontal, and diagonal directions is prohibited. In FIG. 32B, adjacency of ON pixels in the diagonal direction is allowed, while adjacency in the vertical and horizontal directions is prohibited.
  • FIGS. 33A to 33D are diagrams for explaining the pattern control with the lighting rate as a constraint condition.
  • In FIGS. 33A to 33D, the lighting rates (the ratio of the number of ON pixels to the total number of pixels) differ: they are 20%, 40%, 60%, and 80%, respectively.
  • Such intensity distributions can be generated using a PRBS (pseudo-random binary sequence).
  • Increasing the lighting rate increases the amount of light, making it possible to sense objects farther away, objects with lower reflectance, or objects with a smaller reflecting area. By increasing the lighting rate, the amount of reflected light can be increased even in a dense-fog environment with a high light attenuation rate, so the detection sensitivity can be raised.
  • the lighting rate may be reduced when detecting an object with high reflectance or a large object.
  • Furthermore, by dynamically controlling the lighting rate according to the traveling environment, the visibility for the driver can be improved, attention and warnings can be given to other traffic participants, and glare to preceding vehicles, oncoming vehicles, and pedestrians can be reduced.
  • In addition, the reference light S1 can be made to blink in a pseudo manner, giving a caution or warning to the driver of the own vehicle and to other traffic participants.
  • In the above description, the lighting rate is defined over all pixels, but the pattern may be divided into a plurality of areas and the lighting rate defined for each area.
  • FIGS. 34(a) and 34(b) are diagrams for explaining the control of the lighting rate according to a modification. For example, if a lighting rate of 50% is realized by generating a PRBS with a 50% mark ratio across all pixels, the ON pixels may happen to be concentrated in the upper half and the OFF pixels in the lower half, resulting in uneven brightness. Therefore, as shown in FIG. 34(a), unevenness in brightness can be reduced by generating a PRBS with a 50% mark ratio for each of the upper half region and the lower half region to define the intensity distribution.
  • the lighting rate may be independently designated for each area.
  • For example, the lighting rate can be lowered in an area where an oncoming vehicle or a preceding vehicle exists.
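The per-region lighting-rate control described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: a numpy random generator stands in for a hardware PRBS, and the function name and horizontal-band layout are assumptions for the example.

```python
import numpy as np

def make_pattern(height, width, region_rates, rng):
    """Generate one binary ON/OFF illumination pattern.

    region_rates gives one lighting rate per horizontal band; generating
    each band independently keeps ON pixels from concentrating in one
    part of the field, as described for FIG. 34(a).
    """
    bands = np.array_split(np.arange(height), len(region_rates))
    pattern = np.zeros((height, width), dtype=np.uint8)
    for rows, rate in zip(bands, region_rates):
        # threshold uniform noise so that ~rate of the band's pixels are ON
        pattern[rows, :] = (rng.random((len(rows), width)) < rate).astype(np.uint8)
    return pattern

rng = np.random.default_rng(0)
# e.g. 50% lighting rate in the upper half, 20% in the lower half
# (the lower rate could correspond to an area with an oncoming vehicle)
p = make_pattern(64, 64, [0.5, 0.2], rng)
```

The same mechanism extends to finer grids of areas, each with an independently designated rate.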
  • In the above example, the illuminating device 110 is configured by the combination of the light source 112 and the patterning device 114, but it is not limited thereto.
  • Alternatively, the illuminating device 110 may be configured as a matrix array of a plurality of semiconductor light sources (LEDs (light emitting diodes) or LDs (laser diodes)), with the ON/OFF state (or brightness) of each semiconductor light source individually controllable.
  • This imaging device 100 can be used for the object identification system 10 of FIG.
  • By using the imaging device 100 described in the second embodiment as the sensor of the object identification system 10, the following advantages can be obtained.
  • When the imaging device 100, that is, the quantum radar camera, is used, noise resistance is significantly increased. For example, when it is raining, snowing, or foggy, it is difficult for the naked eye to recognize the object OBJ, but by using the imaging device 100, the restored image G of the object OBJ can be obtained without being affected by rain, snow, or fog.
  • The detection targets of the vehicle-mounted imaging device 100 are various, including cars, people, motorcycles, bicycles, structures, animals, and plants.
  • the situation in which the imaging device is used also changes greatly depending on the traveling environment such as weather, time of day, traveling road, traveling speed, and the like.
  • the imaging device itself moves, the objects also move, and their relative movement directions are various.
  • the imaging device 100 described in the third embodiment can be mounted on the vehicle shown in FIG. 11 and may be built in the vehicle lighting device shown in FIG.
  • Embodiment 4 (Outline of Embodiment 4)
  • Embodiment 4 described below relates to an imaging apparatus using the principle of ghost imaging.
  • The imaging device includes an illumination that irradiates an object while changing the intensity distribution of the reference light in a plurality of M ways, a photodetector that measures the reflected light from the object for each of the plurality of intensity distributions I_1 to I_M, and an arithmetic processing unit that reconstructs a restored image of the object by correlating the plurality of intensity distributions I_1 to I_M with a plurality of detection intensities b_1 to b_M based on the output of the photodetector.
  • the plurality of intensity distributions I 1 to I M can be determined by the following processing.
  • (i) Model the transfer characteristic of the path from the illumination to the photodetector via the object.
  • (ii) Define a reference object and a corresponding reference image.
  • (iii) Give initial values to the plurality of intensity distributions I_1 to I_M.
  • (iv) Calculate estimated values b̂_1 to b̂_M of the detection intensities obtained when the reference object is irradiated with the reference light having the intensity distributions I_1 to I_M.
  • (v) Reconstruct the restored image of the reference object by correlating the plurality of intensity distributions I_1 to I_M with the plurality of estimated values b̂_1 to b̂_M.
  • (vi) Modify each of the plurality of intensity distributions I_1 to I_M so that the error between the restored image and the reference image becomes smaller.
  • the number of irradiations can be reduced by defining the reference image according to the assumed subject and optimizing the pattern.
  • In the modeling, N sets (N ≥ 2) of reference objects and reference images may be defined.
  • This can improve versatility.
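The optimization outlined above (estimate the detector outputs for a reference image, reconstruct via correlation, and descend on the error between restored and reference images) can be sketched on a toy problem. This is an illustrative sketch, not the patent's implementation: it assumes the no-attenuation forward model, and a numerical gradient stands in for the autodiff or stochastic-gradient machinery a real implementation would use.

```python
import numpy as np

def loss(I, T):
    # Estimated detector outputs for reference image T (no-attenuation model)
    b = np.tensordot(I, T, axes=([1, 2], [0, 1]))
    # Reconstruct via the ghost-imaging correlation <(b - <b>) * I>
    G = np.tensordot(b - b.mean(), I, axes=1) / len(b)
    # Squared error between restored image and reference image
    return np.sum((T - G) ** 2)

rng = np.random.default_rng(0)
H = W = 4
M = 8
T = np.zeros((H, W)); T[1:3, 1:3] = 1.0   # toy reference image (reflectance map)
I = rng.random((M, H, W))                  # initial intensity distributions

lr, eps = 0.01, 1e-4
history = [loss(I, T)]
for _ in range(30):
    g = np.zeros_like(I)
    # Numerical gradient, for clarity only; far too slow for real pattern sizes
    for idx in np.ndindex(I.shape):
        I[idx] += eps
        g[idx] = (loss(I, T) - history[-1]) / eps
        I[idx] -= eps
    # Gradient step with non-negativity projection (pixel values cannot be negative)
    I = np.maximum(I - lr * g, 0.0)
    history.append(loss(I, T))
```

After the loop, `I` holds the modified set of intensity distributions, and `history` records the decreasing reconstruction error against the reference image.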
  • The error may be represented by the objective function F(I) of Expression (4), where W is the width of the image, H is the height of the image, T_i(x, y) is the i-th reference image, and G_i(x, y, I) is the i-th restored image.
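Expression (4) itself is not reproduced in this extract; based on the definitions of W, H, T_i, and G_i given above, a plausible reconstruction is the sum of squared pixel errors over the reference images:

```latex
F(I) = \sum_{i} \sum_{x=0}^{W-1} \sum_{y=0}^{H-1}
       \left\{ T_i(x, y) - G_i(x, y, I) \right\}^2
```

Minimizing F(I) over the set I of intensity distributions drives every restored image toward its reference image simultaneously.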
  • a plurality of sets of a plurality of intensity distributions I 1 to I M may be prepared and one set according to the traveling environment may be selectively used. As a result, the image quality can be improved as compared with the case where the same set of intensity distributions is always used in various traveling environments.
  • FIG. 35 is a diagram showing the imaging apparatus 100 according to the fourth embodiment.
  • the imaging device 100 is a correlation function image sensor that uses the principle of ghost imaging, and includes an illumination 110, a photodetector 120, and a calculation processing device 130.
  • the imaging device 100 is also called a quantum radar camera.
  • the illumination 110 is a pseudo heat light source and generates the reference light S1 having the intensity distribution I(x, y) that can be regarded as substantially random and irradiates the object OBJ. Irradiation of the reference light S1 onto the object OBJ is performed while changing its intensity distribution according to a plurality of M patterns. Illumination 110 may include, for example, a light source 112 that produces light S0 having a uniform intensity distribution, and a patterning device 114 that is capable of spatially modulating the intensity distribution I of this light S0.
  • the light source 112 may use a laser, a light emitting diode, or the like.
  • the wavelength and spectrum of the reference light S1 are not particularly limited, and may be white light having a plurality of or continuous spectra, or monochromatic light having a predetermined wavelength.
  • As the patterning device 114, a DMD (Digital Micromirror Device) or a liquid crystal device can be used.
  • The pattern signal PTN (image data) designating the intensity distribution I is given to the patterning device 114 from the arithmetic processing unit 130; the arithmetic processing unit 130 therefore knows the intensity distribution I_r of the reference light S1 currently irradiating the object OBJ.
  • the photodetector 120 measures the reflected light from the object OBJ and outputs a detection signal D r .
  • The detection signal D_r is the spatial integral of the light energy (or intensity) incident on the photodetector 120 when the object OBJ is irradiated with the reference light having the intensity distribution I_r. A single-pixel device (photodetector) can therefore be used as the photodetector 120.
  • the photodetector 120 outputs a plurality of detection signals D 1 to D M corresponding to a plurality of M intensity distributions I 1 to I M, respectively.
  • the arithmetic processing unit 130 includes a pattern generator 132 and a reconstruction processing unit 134.
  • Typically in ghost imaging, the intensity distribution of the reference light S1 generated by the illumination 110 is random, but in the present embodiment a set of a plurality of predetermined intensity distributions I_1 to I_M is used. A set of pattern signals PTN_1 to PTN_M defining the plurality of intensity distributions I_1 to I_M is therefore held in advance in a memory (pattern memory) inside the pattern generator 132.
  • the reconstruction processing unit 134 reconstructs the restored image G(x, y) of the object OBJ by correlating the plurality of intensity distributions I 1 to I M and the plurality of detected intensities b 1 to b M.
  • the detection intensities b 1 to b M are based on the detection signals D 1 to D M.
  • the relationship between the detection intensity b and the detection signal D may be determined in consideration of the type and method of the photodetector 120.
  • the detection signal D r is assumed to represent the amount of received light at a certain time (or a minute time), that is, the instantaneous value.
  • the detection signal D r may be sampled multiple times during the irradiation period, and the detection intensity b r may be an integrated value, an average value, or a maximum value of all sampling values of the detection signal D r .
  • some of all the sampling values may be selected, and the integrated value, average value, or maximum value of the selected sampling values may be used.
  • For example, the x-th to y-th largest sampling values may be extracted, sampling values below an arbitrary threshold may be excluded, or sampling values may be extracted from a range in which the signal fluctuation is small.
  • the output D r of the photodetector 120 can be directly used as the detection intensity b r .
  • the conversion from the detection signal D r to the detection intensity b r may be performed by the arithmetic processing device 130 or may be performed outside the arithmetic processing device 130.
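The sampling-value aggregation described above can be sketched as follows. This is a minimal illustration under stated assumptions: the function name and its options are hypothetical, and in practice the samples of D_r would come from an A/D converter during the irradiation period.

```python
import numpy as np

def detection_intensity(samples, mode="integral", threshold=None):
    """Reduce the samples of detection signal D_r taken during one
    irradiation period to a single detection intensity b_r."""
    s = np.asarray(samples, dtype=float)
    if threshold is not None:
        s = s[s >= threshold]              # exclude low sampling values
    if mode == "integral":
        return s.sum()
    if mode == "average":
        return s.mean()
    if mode == "maximum":
        return s.max()
    raise ValueError(f"unknown mode: {mode}")

# average of the sampling values at or above 0.15
b_r = detection_intensity([0.2, 0.5, 0.4, 0.1], mode="average", threshold=0.15)
```

Whichever reduction is chosen, the same rule should be applied consistently across all M irradiations so that the detection intensities b_1 to b_M are comparable.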
  • the above is the basic configuration of the entire imaging apparatus 100.
  • In the present embodiment, the plurality of intensity distributions I_1 to I_M are determined in advance using a computer.
  • FIG. 36 is a flowchart showing a method of determining a set of a plurality of intensity distributions I 1 to I M.
  • the transfer characteristic of the path from the illumination 110 to the photodetector 120 via the object OBJ is modeled (S100).
  • The transfer characteristic includes the propagation characteristic of light from the illumination 110 to the object OBJ, the reflection characteristic of the object OBJ, the propagation characteristic of light from the object OBJ to the photodetector 120, and the conversion characteristic of the photodetector 120.
  • FIG. 37 is a diagram illustrating the relationship between the reference object and the reference image T(x, y).
  • The pixel value of the reference image T(x, y) is normalized to the range 0 to 1.
  • The pixel value of each pixel p represents the reflectance of the corresponding portion of the reference object: a pixel value of 1 corresponds to a reflectance of 1 (that is, 100%), a pixel value of 0 to a reflectance of 0 (that is, 0%), and a pixel value of 0.5 to a reflectance of 0.5 (that is, 50%).
  • Next, estimated values b̂_1 to b̂_M of the detection intensities b_1 to b_M obtained when the reference object OBJ is irradiated with the reference light S1 having the plurality of intensity distributions I_1 to I_M are calculated (S106).
  • Here, it is assumed that the light is not attenuated on the optical path from the illumination 110 to the object OBJ and that the reference light S1 illuminates the entire rectangle containing the reference object OBJ (the rectangle shown by the broken line on the right side of FIG. 37). It is further assumed that the light is not attenuated on the optical path from the object OBJ to the photodetector 120 and that all the reflected light from the object OBJ is incident on the photodetector 120.
  • Under these assumptions, the estimated value b̂_r of the detection intensity when the reference light with intensity distribution I_r(x, y) is applied to the reference object is represented by Expression (6), where W represents the width of the image and H the height of the image.
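Expression (6) is not reproduced in this extract; under the stated no-attenuation assumptions, the estimated detector output is plausibly the pixel-wise product of the illumination pattern and the reflectance map, summed over the image:

```latex
\hat{b}_r = \sum_{x=0}^{W-1} \sum_{y=0}^{H-1} I_r(x, y) \, T(x, y)
```

Each pixel contributes the portion of the incident pattern that the reference object reflects back to the single-pixel detector.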
  • a combination (or state) of the current intensity distributions I 1 (x, y) to I M (x, y) is denoted by I.
  • Next, the restored image G(x, y, I) is reconstructed from the set I of intensity distributions based on the correlation function of Expression (7) (S108). Expression (7) is obtained by replacing the detection intensity b_r in Expression (5) with the estimated value b̂_r.
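Expressions (5) and (7) are not reproduced in this extract; the standard ghost-imaging correlation, with the measured intensities b_r replaced by the estimates b̂_r as the text describes, would plausibly read:

```latex
G(x, y, I) = \frac{1}{M} \sum_{r=1}^{M}
             \left( \hat{b}_r - \langle \hat{b} \rangle \right) I_r(x, y),
\qquad
\langle \hat{b} \rangle = \frac{1}{M} \sum_{r=1}^{M} \hat{b}_r
```

Each pattern I_r is weighted by how much its (estimated) detector reading deviates from the mean, so pixels belonging to the object accumulate coherently while the rest averages out.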
  • The reference image T(x, y) corresponds to the correct answer for the restored image G(x, y, I). Therefore, the error Δ between the restored image G(x, y, I) and the reference image T(x, y) is calculated (S110), and each of the plurality of intensity distributions I_1 to I_M is corrected so as to reduce the error Δ (S114).
  • the number of irradiations can be reduced by defining the reference image according to the assumed subject and optimizing the pattern.
  • the error ⁇ in this case may be represented by the objective function F(I) of Expression (8).
  • Here, T_i(x, y) represents the reference image of the i-th set.
  • the algorithm for minimizing the error ⁇ is not particularly limited, and a known algorithm can be used. For example, stochastic gradient descent can be used for the minimization. This problem can be formulated by the following equation (9).
  • I* is the optimal set of intensity distributions I_1 to I_M. Since the pixel values of the intensity distributions I_1 to I_M cannot take negative values, a non-negativity constraint can be imposed.
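Expression (9) is not shown in this extract; as described, it is plausibly the constrained minimization of the objective function:

```latex
I^{*} = \operatorname*{arg\,min}_{I} F(I)
\quad \text{subject to} \quad I_r(x, y) \ge 0 \;\; \forall\, r, x, y
```

In a stochastic-gradient implementation, the non-negativity constraint can be enforced by projecting each updated pixel value back onto zero whenever a gradient step makes it negative.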
  • The number M of intensity distributions I_1 to I_M is set to 100, 500, and 1000, and the optimal intensity distribution sets I*_100, I*_500, and I*_1000 are obtained.
  • FIG. 39 is a diagram showing a restored image when an optimized intensity distribution set is used.
  • the leftmost image is a correct answer image, and photographs of the alphabet K, frog, train, and truck are used in order from the top.
  • Below each restored image, the PSNR value indicating the error from the correct image is shown; the larger the PSNR, the smaller the error. The images of the frog, train, and truck are taken from CIFAR-10 (Alex Krizhevsky, "Learning Multiple Layers of Features from Tiny Images", 2009).
  • FIG. 40 is a diagram showing a restored image when a set of random intensity distributions is used.
  • the PSNR is about 9.578 even when irradiation is performed 10,000 times.
  • a plurality of intensity distribution sets may be prepared and switched according to the traveling environment.
  • In the above modeling, it is assumed that there is no light attenuation or the like between the imaging device 100 and the object OBJ, which corresponds to good visibility in fine weather.
  • The set of intensity distributions obtained under this assumption is also effective when driving in rainfall, snowfall, or thick fog, but switching the set of intensity distributions according to the driving environment can further reduce the error of the restored image G.
  • When rainfall, snowfall, or fog is assumed, the transfer characteristic may be modeled taking its influence into account. Specifically, the calculation formula for the estimated value b̂_r of the detection intensity b_r is modified from Expression (6), and the set of intensity distributions is optimized based on the modified estimate b̂_r.
  • Alternatively, a reference object may be photographed in each driving environment (that is, in rainfall, snowfall, and thick fog), and the above machine learning may be performed using the obtained image as the reference image. In this case, the modeling of the transfer characteristic (light propagation characteristic) can be simplified.
  • As driving environments, in addition to differences in rain, snow, and fog, sets of intensity distributions suited to daytime versus nighttime driving and to low-speed versus high-speed driving may also be prepared.
  • In the above example, the illumination 110 is composed of the combination of the light source 112 and the patterning device 114, but it is not limited thereto.
  • Alternatively, the illumination 110 may be composed of a matrix array of a plurality of semiconductor light sources (LEDs (light emitting diodes) or LDs (laser diodes)), with the ON/OFF state (or brightness) of each semiconductor light source individually controllable.
  • This imaging device 100 can be used for the object identification system 10 of FIG.
  • By using the imaging device 100 described in the fourth embodiment as the sensor of the object identification system 10, the following advantages can be obtained. Using the imaging device 100, that is, the quantum radar camera, significantly improves noise resistance. For example, when it is raining, snowing, or foggy, it is difficult for the naked eye to recognize the object OBJ, but by using the imaging device 100, the restored image G of the object OBJ can be obtained without being affected by rain, snow, or fog.
  • the imaging device 100 described in the fourth embodiment can be mounted on the automobile of FIG. 11 and may be incorporated in the vehicle lamp of FIG.
  • the method using correlation calculation has been described as the ghost imaging (or single pixel imaging) method, but the image reconstruction method is not limited to this.
  • For example, the image may be reconstructed by an analytical method using the Fourier transform or the inverse Hadamard transform instead of the correlation calculation, by solving an optimization problem such as sparse modeling, or by an algorithm using AI/machine learning.
  • the present invention relates to a vehicle lamp.
  • OBJ... Object, 10... Object identification system 20... Imaging device, 40... Arithmetic processing device, 42... Classifier, 100... Imaging device, 110... Illumination, 120... Photodetector, 130... Arithmetic processing device, 132... Pattern Generator, 134... Reconstruction processing unit, 200... Vehicle lamp, 202... Light source, 204... Lighting circuit, 206... Optical system, 300... Automotive, 302... Headlight, 310... Lamp system, 304... Vehicle side ECU , 400... Vehicle lamp, 402... Housing, 404... Cover, 410... Headlight, 412... Light source, 414... Reflector, 416... Patterning device, 420... Pseudo thermal light source, 422... Light source, 424... Patterning device, 430... Common member.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Lighting Device Outwards From Vehicle And Optical Signal (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Photometry And Measurement Of Optical Pulse Characteristics (AREA)

Abstract

The invention relates to a vehicle lighting apparatus (400) comprising a headlight (410) and a pseudo-thermal light source (420). The pseudo-thermal light source (420) projects reference light (S1) onto an object while changing the intensity distribution of the light in M different ways. Together with a photodetector (120) and an arithmetic processing device (130), the pseudo-thermal light source (420) constitutes an imaging device (100). At least some constituent elements (430) of the headlight (410) are shared with the pseudo-thermal light source (420).
PCT/JP2019/050170 2018-12-27 2019-12-20 Appareil d'éclairage pour véhicule et véhicule Ceased WO2020137908A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2020563214A JP7408572B2 (ja) 2018-12-27 2019-12-20 車両用灯具および車両
CN201980085931.7A CN113227838B (zh) 2018-12-27 2019-12-20 车辆用灯具及车辆

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
JP2018244951 2018-12-27
JP2018245517 2018-12-27
JP2018-245517 2018-12-27
JP2018-244951 2018-12-27
JP2019-002820 2019-01-10
JP2019002820 2019-01-10
JP2019002818 2019-01-10
JP2019002819 2019-01-10
JP2019-002819 2019-01-10
JP2019-002818 2019-01-10

Publications (1)

Publication Number Publication Date
WO2020137908A1 true WO2020137908A1 (fr) 2020-07-02

Family

ID=71127977

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/050170 Ceased WO2020137908A1 (fr) 2018-12-27 2019-12-20 Appareil d'éclairage pour véhicule et véhicule

Country Status (3)

Country Link
JP (1) JP7408572B2 (fr)
CN (1) CN113227838B (fr)
WO (1) WO2020137908A1 (fr)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020218282A1 (fr) * 2019-04-22 2020-10-29 株式会社小糸製作所 Dispositif d'imagerie, phare de véhicule, automobile et procédé d'imagerie
WO2021193646A1 (fr) * 2020-03-26 2021-09-30 株式会社小糸製作所 Dispositif d'imagerie, éclairage de véhicule et véhicule
JPWO2022044961A1 (fr) * 2020-08-28 2022-03-03
WO2022270476A1 (fr) * 2021-06-22 2022-12-29 株式会社小糸製作所 Dispositif d'imagerie, phare de véhicule et véhicule
WO2023074759A1 (fr) * 2021-10-27 2023-05-04 株式会社小糸製作所 Appareil d'imagerie, accessoire de lampe de véhicule et véhicule
CN116457700A (zh) * 2020-11-16 2023-07-18 株式会社小糸制作所 感测装置、车辆用灯具、车辆
WO2023171732A1 (fr) * 2022-03-11 2023-09-14 スタンレー電気株式会社 Feu de véhicule
JP7490161B1 (ja) * 2023-11-09 2024-05-24 三菱電機株式会社 画像取得装置、及び、画像取得方法
JP2025502837A (ja) * 2021-12-30 2025-01-28 深▲ジェン▼引望智能技術有限公司 制御方法、ライダ及び端末デバイス

Citations (6)

Publication number Priority date Publication date Assignee Title
JPH0112673B2 (fr) * 1983-06-01 1989-03-01 Sumitomo Beekuraito Kk
US20140029850A1 (en) * 2011-09-28 2014-01-30 U.S. Army Research Laboratory ATTN:RDRL-LOC-I System and method for image improved image enhancement
WO2017073737A1 (fr) * 2015-10-28 2017-05-04 国立大学法人東京大学 Dispositif d'analyse
WO2018124285A1 (fr) * 2016-12-29 2018-07-05 国立大学法人東京大学 Dispositif d'imagerie et procédé d'imagerie
JP2018155658A (ja) * 2017-03-17 2018-10-04 株式会社東芝 物体検出装置、物体検出方法、および物体検出プログラム
JP2018156862A (ja) * 2017-03-17 2018-10-04 トヨタ自動車株式会社 車両用前照灯装置

Family Cites Families (14)

Publication number Priority date Publication date Assignee Title
US7415126B2 (en) * 1992-05-05 2008-08-19 Automotive Technologies International Inc. Occupant sensing system
US6830189B2 (en) * 1995-12-18 2004-12-14 Metrologic Instruments, Inc. Method of and system for producing digital images of objects with subtantially reduced speckle-noise patterns by illuminating said objects with spatially and/or temporally coherent-reduced planar laser illumination
DE29825026U1 (de) * 1997-04-02 2004-07-01 Gentex Corp., Zeeland Steuerungssystem, um Fahrzeugscheinwerfer automatisch zu dimmen
JP2012146621A (ja) * 2010-12-20 2012-08-02 Stanley Electric Co Ltd 車両用灯具
EP3015882B1 (fr) * 2013-06-27 2020-12-09 Panasonic Semiconductor Solutions Co., Ltd. Dispositif de mesure de distance
CN103472456B (zh) * 2013-09-13 2015-05-06 中国科学院空间科学与应用研究中心 一种基于稀疏孔径压缩计算关联的主动成像系统及方法
US10473916B2 (en) * 2014-09-30 2019-11-12 Washington University Multiple-view compressed-sensing ultrafast photography (MV-CUP)
JP6770892B2 (ja) * 2014-12-25 2020-10-21 株式会社小糸製作所 点灯回路および車両用灯具
DE102015120204A1 (de) * 2015-11-23 2017-05-24 Hella Kgaa Hueck & Co. Verfahren zum Betreiben von mindestens einem Scheinwerfer eines Fahrzeuges
CN106019307A (zh) * 2016-05-18 2016-10-12 北京航空航天大学 一种基于阵列光源的单像素成像系统及方法
JP2018036102A (ja) * 2016-08-30 2018-03-08 ソニーセミコンダクタソリューションズ株式会社 測距装置、および、測距装置の制御方法
US10671873B2 (en) * 2017-03-10 2020-06-02 Tusimple, Inc. System and method for vehicle wheel detection
JP6412673B1 (ja) * 2017-07-21 2018-10-24 学校法人玉川学園 画像処理装置及び方法、並びに、プログラム
CN108710215A (zh) * 2018-06-20 2018-10-26 深圳阜时科技有限公司 一种光源模组、3d成像装置、身份识别装置及电子设备

Patent Citations (6)

Publication number Priority date Publication date Assignee Title
JPH0112673B2 (fr) * 1983-06-01 1989-03-01 Sumitomo Beekuraito Kk
US20140029850A1 (en) * 2011-09-28 2014-01-30 U.S. Army Research Laboratory ATTN:RDRL-LOC-I System and method for image improved image enhancement
WO2017073737A1 (fr) * 2015-10-28 2017-05-04 国立大学法人東京大学 Dispositif d'analyse
WO2018124285A1 (fr) * 2016-12-29 2018-07-05 国立大学法人東京大学 Dispositif d'imagerie et procédé d'imagerie
JP2018155658A (ja) * 2017-03-17 2018-10-04 株式会社東芝 物体検出装置、物体検出方法、および物体検出プログラム
JP2018156862A (ja) * 2017-03-17 2018-10-04 トヨタ自動車株式会社 車両用前照灯装置

Cited By (15)

Publication number Priority date Publication date Assignee Title
US12003839B2 (en) 2019-04-22 2024-06-04 Koito Manufacturing Co., Ltd. Imaging apparatus using ghost imaging
WO2020218282A1 (fr) * 2019-04-22 2020-10-29 株式会社小糸製作所 Dispositif d'imagerie, phare de véhicule, automobile et procédé d'imagerie
WO2021193646A1 (fr) * 2020-03-26 2021-09-30 株式会社小糸製作所 Dispositif d'imagerie, éclairage de véhicule et véhicule
JPWO2021193646A1 (fr) * 2020-03-26 2021-09-30
JP7625582B2 (ja) 2020-03-26 2025-02-03 株式会社小糸製作所 イメージング装置および車両用灯具、車両
US12108164B2 (en) 2020-03-26 2024-10-01 Koito Manufacturing Co., Ltd. Imaging apparatus
JPWO2022044961A1 (fr) * 2020-08-28 2022-03-03
CN115989430A (zh) * 2020-08-28 2023-04-18 株式会社小糸制作所 成像装置、成像方法以及车辆用灯具、车辆
CN116457700A (zh) * 2020-11-16 2023-07-18 株式会社小糸制作所 感测装置、车辆用灯具、车辆
WO2022270476A1 (fr) * 2021-06-22 2022-12-29 株式会社小糸製作所 Dispositif d'imagerie, phare de véhicule et véhicule
WO2023074759A1 (fr) * 2021-10-27 2023-05-04 株式会社小糸製作所 Appareil d'imagerie, accessoire de lampe de véhicule et véhicule
JP2025502837A (ja) * 2021-12-30 2025-01-28 深▲ジェン▼引望智能技術有限公司 制御方法、ライダ及び端末デバイス
WO2023171732A1 (fr) * 2022-03-11 2023-09-14 スタンレー電気株式会社 Feu de véhicule
JP7490161B1 (ja) * 2023-11-09 2024-05-24 三菱電機株式会社 画像取得装置、及び、画像取得方法
WO2025099889A1 (fr) * 2023-11-09 2025-05-15 三菱電機株式会社 Dispositif et procédé d'acquisition d'image

Also Published As

Publication number Publication date
CN113227838A (zh) 2021-08-06
JP7408572B2 (ja) 2024-01-05
CN113227838B (zh) 2024-07-12
JPWO2020137908A1 (ja) 2021-11-11

Similar Documents

Publication Publication Date Title
JP7408572B2 (ja) 車両用灯具および車両
KR102472631B1 (ko) Lidar 시스템 및 방법
US10558866B2 (en) System and method for light and image projection
JP2023002783A (ja) Lidarシステム及び方法
US12047667B2 (en) Imaging device
CN113383280A (zh) 用于经由雾的图像增强的弹道光调制
US12108164B2 (en) Imaging apparatus
WO2021079810A1 (fr) Dispositif d'imagerie, phare de véhicule, véhicule et procédé d'imagerie
JP7656584B2 (ja) センサ、自動車および周囲環境のセンシング方法
JP7161337B2 (ja) 車両用灯具
WO2023085329A1 (fr) Système d'imagerie, unité de détection, accessoire pour lampe de véhicule, et véhicule
CN119816431A (zh) 车辆用前照灯

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19904219

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020563214

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19904219

Country of ref document: EP

Kind code of ref document: A1