WO2025173536A1 - Distance measurement system, image sensor, and method for controlling distance measurement system - Google Patents
Info
- Publication number
- WO2025173536A1 (application PCT/JP2025/002740; JP2025002740W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- digital
- unit
- image sensor
- camera module
- digital signal
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
- G01S17/894—3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/483—Details of pulse systems
- G01S7/486—Receivers
- G01S7/4861—Circuits for detection, sampling, integration or read-out
- G01S7/4863—Detector arrays, e.g. charge-transfer gates
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/703—SSIS architectures incorporating pixels for producing signals other than image signals
- H04N25/705—Pixels for depth measurement, e.g. RGBZ
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/76—Addressed sensors, e.g. MOS or CMOS sensors
- H04N25/78—Readout circuits for addressed sensors, e.g. output amplifiers or A/D converters
Definitions
- the analog gain margin narrows when considering EMI (Electro Magnetic Interference), heat issues, middleware adjustments, and the like.
- the distance measurement accuracy of the rear seat camera may deteriorate compared to the front seat camera.
- the distance measurement accuracy may also deteriorate on the front seat side, and the following reasons may be considered: (1) due to the influence of external light during the day, signals become saturated in some areas of the ranging image; (2) due to the influence of street lights at night, signals become saturated in some areas of the ranging image; (3) light is absorbed by the skin, clothing, and interior equipment of the driver and passengers and does not return to the sensor; (4) the combined effects of (1) to (3).
- This technology was developed in light of these circumstances, and aims to improve the distance measurement accuracy in systems that use the iToF method for distance measurement.
- the pixel array section may be divided into a predetermined number of pixel blocks, and the unit may further perform a synthesis process before the distance measurement calculation process, in which the amount of received light is calculated for each pixel block, and if the amount of received light is not within a predetermined range, the digital signal corresponding to the first digital gain is replaced with the digital signal corresponding to the second digital gain. This has the effect of suppressing signal saturation and insufficient signal level.
- the camera module may further include a calculation unit that calculates distance using the iToF method based on the digital signal. This reduces the amount of processing required by the unit.
- the camera module may further include a first serializer that converts the first parallel data into first serial data and transmits it to the unit. This has the effect of reducing the number of signal lines between the camera module and the unit.
- the camera module may include a first image sensor, the pixel array unit, the analog-to-digital conversion unit, and the digital gain processing unit may be disposed within the first image sensor, and the first serializer may convert the first parallel data from the first image sensor into the first serial data. This has the effect of reducing the number of signal lines between the camera module and the unit.
- the camera module may include a first image sensor, and the pixel array unit, the analog-to-digital conversion unit, the digital signal processing unit, and the first serializer may be disposed within the first image sensor. This has the effect of reducing the number of signal lines between the first image sensor and the unit.
- the camera module may include a first image sensor in which the pixel array unit, the analog-to-digital conversion unit, and the digital signal processing unit are arranged, and a second image sensor in which visible light pixels that receive visible light are arranged. This has the effect of improving the performance of the ranging system.
- the second image sensor may further include an array of IR (Infra-Red) pixels that receive IR light. This improves performance at night, etc.
- the camera module may further include a serializer that converts parallel data from the first and second image sensors into serial data and transmits it to the unit. This has the effect of reducing the number of signal lines between the camera module and the unit.
- the camera module may further include a first serializer that converts the first parallel data into first serial data and transmits it to the unit, and a second serializer that converts the second parallel data into second serial data and transmits it to the unit. This has the effect of reducing the number of signal lines between the camera module and the unit.
- the first serializer may convert the first parallel data from the first image sensor into the first serial data
- the second serializer may convert the second parallel data from the second image sensor into the second serial data. This has the effect of reducing the number of signal lines between the camera module and the unit.
- the first serializer may be disposed within the first image sensor, and the second serializer may be disposed within the second image sensor. This has the effect of reducing the number of signal lines between the first and second image sensors and the unit.
- the plurality of pixels may further include visible light pixels that receive visible light. This has the effect of improving the performance of the ranging system.
- the unit may set the parameters of the area every time a certain period of time elapses. This has the effect of improving the performance of the ranging system.
- a second aspect of the present technology is an image sensor comprising a pixel array section in which a plurality of pixels, including iToF pixels that generate analog signals for calculating distance using the iToF method, are arranged in a two-dimensional grid pattern; an analog-to-digital conversion section that converts the analog signals into digital signals; and a digital gain processing section that performs processing to increase or decrease the digital signals using a first digital gain and output the digital signals, and processing to increase or decrease the digital signals within a predetermined area of the pixel array section using a second digital gain and output the digital signals.
- This has the effect of improving distance measurement accuracy.
- the area may be divided into a predetermined number of pixel blocks, and the digital gain processing unit may further perform a synthesis process in which the amount of received light is calculated for each pixel block, and if the amount of received light is not within the range, the digital signal corresponding to the first digital gain is replaced with the digital signal corresponding to the second digital gain. This reduces the amount of processing required by a unit external to the image sensor.
- the image sensor may further include an image processing unit that processes the digital signal from the digital gain processing unit, the area being divided into a predetermined number of pixel blocks, and the image processing unit calculating the amount of received light for each pixel block, and if the amount of received light is within a predetermined range, outputting the digital signal increased or decreased by the first digital gain, and if the amount of received light is not within the range, outputting the digital signal increased or decreased by the second digital gain.
- This has the effect of reducing the amount of processing by a unit external to the image sensor.
- FIG. 1 is a block diagram showing an example configuration of a distance measuring system according to a first embodiment of the present technology.
- A block diagram showing an example configuration of a camera module and an ECU (Electronic Control Unit) in the first embodiment of the present technology.
- A block diagram showing an example configuration of a camera module and an ECU with a subsequent IC (Integrated Circuit) added, in the first embodiment of the present technology.
- FIG. 10 is a block diagram showing a configuration example of a camera module and an ECU when a subsequent IC is added only to the iToF side and a serializer is arranged for each image sensor according to a second embodiment of the present technology.
- FIG. 10 is a block diagram illustrating a configuration example of a camera module and an ECU when a subsequent IC is added only to the RGB side and a serializer is arranged for each image sensor according to a second embodiment of the present technology.
- FIG. 11 is a block diagram showing an example configuration of a distance measuring system according to a third embodiment of the present technology.
- FIG. 11 is a block diagram showing a configuration example of an image sensor according to a third embodiment of the present technology.
- FIG. 11 is a block diagram showing a configuration example of a camera module and an ECU according to a third embodiment of the present technology.
- FIG. 11 is a block diagram illustrating a configuration example of a camera module and an ECU to which a subsequent IC is added according to a third embodiment of the present technology.
- FIG. 11 is a block diagram illustrating a configuration example of a camera module and an ECU when a serializer is disposed in an image sensor according to a third embodiment of the present technology.
- FIG. 13 is a block diagram showing another example of an image sensor according to the third embodiment of the present technology.
- FIG. 40 is a diagram summarizing features of each configuration of the first to third embodiments of the present technology.
- FIG. 41 is a flowchart illustrating an example of an operation of an ECU according to a fourth embodiment of the present technology.
- FIG. 42 is a diagram illustrating an example of setting an area according to a fourth embodiment of the present technology.
- FIG. 19 is a diagram illustrating an example of area resetting according to the fourth embodiment of the present technology.
- A block diagram showing an example of a schematic configuration of a vehicle control system.
- FIG. 2 is an explanatory diagram showing an example of the installation positions of an outside-vehicle information detection unit and an imaging unit.
- First embodiment [Example of distance measurement system configuration] FIG. 1 is a block diagram showing an example configuration of a ranging system according to the first embodiment of the present technology.
- This ranging system is a system for performing ranging by the iToF method, and includes a camera module 100 and an ECU 300.
- This ranging system is applied to, for example, a vehicle control system described below.
- The LD 111 emits illumination light onto the subject. Pulsed near-infrared light (in other words, pulsed light) is used as the illumination light.
- The LDD 112 drives the LD 111 in synchronization with a rectangular-wave light emission control signal CLKp.
- the frequency of this light emission control signal CLKp is, for example, 20 megahertz (MHz). Note that the frequency of the light emission control signal CLKp is not limited to 20 megahertz (MHz), and may be 5 megahertz (MHz) or 150 megahertz (MHz) or higher.
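The modulation frequency determines how far the system can measure without phase wrap-around. The standard iToF unambiguous-range relation d_max = c / (2f) is not stated in this document, but it follows from the round-trip phase measurement; a minimal sketch:

```python
C = 299_792_458.0  # speed of light in m/s

def unambiguous_range_m(freq_hz: float) -> float:
    """Maximum distance an iToF phase measurement can report without
    wrapping: light must complete the round trip within one period."""
    return C / (2.0 * freq_hz)

for f_mhz in (5, 20, 150):
    print(f"{f_mhz} MHz -> {unambiguous_range_m(f_mhz * 1e6):.2f} m")
```

At the 20 MHz example frequency the unambiguous range is about 7.5 m; lowering the frequency extends the range, while raising it shortens the range but generally improves precision.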
- the lens group 121 focuses the reflected light from the irradiated light and guides it to the image sensor 201.
- the image sensor 201 receives reflected light (in other words, pulsed light) and generates RAW data. This image sensor 201 supplies parallel data including the RAW data to the serializer 151. Note that the image sensor 201 is an example of the first image sensor described in the claims.
- the ECU 300 also includes a deserializer 311, a control circuit 312, an I2C interface 313, and a SoC (System-on-a-Chip) 314.
- serializer 151 is placed outside the image sensor 201, but this configuration is not limited to this.
- the timing control unit 212 controls the operation timing of the vertical drive circuit 211, DAC 213, column ADC 215, horizontal transfer control unit 216, digital gain processing unit 250, and image processing unit 217.
- DAC 213 generates a sawtooth ramp signal through DA (Digital to Analog) conversion and supplies it to column ADC 215.
- the slope of this ramp signal is set in accordance with control information from ECU 300.
- the analog gain of column ADC 215 is controlled according to this slope.
- the column ADC 215 has an ADC arranged for each column of the pixel array unit 214.
- the ADC converts the analog signal from the corresponding column into a digital signal and supplies it to the digital gain processing unit 250 under the control of the horizontal transfer control unit 216.
- the column ADC 215 is an example of the analog-to-digital conversion unit described in the claims.
- the image processing unit 217 performs various image processing operations on the digital signal from the digital gain processing unit 250. This image processing unit 217 supplies data in which the processed signals are arranged as RAW data to the serializer 151.
- FIG. 6 is a circuit diagram showing an example configuration of an iToF pixel 220 according to the first embodiment of the present technology.
- the iToF pixel 220 includes a photodiode 221, a charge discharging transistor 222, transfer transistors 223 and 224, reset transistors 225 and 226, and floating diffusion layers 227 and 228.
- the iToF pixel 220 further includes conversion transistors 229 and 230, additional capacitances 231 and 232, amplification transistors 233 and 234, and selection transistors 235 and 236.
- the conversion transistors 229 and 230 and the additional capacitances 231 and 232 are not essential and may be provided as needed.
- the photodiode 221 photoelectrically converts incident light to generate an electric charge.
- the charge drain transistor 222 drains and initializes electric charge from the photodiode 221, floating diffusion layers 227 and 228, and additional capacitances 231 and 232 in accordance with the drive signal OFG from the vertical drive circuit 211.
- the transfer transistor 223 transfers charge from the photodiode 221 to the floating diffusion layer 227 in accordance with the drive signal TRGa from the vertical drive circuit 211.
- Floating diffusion layers 227 and 228 accumulate electric charge and generate a voltage according to the amount of charge.
- the amplifier transistor 233 forms a source follower circuit and supplies a voltage corresponding to the voltage of the floating diffusion layer 227 to the selection transistor 235.
- the amplifier transistor 234 forms a source follower circuit and supplies a voltage corresponding to the voltage of the floating diffusion layer 228 to the selection transistor 236.
- each of the transfer transistors, floating diffusion layers, amplification transistors, and selection transistors is provided in pairs.
- One of these paired circuits is called the A tap, and the other is called the B tap.
- the ADC for each column is alternately connected to VSLa and VSLb, and alternately performs AD conversion on the analog signals of the A tap and B tap.
- the distance d to the object is calculated from Q1, Q2, Q3, and Q4.
- Q1 and Q2 detected during the Q1Q2 detection period of 1/60 seconds, and Q3 and Q4 detected during the Q3Q4 detection period of 1/60 seconds are required for distance measurement. Therefore, distance measurement is performed at intervals of, for example, 1/30 seconds.
- the iToF pixel 220 transfers charge corresponding to the amount of charge from timing T50 (0 degrees) to timing T52 (180 degrees) within the Q1Q2 detection period to the floating diffusion layer 227 on the A tap side. If reflected light begins to be emitted at timing T51, a charge amount q1 corresponding to the amount of light received from timing T51 to T52 is transferred.
- the iToF pixel 220 transfers charge corresponding to the amount of light received from timing T52 (180 degrees) to timing T54 (360 degrees) within the Q1Q2 detection period to the floating diffusion layer 228 on the B tap side.
- a charge amount q2 corresponding to the amount of light received from timing T52 to T53 is transferred.
- the iToF pixel 220 then transfers to the floating diffusion layer 227 an amount of charge corresponding to the amount of light received from timing T55 (90 degrees) to timing T57 (270 degrees) within the Q3Q4 detection period. If reflected light begins to be emitted at timing T56, an amount of charge q3 corresponding to the amount of light received from timing T56 to T57 is transferred.
- the iToF pixel 220 transfers to the floating diffusion layer 228 an amount of charge corresponding to the amount of light received from timing T57 (270 degrees) to timing T59 (90 degrees) within the Q3Q4 detection period. If the reflected light stops emitting at timing T58, an amount of charge q4 corresponding to the amount of light received from timing T57 to T58 is transferred.
- the iToF pixel 220 then detects the cumulative values of q1 and q2 within the Q1Q2 detection period as Q1 and Q2, and sequentially outputs an analog signal indicating Q1 and an analog signal indicating Q2.
- the iToF pixel 220 also detects the cumulative values of q3 and q4 within the Q3Q4 detection period as Q3 and Q4, and sequentially outputs an analog signal indicating Q3 and an analog signal indicating Q4.
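The four detection windows described above can be modeled as overlaps between the reflected-light interval and the 0-180, 180-360, 90-270, and 270-90 degree windows. The following sketch is an illustrative model of that behavior (the function names and the unit period T are assumptions, not from the document):

```python
def overlap(a0, a1, b0, b1):
    """Length of the intersection of intervals [a0, a1] and [b0, b1]."""
    return max(0.0, min(a1, b1) - max(a0, b0))

def tap_charges(td, T=1.0):
    """Charge collected in each detection window for a reflected
    50%-duty pulse arriving with delay td (illustrative model,
    valid for 0 <= td < T/2)."""
    r0, r1 = td, td + T / 2                   # reflected-light interval
    q1 = overlap(r0, r1, 0.0, T / 2)          # A tap, 0-180 degrees
    q2 = overlap(r0, r1, T / 2, T)            # B tap, 180-360 degrees
    q3 = overlap(r0, r1, T / 4, 3 * T / 4)    # A tap, 90-270 degrees
    q4 = (overlap(r0, r1, 3 * T / 4, T)
          + overlap(r0, r1, 0.0, T / 4))      # B tap, 270-90 (wraps)
    return q1, q2, q3, q4

print(tap_charges(0.1))  # a longer delay shifts charge from q1 toward q2
```

As the delay grows, charge moves from the A-tap window into the B-tap window, which is what makes the ratios of Q1 to Q4 encode the time of flight.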
- the downstream digital gain processing unit 250 increases or decreases Q1 to Q4 after AD conversion using digital gain.
- the RAW data from the image sensor 201 includes digital signals Q1, Q2, Q3, and Q4 for each pixel.
- the downstream ECU 300 calculates the distance d for each pixel using the increased or decreased Q1, Q2, Q3, and Q4, where d is in meters (m), c is the speed of light in meters per second (m/s), and tan⁻¹ is the inverse tangent function.
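The formula itself is not reproduced in this extraction. The standard four-phase iToF relation consistent with the symbols defined above is d = (c / (4πf)) · tan⁻¹((Q3 − Q4) / (Q1 − Q2)), with f the light emission frequency; a sketch (the atan2 form is an implementation choice that avoids division by zero):

```python
import math

C = 299_792_458.0  # speed of light c, in m/s

def itof_distance(q1, q2, q3, q4, freq_hz):
    """Four-phase iToF distance: the phase of the reflected light is
    recovered from the two tap differences, then scaled so that a full
    2*pi phase corresponds to the unambiguous range c / (2 * f)."""
    phase = math.atan2(q3 - q4, q1 - q2) % (2 * math.pi)
    return C * phase / (4 * math.pi * freq_hz)

# A quarter-cycle phase at 20 MHz is a quarter of the ~7.5 m range:
print(round(itof_distance(1.0, 1.0, 2.0, 0.0, 20e6), 3))  # ~1.874
```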
- [Configuration example of digital gain processing unit] FIG. 8 is a block diagram showing an example configuration of the digital gain processing unit 250 according to the first embodiment of the present technology.
- the digital gain processing unit 250 includes data selection units 251, 252, and 253, multiplication units 254, 255, and 256, and a data output adjustment unit 260.
- the data selection unit 251 sequentially supplies all of the digital signals from the column ADC 215 to the multiplication unit 254.
- the data selection unit 252 sequentially selects the digital signals of pixels within a specified area from the digital signals from the column ADC 215 and supplies them to the multiplication unit 255.
- the data selection unit 253 sequentially selects the digital signals of pixels within a specified area from the digital signals from the column ADC 215 and supplies them to the multiplication unit 256.
- the multiplication unit 254 increases or decreases the digital signal by a digital gain g1 and supplies the result to the data output adjustment unit 260.
- the multiplication unit 255 increases or decreases the digital signal by a digital gain g2 and supplies the result to the data output adjustment unit 260.
- the multiplication unit 256 increases or decreases the digital signal by a digital gain g3 and supplies the result to the data output adjustment unit 260.
- the data output adjustment unit 260 adjusts the output timing of the digital signals from the multiplication units 254, 255, and 256.
- This data output adjustment unit 260 includes, for example, data buffers 261 and 262 and a selector 263.
- Data buffer 261 delays the digital signal from multiplication unit 255 and supplies it to selector 263.
- Data buffer 262 delays the digital signal from multiplication unit 256 and supplies it to selector 263.
- Selector 263 sequentially selects the digital signals from multiplication units 254, 255, and 256. This selector 263 outputs the digital signal from multiplication unit 254 to image processing unit 217 in order, pixel by pixel. Next, selector 263, for example, outputs the digital signal from multiplication unit 255 in order, pixel by pixel. Then, selector 263 outputs the digital signal from multiplication unit 256 in order, pixel by pixel.
- d indicates an example of a digital signal output by the data output adjustment unit 260.
- the data output adjustment unit 260 sequentially outputs all-pixel digital signals D0×2.0, D1×2.0, D11×2.0, D12×2.0, etc. from the multiplication unit 254.
- the data output adjustment unit 260 sequentially outputs area-specific digital signals D11×0.5, D12×0.5, etc. from the multiplication unit 255.
- the data output adjustment unit 260 sequentially outputs area-specific digital signals D11, D12, etc. from the multiplication unit 256.
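The three multiplication paths and the output ordering described above can be sketched as follows; the data layout and the gain values g1 = 2.0, g2 = 0.5, and g3 = 1.0 (matching the figure's example) are illustrative assumptions:

```python
def digital_gain_streams(pixels, area, g1=2.0, g2=0.5, g3=1.0):
    """pixels: list of (coord, value) in readout order; area: set of
    coords inside the switched-gain region. Returns the three streams
    the data output adjustment unit emits in turn: every pixel at g1,
    then the in-area pixels at g2, then the in-area pixels at g3."""
    all_g1 = [v * g1 for _, v in pixels]
    in_area = [v for c, v in pixels if c in area]
    return all_g1, [v * g2 for v in in_area], [v * g3 for v in in_area]

px = [((0, 0), 10.0), ((0, 1), 20.0), ((1, 0), 30.0)]
print(digital_gain_streams(px, area={(0, 1)}))
# → ([20.0, 40.0, 60.0], [10.0], [20.0])
```

The downstream stage thus receives every in-area pixel three times, once per gain, which is what allows the later synthesis step to pick the best-exposed version.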
- In part a of the figure, the area surrounded by the dotted line is an area where digital signal saturation and insufficient signal levels are a concern. Subjects within the area, such as the driver's clothing, skin, or interior vehicle equipment, reflect and absorb light, which could result in digital signal saturation or insufficient signal levels.
- In ambient light impact mode, the digital signals of all pixels are attenuated to suppress the effects of ambient light, which could result in insufficient signal levels within the area.
- In night mode, the digital signals of all pixels are amplified, which could result in digital signal saturation within the area.
- the image sensor 201 increases or decreases the digital signals within the area by digital gains g2 and g3 that are different from the digital gain g1 of all pixels.
- the image sensor 201 outputs RAW data including all of the digital signals illustrated in a, b, and c in the figure to the ECU 300.
- FIG. 15 shows an example of image data generated by ECU 300 in the first embodiment of the present technology.
- ECU 300 performs a synthesis process before distance measurement calculations.
- ECU 300 divides the area into a predetermined number of pixel blocks and focuses on each pixel block in turn. For example, the area is divided into pixel blocks of 9 pixels, 3 rows x 3 columns.
- For each pixel in the pixel block of interest, the ECU 300 calculates the sum of the digital signals Q1, Q2, Q3, and Q4 after they are increased or decreased by g1. This sum indicates the amount of light received by the pixel. The ECU 300 then calculates the average or sum of the amounts of light received by the pixels in the pixel block as a confidence value. Note that the confidence value is an example of the amount of received light described in the claims.
- the ECU 300 determines whether the confidence value is within a predetermined range. If the confidence value is within the range, the ECU 300 uses the digital signals Q1 to Q4 corresponding to g1 in the distance measurement calculation at a later stage.
- If the confidence value is not within the range, the ECU 300 replaces the digital signals Q1, Q2, Q3, and Q4 increased or decreased by g1 with the digital signals Q1, Q2, Q3, and Q4 increased or decreased by g2, and calculates the confidence value again.
- ECU 300 determines whether the confidence value is within a predetermined range. If the confidence value is within the range, ECU 300 uses digital signals Q1 to Q4 that correspond to g2 in the subsequent distance measurement calculation.
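The per-block synthesis described above can be sketched as follows. The 50/100 thresholds are the example values mentioned later in the text, and the behavior when neither gain yields an in-range confidence value is an assumption, as the document does not specify it:

```python
def confidence(block):
    """Average over the block of each pixel's Q1 + Q2 + Q3 + Q4 sum."""
    return sum(sum(qs) for qs in block) / len(block)

def synthesize_block(block_g1, block_g2, lo=50.0, hi=100.0):
    """block_g1 / block_g2: lists of (Q1, Q2, Q3, Q4) tuples for one
    3x3 pixel block after the g1 and g2 digital gains. Returns the
    signals to feed into the later distance measurement calculation."""
    if lo <= confidence(block_g1) <= hi:
        return block_g1       # g1 signals are within range: keep them
    if lo <= confidence(block_g2) <= hi:
        return block_g2       # replace with the g2 signals
    return block_g1           # neither in range: assumption, keep g1

g1_block = [(40.0, 40.0, 40.0, 40.0)] * 9   # confidence 160: saturated
g2_block = [(20.0, 20.0, 20.0, 20.0)] * 9   # confidence 80: in range
print(synthesize_block(g1_block, g2_block)[0])  # → (20.0, 20.0, 20.0, 20.0)
```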
- digital signals D1, D2, D11, D12, D31, D32, etc. are increased or decreased by a digital gain of 1.0.
- digital signals D11 and D12 are in-area signals.
- multiplication units 255 and 256 increase or decrease D11, D12, etc. within the area by 0.5 or 2.0 and output them.
- the camera module 100 in this modified example of the first embodiment differs from the first embodiment in that two LDs and two LDDs are provided.
- FIG. 22 is a block diagram showing an example configuration of a ranging system in a first modified example of the first embodiment of the present technology.
- the ranging system in this first modified example of the first embodiment differs from the first embodiment in that an LD 113 and an LDD 114 are further provided within the camera module 100.
- For example, the image sensor 201 causes the LD 111 and the LD 113 to emit light simultaneously. This increases the amount of light received compared to when only the LD 111 is used, improving distance measurement accuracy.
- the addition of LD 113 and LDD 114 makes it possible to increase the amount of light received compared to when only LD 111 is used.
- Second embodiment In the first embodiment described above, only the image sensor 201 having an array of iToF pixels is provided, but an image sensor having an array of RGB pixels can also be added.
- the ranging system in this second embodiment differs from the first embodiment in that an image sensor having an array of RGB pixels is added.
- FIG. 23 is a block diagram showing an example configuration of a ranging system according to a second embodiment of the present technology.
- the ranging system according to the second embodiment further includes a lens group 122, an image sensor 202, and an EEPROM 132 within the camera module 100.
- the camera module 100 also includes an ISP (Image Signal Processor) 140 and a DDR SDRAM (Double-Data-Rate SDRAM) 133, and includes a serializer 150 instead of the serializer 151.
- The ISP 140 performs various image processing operations, such as white balance correction, on image data from the image sensor 202, and supplies parallel data including the processed data to the serializer 150.
- The DDR SDRAM 133 temporarily stores data processed by the ISP 140.
- the serializer 150 converts the parallel data from the image sensors 201 and 202 into serial data and transmits it to the ECU 300 via the serial interfaces 108 and 109.
- the signal processing unit 218 performs various signal processing on the digital signal. For example, in night mode, the signal processing unit 218 generates an IR image in which only the digital signals of IR pixels are arranged, or an image in which the digital signals of RGB pixels are combined with the digital signals of IR pixels. In normal mode and ambient light influence mode, the signal processing unit 218 generates an RGB image in which only the digital signals of RGB pixels are arranged. The signal processing unit 218 then supplies the generated image data to the ISP 140 as RAW data. Note that the ISP 140 can also perform some or all of the processing of the signal processing unit 218.
- the synthesis process can be performed on the camera module 100 side.
- ECU 300 can use RGB images from image sensor 202 in middleware processing and IVI operations. This can further improve the performance of ECU 300.
- ECU 300 performs the distance measurement calculations, but this configuration is not limited to this.
- the serializer 151 can be placed inside the image sensor 201.
- one of a pair of diagonally arranged G pixels in the image sensor 201 can be replaced with an IR pixel.
- the synthesis process can be performed on the camera module 100 side.
- RGB pixels are also arranged on the image sensor 201, allowing the ranging system to improve performance by utilizing RGB images.
- Figure 40 is a diagram summarizing the features of each configuration of the first to third embodiments of the present technology.
- In No. 1 and No. 2, the serializer is external to the image sensor, while in No. 3, the serializer is internal to the image sensor.
- In No. 2, a subsequent IC is added.
- In all of No. 1 to No. 3, the image sensor can generate RAW data.
- In No. 1 and No. 3, distance measurement calculations can be performed on the image sensor side, but in No. 2, this is performed by the subsequent IC, eliminating the need for distance measurement calculations on the image sensor side.
- the positions, sizes, and numbers of areas where the digital gain is switched are fixed, but these can also be made variable.
- the distance measuring system in this fourth embodiment differs from the first embodiment in that the positions, etc. of the areas are variable.
- FIG. 41 is a flowchart showing an example of the operation of ECU 300 in the fourth embodiment of the present technology. The operation of this fourth embodiment differs from the first embodiment in that steps S931 and S932 are further executed.
- After the synthesis process (step S920), the ECU 300 determines whether a certain amount of time has elapsed since the previous area setting (step S931). For example, step S931 is executed every frame or every few frames.
- If a certain amount of time has passed since the previous setting (step S931: Yes), the ECU 300 resets the area parameters (position, size, number, etc.) according to the time of day and the in-vehicle conditions (step S932).
- the same setting values as the previous setting may be used, or different setting values may be used.
- if a certain amount of time has not elapsed since the previous setting (step S931: No), or after step S932, the ECU 300 executes step S912 and the subsequent steps.
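The loop of steps S920, S931, and S932 described above can be sketched as follows. This is a minimal illustration only: the function `reset_area_params`, its condition inputs, the one-second interval, and the concrete area tuples are all hypothetical assumptions, not the concrete implementation of this system.

```python
import time
from dataclasses import dataclass, field

@dataclass
class AreaParams:
    # Hypothetical representation of the area parameters:
    # a list of (x, y, width, height) tuples.
    areas: list = field(default_factory=list)

def reset_area_params(time_of_day: str, in_vehicle: str) -> AreaParams:
    # Hypothetical policy (step S932): choose an area layout from the
    # time of day and in-vehicle conditions.
    if time_of_day == "night":
        return AreaParams(areas=[(0, 0, 64, 64), (64, 0, 64, 64)])
    return AreaParams(areas=[(0, 0, 128, 128)])

def ecu_frame_loop(frames, reset_interval_s=1.0):
    """Per-frame loop: after the synthesis process (step S920), check the
    elapsed time (step S931) and, if due, reset the area parameters
    (step S932)."""
    params = reset_area_params("day", "normal")
    last_reset = time.monotonic()
    for frame in frames:
        synthesized = frame  # stand-in for the synthesis process (step S920)
        now = time.monotonic()
        if now - last_reset >= reset_interval_s:  # step S931
            params = reset_area_params("day", "normal")  # step S932
            last_reset = now
        yield synthesized, params
```

With `reset_interval_s=0.0` the parameters are re-evaluated every frame, corresponding to the "every frame" case mentioned above.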
- FIG. 42 is a diagram showing an example of setting an area of a specified frame in the fourth embodiment of the present technology.
- an area is set in the region surrounded by a dotted line.
- "x0.5" and "x2.0" within the area indicate the digital gain value most frequently used within that area.
- since the ECU 300 sets the area parameters at regular intervals depending on the conditions inside the vehicle, it is possible to respond more flexibly to changes in those conditions than if the parameters were fixed. This allows the performance of the ranging system to be improved.
- the synthesis process can be performed on the camera module 100 side.
- the first variant of the first embodiment, as well as the second and third embodiments, can each be applied to the fourth embodiment.
- an LD (laser diode) light source such as the LD 111 is used as the light source, but an LED (light emitting diode) light source may be used instead of the LD light source.
- the camera module 100 sets the lower and upper thresholds of the confidence value to 50 and 100, respectively, but the values are not limited to these.
- the camera module 100 can change the upper and lower limit values depending on the application (through filter settings).
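A minimal sketch of such a confidence filter is shown below, assuming a flat per-sample depth/confidence representation; the function name, the `None` invalid-sample marker, and the default window values are illustrative assumptions, with 50 and 100 taken from the text above.

```python
def filter_by_confidence(depth, confidence, lower=50, upper=100):
    """Invalidate depth samples whose confidence value lies outside the
    [lower, upper] window. The thresholds are configurable per application,
    as described above; None marks an invalidated sample."""
    return [
        d if lower <= c <= upper else None
        for d, c in zip(depth, confidence)
    ]
```

For example, with confidences 40, 75, and 120 against the default window, only the middle sample survives.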
- the ECU 300 sets the area parameters at regular intervals, thereby improving the performance of the ranging system.
- the technology according to the present disclosure can be applied to various products.
- the technology according to the present disclosure may be realized as a device mounted on any type of moving body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, personal mobility, an airplane, a drone, a ship, or a robot.
- the vehicle control system 12000 includes multiple electronic control units connected via a communication network 12001.
- the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside vehicle information detection unit 12030, an inside vehicle information detection unit 12040, and an integrated control unit 12050.
- the functional configuration of the integrated control unit 12050 also includes a microcomputer 12051, an audio/video output unit 12052, and an in-vehicle network I/F (Interface) 12053.
- the drive system control unit 12010 controls the operation of devices related to the vehicle's drive system in accordance with various programs.
- for example, the drive system control unit 12010 functions as a control device for a driving force generating device, such as an internal combustion engine or drive motor, that generates the vehicle's driving force; a driving force transmission mechanism that transmits the driving force to the wheels; a steering mechanism that adjusts the vehicle's steering angle; and a braking device that generates the vehicle's braking force.
- the outside vehicle information detection unit 12030 detects information outside the vehicle equipped with the vehicle control system 12000.
- the outside vehicle information detection unit 12030 is connected to an imaging unit 12031.
- the outside vehicle information detection unit 12030 causes the imaging unit 12031 to capture images outside the vehicle and receives the captured images.
- the outside vehicle information detection unit 12030 may perform object detection processing or distance detection processing for people, cars, obstacles, signs, characters on the road surface, etc. based on the received images.
- the microcomputer 12051 can calculate control target values for the driving force generating device, steering mechanism, or braking device based on information inside and outside the vehicle acquired by the outside vehicle information detection unit 12030 or the inside vehicle information detection unit 12040, and output control commands to the drive system control unit 12010.
- the microcomputer 12051 can perform cooperative control aimed at realizing the functions of an ADAS (Advanced Driver Assistance System), including vehicle collision avoidance or impact mitigation, following driving based on the distance between vehicles, maintaining vehicle speed, vehicle collision warning, or vehicle lane departure warning.
- the audio/video output unit 12052 transmits an output signal of at least one of audio and video to an output device capable of visually or audibly notifying the vehicle's occupants or the outside of the vehicle of information.
- an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are exemplified as output devices.
- the display unit 12062 may include, for example, at least one of an on-board display and a head-up display.
- Figure 45 shows an example of the installation location of the imaging unit 12031.
- the imaging unit 12031 includes imaging units 12101, 12102, 12103, 12104, and 12105.
- the imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as the front nose, side mirrors, rear bumper, back door, and the top of the windshield inside the vehicle cabin of the vehicle 12100.
- the imaging unit 12101 provided on the front nose and the imaging unit 12105 provided on the top of the windshield inside the vehicle cabin mainly capture images of the front of the vehicle 12100.
- the imaging units 12102 and 12103 provided on the side mirrors mainly capture images of the sides of the vehicle 12100.
- the imaging unit 12104 provided on the rear bumper or back door mainly captures images of the rear of the vehicle 12100.
- the imaging unit 12105 provided on the top of the windshield inside the vehicle cabin is mainly used to detect leading vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, etc.
- Imaging range 12111 indicates the imaging range of imaging unit 12101 provided on the front nose
- imaging ranges 12112 and 12113 indicate the imaging ranges of imaging units 12102 and 12103 provided on the side mirrors, respectively
- imaging range 12114 indicates the imaging range of imaging unit 12104 provided on the rear bumper or back door.
- At least one of the image capturing units 12101 to 12104 may have a function for acquiring distance information.
- at least one of the image capturing units 12101 to 12104 may be a stereo camera consisting of multiple image capturing elements, or an image capturing element having pixels for phase difference detection.
- the microcomputer 12051 can calculate the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the change in this distance over time (the relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the closest three-dimensional object on the path of the vehicle 12100 that is traveling in approximately the same direction as the vehicle 12100 at a predetermined speed (e.g., 0 km/h or higher). Furthermore, the microcomputer 12051 can set in advance the inter-vehicle distance to be maintained from the preceding vehicle ahead, and can perform automatic braking control (including follow-up stop control) and automatic acceleration control (including follow-up start control). In this way, cooperative control can be performed for the purpose of autonomous driving, in which the vehicle travels autonomously without relying on driver operation.
- the microcomputer 12051 can classify three-dimensional object data into categories such as motorcycles, standard vehicles, large vehicles, pedestrians, and other three-dimensional objects such as utility poles, extract the data, and use it for automatic obstacle avoidance. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into those that are visible to the driver of the vehicle 12100 and those that are difficult to see.
- the microcomputer 12051 determines the collision risk, which indicates the degree of risk of collision with each obstacle, and when the collision risk is equal to or greater than a set value and there is a possibility of collision, it can provide driving assistance for collision avoidance by outputting an alarm to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or evasive steering via the drive system control unit 12010.
- At least one of the image capturing units 12101 to 12104 may be an infrared camera that detects infrared rays.
- the microcomputer 12051 can recognize pedestrians by determining whether or not a pedestrian is present in the images captured by the image capturing units 12101 to 12104. Such pedestrian recognition is performed, for example, by extracting feature points in the images captured by the image capturing units 12101 to 12104 as infrared cameras, and performing pattern matching processing on a series of feature points that indicate the outline of an object to determine whether or not the object is a pedestrian.
- the audio/video output unit 12052 controls the display unit 12062 to superimpose a rectangular outline on the recognized pedestrian for emphasis.
- the audio/video output unit 12052 may also control the display unit 12062 to display an icon or the like indicating the pedestrian in a desired position.
- the technology disclosed herein can be applied to the in-vehicle information detection unit 12040 and microcomputer 12051.
- the camera module 100 and ECU 300 in FIG. 1 can be applied to the in-vehicle information detection unit 12040 and microcomputer 12051.
- (1) A distance measuring system comprising: a camera module including a pixel array unit in which a plurality of pixels, including iToF pixels that generate analog signals for calculating distance by an iToF (indirect Time of Flight) method, are arranged in a two-dimensional grid, an analog-to-digital conversion unit that converts the analog signals into digital signals, and a digital gain processing unit that performs a process of increasing or decreasing the digital signals by a first digital gain and outputting the result, and a process of increasing or decreasing the digital signals within a predetermined area of the pixel array unit by a second digital gain and outputting the result; and a unit that processes a signal from the camera module.
- (2) The distance measuring system according to (1), wherein the unit performs distance measurement calculation processing to calculate the distance by the iToF method based on the output digital signals.
- (3) The distance measuring system according to (2), wherein the pixel array unit is divided into a predetermined number of pixel blocks, and the unit further performs, before the distance measurement calculation processing, a synthesis process in which the amount of received light is calculated for each pixel block and, if the amount of received light is not within a predetermined range, the digital signal corresponding to the first digital gain is replaced with the digital signal corresponding to the second digital gain.
- (4) The distance measuring system according to (1), wherein the camera module further includes a calculation unit that calculates the distance by the iToF method based on the digital signals.
- (5) The distance measuring system according to (1), wherein the camera module further includes a first serializer that converts first parallel data into first serial data and transmits the first serial data to the unit.
- (6) The distance measuring system according to (1), wherein the camera module includes a first image sensor, and the pixel array unit, the analog-to-digital conversion unit, and the digital gain processing unit are disposed within the first image sensor.
- (7) The distance measuring system according to (5), wherein the camera module includes a first image sensor, and the pixel array unit, the analog-to-digital conversion unit, the digital gain processing unit, and the first serializer are arranged within the first image sensor.
- (8) The distance measuring system according to (1), wherein the camera module includes: a first image sensor in which the pixel array unit, the analog-to-digital conversion unit, and the digital gain processing unit are arranged; and a second image sensor in which visible light pixels that receive visible light are arranged.
- (9) The distance measuring system according to (8), wherein the second image sensor further includes an array of IR (Infra-Red) pixels that receive IR light.
- (10) The distance measuring system according to (8), wherein the camera module further includes a serializer that converts parallel data from the first and second image sensors into serial data and transmits the serial data to the unit.
- (11) An image sensor comprising: a pixel array unit in which a plurality of pixels, including iToF pixels that generate analog signals for calculating distance by the iToF method, are arranged in a two-dimensional lattice pattern; an analog-to-digital conversion unit that converts the analog signals into digital signals; and a digital gain processing unit that performs processing to increase or decrease the digital signals by a first digital gain and output the result, and processing to increase or decrease the digital signals within a predetermined area of the pixel array unit by a second digital gain and output the result.
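The two-gain output of the digital gain processing unit and the per-block synthesis process described above can be sketched as follows. This is an illustration under stated assumptions only: the nested-list image representation, the use of a block mean as the "amount of received light", and all function names are hypothetical, not the patent's concrete implementation.

```python
def apply_two_gains(raw, gain1, gain2, area):
    """Produce two outputs per frame: the digital signal scaled by the first
    gain everywhere, and a second output in which the second gain applies
    only inside the predetermined area (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = area
    out1 = [[px * gain1 for px in row] for row in raw]
    out2 = [[px * (gain2 if x0 <= x < x1 and y0 <= y < y1 else gain1)
             for x, px in enumerate(row)]
            for y, row in enumerate(raw)]
    return out1, out2

def synthesize(out1, out2, block, rx):
    """For each pixel block of size (bw, bh), compute the mean of the
    first-gain signal as a stand-in for the received light amount; if it
    falls outside the range rx = (lo, hi), replace that block with the
    second-gain signal."""
    bw, bh = block
    lo, hi = rx
    h, w = len(out1), len(out1[0])
    result = [row[:] for row in out1]
    for by in range(0, h, bh):
        for bx in range(0, w, bw):
            vals = [out1[y][x]
                    for y in range(by, min(by + bh, h))
                    for x in range(bx, min(bx + bw, w))]
            mean = sum(vals) / len(vals)
            if not (lo <= mean <= hi):
                for y in range(by, min(by + bh, h)):
                    for x in range(bx, min(bx + bw, w)):
                        result[y][x] = out2[y][x]
    return result
```

For instance, with gains of x0.5 and x2.0 (the values shown in the area example above), dark pixels whose first-gain block mean falls below the range are replaced by their second-gain counterparts, while in-range blocks keep the first-gain signal.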
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Electromagnetism (AREA)
- Measurement Of Optical Distance (AREA)
Abstract
The present invention improves distance measurement accuracy in a distance measuring system using an indirect time-of-flight (iToF) method. The present distance measuring system comprises a camera module and a unit. The camera module comprises a pixel array unit, an analog-to-digital conversion unit, and a digital gain processing unit. In the pixel array unit, a plurality of pixels, including iToF pixels for generating analog signals for calculating a distance by the iToF method, are arranged in a two-dimensional grid. The analog-to-digital conversion unit converts the analog signals into digital signals. The digital gain processing unit performs: a process of increasing or decreasing the digital signals by a first digital gain and outputting the digital signals; and a process of increasing or decreasing the digital signals within a predefined area of the pixel array unit by a second digital gain and outputting the digital signals. The unit processes the signals from the camera module.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2024021612 | 2024-02-16 | ||
| JP2024-021612 | 2024-02-16 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025173536A1 true WO2025173536A1 (fr) | 2025-08-21 |
Family
ID=96772957
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2025/002740 Pending WO2025173536A1 (fr) | 2024-02-16 | 2025-01-29 | Système de mesure de distance, capteur d'image et procédé de commande de système de mesure de distance |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025173536A1 (fr) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20130176550A1 (en) * | 2012-01-10 | 2013-07-11 | Ilia Ovsiannikov | Image sensor, image sensing method, and image photographing apparatus including the image sensor |
| JP2014103657A (ja) * | 2012-11-20 | 2014-06-05 | Visera Technologies Company Ltd | イメージセンシング装置 |
| WO2018135320A1 (fr) * | 2017-01-19 | 2018-07-26 | ソニーセミコンダクタソリューションズ株式会社 | Élément de réception de lumière, élément d'imagerie et dispositif d'imagerie |
| US20180299549A1 (en) * | 2017-04-18 | 2018-10-18 | Espros Photonics Ag | Optoelectronic sensor device and method for controlling same |
| EP3671277A1 (fr) * | 2018-12-21 | 2020-06-24 | Infineon Technologies AG | Appareil et procédé d'imagerie 3d |
| JP2022019558A (ja) * | 2020-07-15 | 2022-01-27 | 三星電子株式会社 | マルチタブ構造を有する距離ピクセル及びこれを含むタイムオブフライトセンサ |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US12177592B2 (en) | Solid-state imaging element, imaging device, and control method of solid-state imaging element | |
| US10746874B2 (en) | Ranging module, ranging system, and method of controlling ranging module | |
| US11582406B2 (en) | Solid-state image sensor and imaging device | |
| US11523079B2 (en) | Solid-state imaging element and imaging device | |
| JP7245178B2 (ja) | 固体撮像素子、撮像装置、および、固体撮像素子の制御方法 | |
| WO2019150786A1 (fr) | Élément d'imagerie à semi-conducteurs, dispositif d'imagerie et procédé de commande pour élément d'imagerie à semi-conducteurs | |
| CN112640428A (zh) | 固态成像装置、信号处理芯片和电子设备 | |
| US11394913B2 (en) | Solid-state imaging element, electronic device, and method for controlling correction of luminance in the solid-state imaging element | |
| WO2018042887A1 (fr) | Dispositif de mesure de distance et procédé de commande pour dispositif de mesure de distance | |
| WO2021117350A1 (fr) | Élément d'imagerie à semi-conducteurs et dispositif d'imagerie | |
| WO2017175492A1 (fr) | Dispositif de traitement d'image, procédé de traitement d'image, programme informatique et dispositif électronique | |
| US12279053B2 (en) | Information processing device, information processing system, information processing method, and information processing program | |
| CN117546477A (zh) | 成像装置、电子设备及光检测方法 | |
| CN114270798B (zh) | 摄像装置 | |
| JP2020099015A (ja) | センサ及び制御方法 | |
| JP7144926B2 (ja) | 撮像制御装置、撮像装置、および、撮像制御装置の制御方法 | |
| WO2020105301A1 (fr) | Élément d'imagerie à semi-conducteurs et dispositif d'imagerie | |
| WO2025173536A1 (fr) | Système de mesure de distance, capteur d'image et procédé de commande de système de mesure de distance | |
| WO2020100399A1 (fr) | Élément d'imagerie à semi-conducteurs, dispositif d'imagerie, et procédé de contrôle d'élément d'imagerie à semi-conducteurs | |
| WO2022158246A1 (fr) | Dispositif d'imagerie | |
| WO2020090272A1 (fr) | Circuit électronique, élément d'imagerie à semi-conducteurs, et procédé de fabrication de circuit électronique | |
| CN113661700A (zh) | 成像装置与成像方法 | |
| US11201997B2 (en) | Solid-state imaging device, driving method, and electronic apparatus | |
| TW202433738A (zh) | 光檢測裝置、及、光檢測裝置之控制方法 | |
| WO2025249098A1 (fr) | Dispositif de détection, appareil électronique et procédé de commande de dispositif de détection |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 25754862; Country of ref document: EP; Kind code of ref document: A1 |