WO2024135094A1 - Photodetection device and method for controlling photodetection device - Google Patents
Photodetection device and method for controlling photodetection device
- Publication number
- WO2024135094A1 (PCT/JP2023/038835, priority JP2023038835W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- sensor
- line data
- unit
- neural network
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/703—SSIS architectures incorporating pixels for producing signals other than image signals
- H04N25/707—Pixels for event detection
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/47—Image sensors with pixel address output; Event-driven image sensors; Selection of pixels to be read out based on image data
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N25/00—Circuitry of solid-state image sensors [SSIS]; Control thereof
- H04N25/70—SSIS architectures; Circuits associated therewith
- H04N25/76—Addressed sensors, e.g. MOS or CMOS sensors
- H04N25/77—Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components
- H04N25/772—Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components comprising A/D, V/T, V/F, I/T or I/F converters
- H04N25/773—Pixel circuitry, e.g. memories, A/D converters, pixel amplifiers, shared circuits or shared components comprising A/D, V/T, V/F, I/T or I/F converters comprising photon counting circuits, e.g. single photon detection [SPD] or single photon avalanche diodes [SPAD]
Definitions
- This technology relates to a photodetection device. More specifically, it relates to a photodetection device that uses a neural network model and a method for controlling the photodetection device.
- in a conventional technique, pixel information for each image group is input into a corresponding neural network circuit group to speed up processing.
- the higher the pixel output rate, the greater the power consumption and processing delay of the downstream circuit. This causes the output rate to become a bottleneck, making it difficult to further improve performance.
- This technology was developed in light of these circumstances, and aims to improve the performance of photodetection devices that use neural network circuits.
- This technology has been made to solve the problems mentioned above. Its first aspect is a photodetection device, and a method for controlling it, that include: a sensor that reads out, from a pixel array section, sensor data in which multiple pixel data are arranged; a neural network circuit that processes the sensor data based on a neural network model and outputs line data in which multiple processing results are arranged; and a read control section that generates, based on the line data, a read control signal indicating the pixel group to be read in the pixel array section.
- This has the effect of suppressing processing delays and increases in power consumption of the photodetection device.
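The closed loop formed by the sensor, the neural network circuit, and the read control section can be sketched as follows. This is a minimal Python model with hypothetical names; simple thresholding stands in for the actual neural network model:

```python
# Hypothetical sketch of the claimed read-control loop: the sensor emits
# sensor data (a pixel line), the neural network circuit turns it into
# line data (processing results), and the read control section derives
# the next read control signal from that line data.

def neural_network(pixel_line):
    # Stand-in for the neural network circuit: flag pixels with a
    # non-zero value (a real device would run an SNN model).
    return [1 if p > 0 else 0 for p in pixel_line]

def read_control(line_data):
    # Stand-in for the read control section: read a pixel again only
    # if the network produced a non-zero result for it.
    return [bool(r) for r in line_data]

def step(pixel_line):
    line_data = neural_network(pixel_line)
    return read_control(line_data)

mask = step([0, 3, 0, 7])
```

The point of the loop is that the mask fed back to the sensor shrinks the next readout, which is what lowers the output rate.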
- the line data may include first line data and second line data
- the neural network circuit may output the first line data and the second line data in parallel
- the read control unit may compare the first line data with the second line data and generate the read control signal based on the comparison result.
- the neural network model may be a spiking neural network model, and each of the first line data and the second line data may include a plurality of bit strings, each of which may include a plurality of bits indicating spike detection results in chronological order. This provides the effect of controlling readout based on the spike detection results.
- the neural network model may be a spiking neural network model, and each of the first line data and the second line data may include a plurality of bit strings indicating the state values of the membrane potential in chronological order. This provides the effect of controlling the readout based on the state values of the membrane potential.
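As a simplified model of the bit-string encoding described above (an illustrative assumption, not the claimed circuit), each entry of the first or second line data can be held as a string of bits ordered by time, where a set bit marks a spike detection:

```python
# Hypothetical encoding of the first/second line data: one bit string
# per pixel position, bits in chronological order (t1..tm), 1 = spike.
first_line_data = ["1011", "0000", "1110"]   # SLa, one string per x-coordinate
second_line_data = ["0001", "0100", "0111"]  # SLb

def spike_counts(line_data):
    # Count set bits per position; the chronological order is preserved
    # in the string but does not affect the count itself.
    return [s.count("1") for s in line_data]
```

A downstream comparison only needs the per-position counts, while the chronological bit order remains available for time-sensitive processing.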
- a conversion unit may be further provided that converts at least one of the identification information and the sensor data and supplies the converted data to the neural network circuit, and the sensor may output the identification information together with the sensor data. This provides the effect of controlling readout based on the identification information and the sensor data.
- a first FIFO (First In, First Out) memory that holds the sensor data in a first-in, first-out manner and a second FIFO memory that holds the read control signal in a first-in, first-out manner may be provided, and the neural network circuit may read the sensor data from the first FIFO memory, and the sensor may read the read control signal from the second FIFO memory. This provides the effect of buffering the read control signal and the sensor data.
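The two buffers described above can be modeled with first-in, first-out queues. A minimal sketch (names illustrative) using Python's `collections.deque`:

```python
from collections import deque

# Sketch of the two FIFO buffers: the first decouples the sensor from
# the neural network circuit, the second decouples the read control
# section from the sensor.
sensor_fifo = deque()  # first FIFO memory: holds sensor data (PL)
ctrl_fifo = deque()    # second FIFO memory: holds read control signals

sensor_fifo.append([0, 1, 0, 1])             # sensor writes a pixel line
pl = sensor_fifo.popleft()                   # NN circuit reads it in FIFO order
ctrl_fifo.append([True, False, True, True])  # read control section writes Ctrl
ctrl = ctrl_fifo.popleft()                   # sensor reads the control signal
```

Buffering in both directions lets the sensor and the neural network circuit run at different instantaneous rates without stalling each other.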
- the device may further include a first FIFO memory that holds the sensor data in a first-in, first-out manner, and a second FIFO memory that holds the line data in a first-in, first-out manner, and the neural network circuit may read the sensor data from the first FIFO memory, and the read control unit may read the line data from the second FIFO memory. This provides the effect of buffering the sensor data and the line data.
- the device may further include a digital processing unit that reads and processes the sensor data from the first FIFO memory, and a first format processing unit that generates a communication frame that stores the sensor data. This provides the effect of outputting the sensor data to the outside.
- a second format processing unit may be further provided that generates a communication frame in which the line data is stored, and the read control unit may output the line data to the second format processing unit. This provides the effect of outputting the line data to the outside.
- the sensor may be an EVS (Event-based Vision Sensor). This provides the effect of controlling the readout of the EVS.
- the sensor may be a photon measurement circuit that counts photons. This provides the effect of controlling the readout of the photon measurement circuit.
- the sensor may be a CIS (CMOS Image Sensor). This provides the effect of controlling the readout of the CIS.
- the sensor, the neural network circuit, and the read control unit may be distributed among multiple stacked chips. This reduces the circuit scale of each chip.
- FIG. 1 is a block diagram showing a configuration example of a light detection device according to a first embodiment of the present technology.
- FIG. 2 is a block diagram showing a configuration example of a sensor chip according to the first embodiment of the present technology.
- FIG. 3 is a block diagram showing a configuration example of an EVS according to the first embodiment of the present technology.
- FIG. 4 is a circuit diagram showing a configuration example of a pixel according to the first embodiment of the present technology.
- FIG. 5 is a block diagram showing a configuration example of an SNN processor according to the first embodiment of the present technology.
- FIG. 6 is a diagram illustrating an implementation example of an SNN circuit according to the first embodiment of the present technology.
- FIG. 7 is a block diagram showing a configuration example of a core according to the first embodiment of the present technology.
- FIG. 8 is a diagram for explaining a method of generating a read control signal in the first embodiment of the present technology.
- FIG. 9 is a block diagram showing a configuration example of a test pattern generating unit according to the first embodiment of the present technology.
- FIG. 10 is a flowchart showing an example of an operation of the photodetector according to the first embodiment of the present technology.
- FIG. 11 is a block diagram showing a configuration example of an SNN processor according to a first modified example of the first embodiment of the present technology.
- FIG. 12 is a diagram illustrating an example of a state line in a second modified example of the first embodiment of the present technology.
- FIG. 13 is a diagram showing an example of a stacked structure of a sensor chip according to a third modified example of the first embodiment of the present technology.
- FIG. 14 is a circuit diagram showing a configuration example of a pixel according to the third modified example of the first embodiment of the present technology.
- FIG. 15 is a diagram showing an example of a stacked structure of a sensor chip in a fourth modified example of the first embodiment of the present technology.
- FIG. 16 is a block diagram showing a configuration example of a sensor chip according to a second embodiment of the present technology.
- FIG. 17 is a block diagram showing a configuration example of a sensor chip according to a modified example of the second embodiment of the present technology.
- FIG. 18 is a block diagram showing a configuration example of a sensor chip according to a third embodiment of the present technology.
- FIG. 19 is a block diagram showing a configuration example of a photon measurement circuit according to the third embodiment of the present technology.
- FIG. 20 is a circuit diagram showing a configuration example of a pixel according to the third embodiment of the present technology.
- FIG. 21 is a block diagram showing a configuration example of a sensor chip according to a fourth embodiment of the present technology.
- FIG. 22 is a block diagram showing a configuration example of a CIS according to the fourth embodiment of the present technology.
- FIG. 23 is a circuit diagram showing a configuration example of a pixel according to the fourth embodiment of the present technology.
- FIG. 24 is a block diagram showing a schematic configuration example of a vehicle control system.
- FIG. 25 is an explanatory diagram showing an example of an installation position of an imaging unit.
- 1. First embodiment (example of read control based on the output of an SNN circuit)
- 2. Second embodiment (example in which a digital processing unit performs read control based on the output of an SNN circuit)
- 3. Third embodiment (example of controlling readout of a photon measurement circuit based on the output of an SNN circuit)
- 4. Fourth embodiment (example of CIS read control based on the output of an SNN circuit)
- 5. Examples of applications to moving objects
- 1. First embodiment [Configuration example of a light detection device] FIG. 1 is a block diagram showing a configuration example of a light detection device 100 according to the first embodiment of the present technology.
- the light detection device 100 includes an optical unit 110, a sensor chip 200, and a DSP (Digital Signal Processing) circuit 120.
- the light detection device 100 further includes a display unit 130, an operation unit 140, a bus 150, a frame memory 160, a storage unit 170, and a power supply unit 180.
- a digital camera such as a digital still camera, a smartphone, a personal computer, an in-vehicle camera, and the like are assumed.
- the optical unit 110 collects light from the subject and guides it to the sensor chip 200.
- the sensor chip 200 generates and processes multiple pixel data through photoelectric conversion.
- the sensor chip 200 supplies the processed data to the DSP circuit 120.
- the DSP circuit 120 performs a predetermined signal processing on the data from the sensor chip 200. This DSP circuit 120 outputs the processed data to the frame memory 160 etc. via the bus 150.
- the display unit 130 displays image data and the like.
- the display unit 130 may be, for example, a liquid crystal panel or an organic EL (Electro Luminescence) panel.
- the operation unit 140 generates an operation signal in accordance with a user's operation.
- the bus 150 is a common path for the optical unit 110, the sensor chip 200, the DSP circuit 120, the display unit 130, the operation unit 140, the frame memory 160, the storage unit 170, and the power supply unit 180 to exchange data with each other.
- the storage unit 170 stores various data such as image data.
- the power supply unit 180 supplies power to the sensor chip 200, the DSP circuit 120, the display unit 130, etc.
- [Example of sensor chip configuration] FIG. 2 is a block diagram showing a configuration example of the sensor chip 200 according to the first embodiment of the present technology.
- the sensor chip 200 is a single semiconductor chip, and includes an EVS 300 and an SNN processor 500.
- the sensor chip 200 further includes FIFO memories 211 and 212, a test pattern generation unit 220, a digital processing unit 241, a format processing unit 251, and an external communication interface 261.
- the EVS 300 detects changes in luminance for each pixel. This EVS 300 sequentially selects multiple lines in a pixel array section (not shown), and reads out data in which pixel data for each pixel in that line is arranged as a PL (Pixel Line). The EVS 300 then outputs each PL to the FIFO memory 211. Each piece of pixel data includes, for example, a bit that indicates the detection result of the luminance change of that pixel. Note that the EVS 300 is an example of a sensor as recited in the claims. Also, the PL is an example of sensor data as recited in the claims.
- the FIFO memory 211 holds the PL from the EVS 300 in a first-in, first-out manner.
- the PL is read by the test pattern generation unit 220 and the SNN processor 500.
- the FIFO memory 211 is an example of the first FIFO memory described in the claims.
- the SNN processor 500 processes the PL based on the SNN model, and generates a read control signal Ctrl based on the processing result.
- This read control signal Ctrl is a control signal that indicates the pixel group to be read within the pixel array section of the EVS 300.
- the SNN processor 500 outputs the read control signal Ctrl to the FIFO memory 212.
- the FIFO memory 212 holds the read control signal Ctrl from the SNN processor 500 in a first-in, first-out manner.
- the read control signal Ctrl is read by the EVS 300.
- the FIFO memory 212 is an example of the second FIFO memory described in the claims.
- the test pattern generating unit 220 generates a predetermined test pattern in the test mode. This test pattern generating unit 220 supplies the test pattern to the digital processing unit 241 in the test mode, and supplies the PL to the digital processing unit 241 when not in the test mode.
- the test pattern generation unit 220 is arranged as necessary. If the test pattern generation unit 220 is not required, the PL from the FIFO memory 211 is input directly to the digital processing unit 241.
- the digital processing unit 241 performs various digital processing on the PL.
- the digital processing unit 241 supplies the processed PL to the format processing unit 251.
- the format processing unit 251 generates a communication frame that stores the PL. This format processing unit 251 supplies the generated communication frame to the external communication interface 261.
- the external communication interface 261 transmits communication frames from the format processing unit 251 to the DSP circuit 120, etc.
- as a communication standard for the external communication interface 261, for example, MIPI (Mobile Industry Processor Interface) is used.
- [EVS configuration example] FIG. 3 is a block diagram showing a configuration example of the EVS 300 according to the first embodiment of the present technology.
- the EVS 300 includes a drive unit 310, a pixel array unit 320, a timing control circuit 330, and a line scanner 340.
- in the pixel array unit 320, a plurality of pixels 400 are arranged in a two-dimensional lattice pattern.
- the driving unit 310 drives each of the pixels 400.
- the pixels 400 detect whether there is a change in luminance and generate pixel data that indicates the detection result.
- the timing control circuit 330 controls the timing for driving the drive unit 310 and the line scanner 340.
- a vertical synchronization signal is input to the timing control circuit 330.
- the timing control circuit 330 generates a horizontal synchronization signal from the vertical synchronization signal and supplies it to the line scanner 340.
- the line scanner 340 sequentially selects lines (rows, columns, etc.) in synchronization with the horizontal synchronization signal, and reads out the pixel data of each pixel within that line.
- This line scanner 340 arranges the pixel data read from the line in one dimension, and outputs the data to the FIFO memory 211 as PL.
- although the unit of readout here is the line, it can also be an area instead. In this case, the line scanner 340 arranges the pixel data read from the selected area in one dimension in a specified order and outputs it as a PL.
- the driver 310 and line scanner 340 also select the rows and columns to read out according to a read control signal Ctrl from the FIFO memory 212.
- the read control signal Ctrl indicates, for example, the pixel groups to be read out row by row or column by column.
- the read control signal Ctrl can also indicate the pixel groups to be read out area by area. In the initial state, all pixels are read out.
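A minimal model of the read control signal as described above (an illustrative sketch, not the actual signal format): one flag per row, with the initial state selecting all pixels:

```python
# Hypothetical model of the read control signal Ctrl: one flag per row,
# True = the row is read out. In the initial state all rows are read.
def initial_read_control(num_rows):
    return [True] * num_rows

def apply_read_control(frame, ctrl):
    # Return only the rows that the control signal selects; the same
    # scheme works column by column or area by area.
    return [row for row, selected in zip(frame, ctrl) if selected]
```

With this representation, narrowing the readout is just clearing flags, and the downstream data volume scales with the number of set flags.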
- the EVS 300 can also use an arbiter method that reads pixel data without synchronizing with a synchronization signal.
- FIG. 4 is a circuit diagram showing an example of a configuration of a pixel 400 according to the first embodiment of the present technology.
- the pixel 400 includes a pixel circuit 410, a buffer 420, a differentiation circuit 430, and a quantizer 440.
- the pixel circuit 410 includes a photodiode 411, nMOS (negative channel MOS) transistors 412 and 413, and a pMOS (positive channel MOS) transistor 414.
- the photodiode 411 generates a photocurrent by photoelectric conversion of incident light.
- the nMOS transistor 412 is inserted between the power supply and the photodiode 411.
- the pMOS transistor 414 and the nMOS transistor 413 are connected in series between the power supply and a ground terminal.
- the gate of the nMOS transistor 413 is connected to the connection point of the nMOS transistor 412 and the photodiode 411, and a bias voltage Vblog is applied to the gate of the pMOS transistor 414.
- the buffer 420 includes pMOS transistors 421 and 422 connected in series between the power supply and the ground terminal.
- the gate of the pMOS transistor 422 on the ground side is connected to the connection point of the pMOS transistor 414 and the nMOS transistor 413.
- a bias voltage Vbsf is applied to the gate of the pMOS transistor 421 on the power supply side.
- the connection point of the pMOS transistors 421 and 422 is connected to the differentiation circuit 430.
- the above circuit generates a voltage signal corresponding to the photocurrent, which is output from the buffer 420.
- the differentiation circuit 430 includes capacitors 431 and 433, pMOS transistors 432 and 434, and an nMOS transistor 435.
- one end of the capacitor 431 is connected to the buffer 420, and the other end is connected to one end of the capacitor 433 and the gate of the pMOS transistor 434.
- a reset signal xrst is input to the gate of the pMOS transistor 432, and its source and drain are connected to both ends of the capacitor 433.
- the pMOS transistor 434 and the nMOS transistor 435 are connected in series between the power supply and the ground terminal.
- the other end of the capacitor 433 is connected to the connection point of the pMOS transistor 434 and the nMOS transistor 435.
- a bias voltage Vba is applied to the gate of the nMOS transistor 435 on the ground side, and the connection point of the pMOS transistor 434 and the nMOS transistor 435 is also connected to the quantizer 440. With this connection, a differential signal indicating the amount of change in the voltage signal is generated and output to the quantizer 440. The differential signal is also initialized by the reset signal xrst.
- the quantizer 440 includes a pMOS transistor 441 and an nMOS transistor 442 connected in series between a power supply and a ground terminal.
- the gate of the pMOS transistor 441 is connected to the differentiation circuit 430, and a predetermined upper threshold Vbon is applied to the gate of the nMOS transistor 442.
- the voltage signal at the connection point between the pMOS transistor 441 and the nMOS transistor 442 is read by the line scanner 340 as a detection signal of a change in luminance.
- an on-event is detected when the differentiated signal indicating a change in luminance exceeds an upper threshold Vbon.
- the pixel 400 can also detect an off-event when the differentiated signal falls below a lower threshold Vboff.
- a pMOS transistor 443 and an nMOS transistor 444 are added, which are connected in series between the power supply and the ground terminal.
- the gate of the pMOS transistor 443 is connected to the differentiation circuit 430, and the lower threshold Vboff is applied to the gate of the nMOS transistor 444.
- the pixel 400 may detect both an on-event and an off-event, or may detect only one of them.
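The quantizer's decision described above reduces to two threshold comparisons on the differentiated signal. A sketch (threshold values are arbitrary placeholders, not from the source):

```python
# Sketch of the quantizer's decision: compare the differentiated signal
# against the upper threshold Vbon (on-event, luminance increase) and
# the lower threshold Vboff (off-event, luminance decrease).
VBON, VBOFF = 0.2, -0.2  # placeholder threshold values

def quantize(diff_signal):
    if diff_signal > VBON:
        return "on"    # on-event: change exceeded the upper threshold
    if diff_signal < VBOFF:
        return "off"   # off-event: change fell below the lower threshold
    return None        # no event detected
```

A pixel that detects only one event polarity simply omits one of the two comparisons, which matches the single-comparator circuit of the quantizer 440.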
- [Example of SNN processor configuration] FIG. 5 is a block diagram showing an example of a configuration of the SNN processor 500 according to the first embodiment of the present technology.
- the SNN processor 500 includes an SNN circuit 510 and a read control unit 550.
- the EVS 300 reads out the PL from the pixel array section.
- the PL is input to the SNN circuit 510 via the FIFO memory 211.
- the SNN circuit 510 processes the PL based on an SNN model and generates line data in which multiple processing results are arranged as an SL (Spike Line).
- the SL is data in which spike signals output by a row of neurons in the SNN circuit 510 at a certain time are arranged.
- the SNN circuit 510 reads out the SL and outputs it to the read control unit 550.
- the read control unit 550 generates a read control signal Ctrl based on the SL.
- the EVS 300 reads the next PL according to this read control signal Ctrl.
- the configuration illustrated in the figure makes it possible to realize an application in which, for example, the SNN processor 500 recognizes a specific object in an image, specifies an ROI (Region of Interest) including the object via the read control signal Ctrl, and reads out only that ROI.
- the bandwidth between the EVS 300 and the FIFO memory 211 may be insufficient, which may cause processing delays in the circuits after the test pattern generation unit 220.
- the power consumption of the circuits after the test pattern generation unit 220 may increase.
- the SNN processor 500 uses the read control signal Ctrl to specify the next group of pixels to be read, it is possible to reduce the output rate compared to when all pixels are read, thereby suppressing processing delays and increases in power consumption. This makes it possible to improve the performance of the photodetection device 100.
- the SNN circuit 510 also has an input layer 520, an intermediate layer 530, and an output layer 540.
- the input layer 520 receives the PL as input.
- the intermediate layer 530 has one or more layers. Neurons in the previous layer are connected to neurons in the next layer, and the results of calculations in the previous layer are passed to the next layer.
- the output layer 540 generates spike signals asynchronously.
- a pair of neuron rows are arranged.
- the SL output by one neuron row at each time is designated as SLa
- the SL output by the other neuron row at each time is designated as SLb.
- the read control unit 550 compares the output data of each neuron row, and generates a read control signal Ctrl based on the comparison result.
- SLa is an example of the first line data described in the claims
- SLb is an example of the second line data described in the claims.
- FIG. 6 is a diagram showing an example implementation of an SNN circuit 510 in the first embodiment of the present technology.
- the SNN circuit 510 in FIG. 5 is realized, for example, by the circuit in FIG. 6.
- the SNN circuit 510 includes, for example, an input/output interface 560 and a multi-core array 570.
- the input/output interface 560 transmits and receives data between the outside and the multi-core array 570.
- This input/output interface 560 supplies the PL input from the FIFO memory 211 to the multi-core array 570, and reads the SL from the multi-core array 570 and supplies it to the read control unit 550.
- in the multi-core array 570, multiple cores 590 are arranged in a two-dimensional lattice.
- a router 580 is placed adjacent to each core 590.
- the router 580 controls the data path.
- This router 580 includes, for example, FIFO memories 581 to 585 and an arbiter 586.
- E indicates the east direction of the router 580 in question
- S indicates the south direction
- W indicates the west direction
- N indicates the north direction
- L indicates the direction toward the core 590 adjacent to the router 580.
- FIFO memory 581 holds data from the east direction in a first-in-first-out manner and outputs a request to arbiter 586.
- FIFO memory 582 holds data from the south direction in a first-in-first-out manner and outputs a request to arbiter 586.
- FIFO memory 583 holds data from the west direction in a first-in-first-out manner and outputs a request to arbiter 586.
- FIFO memory 584 holds data from the north direction in a first-in-first-out manner and outputs a request to arbiter 586.
- FIFO memory 585 holds data from the adjacent core 590 in a first-in-first-out manner and outputs a request to arbiter 586.
- the external FIFO memory 211 can be eliminated and replaced with a FIFO memory 581 in the SNN circuit 510.
- the arbiter 586 arbitrates requests from each of the FIFO memories 581 to 585 and returns a response. When a response is received, the corresponding FIFO memory outputs its data, via the arbiter 586, in the east, south, west, or north direction, or to the adjacent core 590.
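The request-and-grant behavior of the router can be sketched as a round-robin arbiter over the five directional FIFOs. This is a simplified software model (the real arbitration policy is not specified in the source):

```python
from collections import deque
from itertools import cycle

# Sketch of the router's arbitration: five FIFOs (E, S, W, N, and L for
# the local core) raise a request when non-empty; the arbiter grants
# them in round-robin order.
fifos = {d: deque() for d in "ESWNL"}

def arbitrate(order=cycle("ESWNL")):
    # Grant the first non-empty FIFO in rotation and forward its data.
    # The default-argument cycle deliberately keeps the rotation state
    # across calls, giving round-robin fairness.
    for _ in range(5):
        d = next(order)
        if fifos[d]:
            return d, fifos[d].popleft()
    return None  # no pending requests
```

Each grant pops exactly one item, so a heavily loaded direction cannot starve the others for more than one full rotation.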
- FIG. 7 is a block diagram showing an example of the configuration of a core 590 in the first embodiment of the present technology.
- the core 590 includes a core router 591, a neuron I/O (Input/Output) 592, a multiply-accumulate unit 593, a work memory 594, a membrane potential memory 595, and an LIF (Leaky Integrate and Fire) unit 596.
- the core router 591 supplies data from adjacent routers 580 to the neuron I/O 592 and supplies data from the LIF unit 596 to the adjacent routers 580.
- the multiply-and-accumulate unit 593 uses the work memory 594 to accumulate data from the neuron I/O 592.
- the membrane potential memory 595 holds the membrane potential obtained by integration.
- the LIF unit 596 determines whether the membrane potential has exceeded a predetermined threshold and fired (in other words, a spike has occurred), and supplies the result to the core router 591.
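The per-step update performed by the LIF unit can be written compactly. A minimal sketch, assuming a multiplicative leak and reset-to-zero on firing (both are common LIF conventions, not details given in the source):

```python
# Sketch of one LIF (Leaky Integrate and Fire) step: the membrane
# potential leaks toward zero, integrates the weighted input from the
# multiply-accumulate stage, and fires a spike when it crosses the
# threshold, at which point the potential is reset.
def lif_step(v, weighted_input, leak=0.9, threshold=1.0):
    v = v * leak + weighted_input  # leak, then integrate
    if v >= threshold:
        return 0.0, 1              # fire: reset potential, emit spike
    return v, 0                    # no spike this step
```

In the core 590, the running value of `v` corresponds to the contents of the membrane potential memory 595, and the emitted spike is what the core router 591 forwards.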
- FIG. 8 is a diagram for explaining a method for generating a read control signal in the first embodiment of this technology.
- Each PL includes multiple pixel data.
- Each pixel data is, for example, one bit of information indicating whether an on-event has been detected.
- x0 to xj indicate the x-coordinate of each pixel in the line.
- b in the figure shows an example of the configuration of the output layer 540.
- in the output layer 540, a pair of neurons is arranged for each line. If the number of lines is k, then a neuron row Ra in which neurons 541-1 to 541-k are arranged and a neuron row Rb in which neurons 542-1 to 542-k are arranged are provided.
- the data output by neurons 541-1 to 541-k are denoted as 1a to ka, and the data output by neurons 542-1 to 542-k are denoted as 1b to kb.
- the data output by each neuron includes, for example, spike groups C1a to Cja that were generated within different time periods.
- Spike groups C1a to Cja are data that correspond to each pixel of x-coordinates x1 to xj.
- Each spike group also includes multiple spike signals that were generated within the corresponding time period.
- t1 to tm indicate the times when the spike signals were generated.
- white rectangles indicate that a spike was present, and black rectangles indicate that a spike was not present.
- the data obtained by arranging the spike signals output by each neuron in neuron row Ra corresponds to the aforementioned SLa.
- the data obtained by arranging the spike signals output by each neuron in neuron row Rb corresponds to the aforementioned SLb.
- the read control unit 550 compares the data output by each of the neuron pairs corresponding to the line and generates a read control signal Ctrl.
- the read control unit 550 counts the number of spikes in each spike group of the neuron pair and compares them. For example, when a pattern to be recognized occurs on a certain line, the count value of the spike group of one of the neuron pair corresponding to that line (e.g., 541-1) becomes larger than the count value of the spike group of the other (e.g., 542-1). In this case, for example, when the count value of C1a in a certain line is larger than the count value of C1b, the pixel at x-coordinate x1 on that line is specified as a read target.
- the read control unit 550 compares the count values of spike groups, it can instead input the spike groups to be compared into a softmax function and compare the output values. Also, each spike group has a one-to-one correspondence with a pixel, but a one-to-many correspondence is also possible. Furthermore, the read control unit 550 can adjust the frequency of reading out rows and columns based on the comparison results.
- the read control unit 550 generates and outputs a read control signal Ctrl for each line of y coordinates y1 to yk. This allows the read control unit 550 to specify the pixels to be read on a row or column basis.
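The spike-count comparison described above can be sketched directly. Spike groups are modeled here as lists of 0/1 samples (an illustrative representation; the hardware compares counter values):

```python
# Sketch of the read control unit's comparison: for each pixel position,
# count the spikes in the corresponding spike groups of the two neuron
# rows (SLa vs. SLb) and mark the pixel for readout when the first
# row's count is larger.
def generate_read_control(sla, slb):
    ctrl = []
    for group_a, group_b in zip(sla, slb):
        ctrl.append(sum(group_a) > sum(group_b))
    return ctrl

sla = [[1, 1, 0], [0, 0, 0]]  # spike groups C1a, C2a
slb = [[0, 1, 0], [1, 0, 1]]  # spike groups C1b, C2b
```

The softmax variant mentioned above would replace the raw `sum` with a normalized score before the comparison; the control-signal shape stays the same.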
- [Example of configuration of the test pattern generation unit] FIG. 9 is a block diagram showing an example of a configuration of the test pattern generating unit 220 according to the first embodiment of the present technology.
- the test pattern generating unit 220 includes a test pattern supplying unit 221 and a switch 222.
- when the test mode is set by the control signal MODE, the test pattern supply unit 221 generates a specific test pattern and supplies it to the switch 222.
- the switch 222 supplies a test pattern to the digital processing unit 241 when the test mode is set, and supplies the PL from the FIFO memory 211 to the digital processing unit 241 when a mode other than the test mode is set.
- [Example of operation of the photodetector] FIG. 10 is a flowchart showing an example of the operation of the light detection device 100 according to the first embodiment of the present technology. This operation is started, for example, when a predetermined application for capturing image data is executed.
- the EVS 300 sequentially reads out the PLs in accordance with the read control signal (step S901). In the initial state, all pixels are read out.
- the SNN processor 500 generates the SL (step S902) and generates a read control signal (step S903).
- the digital processing unit 241 performs digital processing on each of the PLs (step S904).
- the format processing unit 251 generates a communication frame by formatting (step S905), and the external communication interface 261 transmits the communication frame to the outside (step S906). After step S906, steps S901 and onwards are repeatedly executed.
- the read control unit 550 generates a read control signal based on the SL, so that the output rate can be lowered and processing delays and increases in power consumption can be suppressed. This can improve the performance of the photodetector 100.
- In the first embodiment described above, only the PL is input to the SNN processor 500; however, line identification information (such as the line number) can also be input so that the SNN processor 500 can identify the line corresponding to the PL.
- the photodetector 100 in the first modified example of the first embodiment differs from the first embodiment in that the EVS 300 inputs the line identification information and the PL to the SNN processor 500.
- FIG. 11 is a block diagram showing an example configuration of an SNN processor 500 in a first modified example of the first embodiment of the present technology.
- the EVS 300 reads the PLs using an arbiter method or a scan method. The EVS 300 then adds identification information (such as a line number) of the corresponding line to each PL, and inputs the information to the SNN processor 500 via the FIFO memory 211.
- the SNN processor 500 further includes a conversion unit 505.
- the conversion unit 505 converts at least one of the identification information Id and the corresponding PL, and supplies them to the SNN circuit 510.
- the conversion unit 505 inputs PL directly to the SNN circuit 510, while converting the identification information Id into vector data or the like and inputting it.
- the conversion unit 505 inputs the identification information Id directly to the SNN circuit 510, while converting PL into frequency information (such as a phase value or a scalar value) using a Fourier transform or the like, and inputs the information.
- the conversion unit 505 can also convert both the identification information Id and PL.
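A hypothetical sketch of the conversion unit 505, assuming a one-hot encoding for the identification information Id and FFT phase values as the frequency information; both choices are illustrative assumptions, not taken from the patent:

```python
import numpy as np

NUM_LINES = 8  # assumed number of lines, for illustration only

def convert(line_id, pl):
    """Sketch of the conversion unit 505: the line identification
    information Id is expanded into a one-hot vector, and the PL is
    converted into frequency information with a Fourier transform
    (here, the phase values of the real FFT)."""
    id_vec = np.zeros(NUM_LINES)
    id_vec[line_id] = 1.0                 # identification -> vector data
    phase = np.angle(np.fft.rfft(pl))     # PL -> phase (frequency) information
    return id_vec, phase

id_vec, phase = convert(3, np.array([0., 1., 1., 0., 1., 0., 0., 1.]))
```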
- the conversion unit 505 converts at least one of the identification information and the PL and inputs it to the SNN circuit 510, so that the subsequent read control unit 550 can identify the line corresponding to the SL from the identification information. This allows the read control unit 550 to generate a read control signal.
- In this manner, in the first modified example of the first embodiment, the conversion unit 505 converts at least one of the identification information and the PL and inputs the result to the SNN circuit 510, so the read control unit 550 can generate the read control signal based on these data.
- In the first embodiment described above, the SNN circuit 510 outputs the SL; instead of the SL, however, it can also output the state values of the membrane potential in chronological order.
- the photodetector 100 in the second modification of the first embodiment differs from the first embodiment in that the SNN circuit 510 outputs the state value of the membrane potential in chronological order.
- FIG. 12 is a diagram showing an example of a state line in the second modified example of the first embodiment of the present technology.
- a pair of neurons such as 541-1 and 542-1 in the second modified example of the first embodiment outputs output data Da and Db.
- Each set of output data includes a plurality of bit strings that indicate the state values of the membrane potential in chronological order.
- t1 to tj indicate the time when the bit strings were output.
- the rectangle below the time is a bit string of two or more bits that indicates the state value at that time. Also, the darker the color of the rectangle, the larger the state value.
- each state value at time t1 to tj corresponds to each pixel at x-coordinates x1 to xj.
- the read control unit 550 compares the state values of each neuron at the same time, and based on the comparison result, determines whether or not to read out the pixel corresponding to that time.
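The comparison of state values by the read control unit 550 can be sketched as follows. The concrete decision rule (read a pixel when the two state values differ by more than a threshold) is an assumption for illustration, not the rule claimed in the patent:

```python
import numpy as np

def read_mask_from_states(da, db, threshold=0):
    """Sketch of the per-time comparison: da and db hold the multi-bit
    membrane-potential state values of a neuron pair at times t1..tj,
    each time corresponding to one x-coordinate. True in the returned
    mask means the pixel at that x-coordinate should be read out."""
    return np.abs(np.asarray(da) - np.asarray(db)) > threshold

mask = read_mask_from_states([5, 2, 7, 1], [5, 6, 1, 1])  # one entry per x-coordinate
```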
- The first modified example can also be applied to the second modified example of the first embodiment.
- the SNN circuit 510 outputs the state values of the membrane potential in chronological order, so that the read control unit 550 can generate a read control signal based on these state values.
- In the first embodiment described above, circuits such as the EVS 300 are arranged on a single semiconductor chip, but this configuration can make it difficult to increase the number of pixels.
- the photodetector 100 in the third modified example of the first embodiment differs from the first embodiment in that circuits are distributed and arranged on two stacked semiconductor chips.
- FIG. 13 is a diagram showing an example of a stacked structure of a sensor chip 200 in a third modified example of the first embodiment of the present technology.
- the sensor chip 200 in the third modified example of the first embodiment includes a pixel chip 201 and a circuit chip 202. These chips are stacked and electrically connected by, for example, Cu-Cu bonding. Note that in addition to Cu-Cu bonding, they can also be connected by vias or bumps.
- FIG. 14 is a circuit diagram showing an example of a configuration of a pixel 400 in a third modified example of the first embodiment of the present technology.
- the pixel circuit 410 is disposed on the pixel chip 201, and the subsequent circuits after the buffer 420 are disposed on the circuit chip 202.
- the circuits arranged on each chip are not limited to those illustrated in the figure.
- the photodiode 411 and the nMOS transistors 412 and 413 can be arranged on the pixel chip 201, and the remaining circuits can be arranged on the circuit chip 202.
- the photodiode 411 can be arranged on the pixel chip 201, and the remaining circuits can be arranged on the circuit chip 202.
- The first and second modified examples can also be applied to the third modified example of the first embodiment.
- the circuits are distributed across two stacked chips, so the circuit scale per chip can be reduced. This makes it easier to achieve a high pixel count.
- In the first embodiment described above, circuits such as the EVS 300 are arranged on a single semiconductor chip, but this configuration can make it difficult to increase the number of pixels.
- the photodetector 100 in the fourth modification of the first embodiment differs from the first embodiment in that circuits are distributed across three stacked semiconductor chips.
- FIG. 15 is a diagram showing an example of a stacked structure of a sensor chip 200 in a fourth modified example of the first embodiment of the present technology.
- the sensor chip 200 includes a stacked pixel chip 201, a circuit chip 202, and a circuit chip 203.
- a part of the pixels of the EVS300 (such as the pixel circuit 410) is disposed on the pixel chip 201, and the remaining circuits of the EVS300 are disposed on the circuit chip 202.
- the circuits subsequent to the FIFO memory 211 are disposed on the circuit chip 203. Note that the circuits disposed on each chip are not limited to those exemplified in the figure. Furthermore, the number of stacked chips is not limited to three, and may be four or more.
- The first and second modified examples can also be applied to the fourth modified example of the first embodiment.
- the circuits are distributed among the three stacked chips, so the circuit scale per chip can be reduced. This makes it easier to achieve a high pixel count.
- In the first embodiment described above, the SNN processor 500 generates the read control signal, but this configuration requires adding the read control unit 550 inside the SNN processor 500.
- the photodetector 100 in this second embodiment differs from the first embodiment in that a digital processing unit in the subsequent stage generates the read control signal.
- FIG. 16 is a block diagram showing an example of the configuration of a sensor chip 200 in a second embodiment of the present technology.
- the sensor chip 200 in the second embodiment differs from the first embodiment in that it further includes a test pattern generating unit 230 and a digital processing unit 242.
- the SNN processor 500 does not generate the read control signal Ctrl, but outputs SL to the FIFO memory 212.
- the test pattern generating unit 230 supplies the test pattern to the digital processing unit 242 when in test mode, and reads the SL from the FIFO memory 212 and supplies it to the digital processing unit 242 when a mode other than the test mode is set.
- the digital processing unit 242 generates a read control signal Ctrl based on SL and outputs it to the EVS 300.
- the digital processing unit 242 is an example of the read control unit described in the claims.
- FIG. 17 is a block diagram showing an example of the configuration of an SNN processor 500 in the second embodiment of the present technology. As shown in the figure, the SNN processor 500 in the second embodiment differs from the first embodiment in that a read control unit 550 is not provided.
- the digital processing unit 242 generates the read control signal, which eliminates the need for the read control unit 550 in the SNN processor 500.
- each of the first to fourth variations of the first embodiment can be applied to the second embodiment.
- the digital processing unit 242 generates the read control signal instead of the SNN processor 500, so the circuit size of the SNN processor 500 can be reduced.
- In the second embodiment described above, the sensor chip 200 does not output the SL or its processing results to the outside, but it can also output them.
- the sensor chip 200 in the modified example of the second embodiment differs from the first embodiment in that it outputs SL and the like to the outside.
- FIG. 18 is a block diagram showing an example of the configuration of a sensor chip 200 in a modified example of the second embodiment of the present technology.
- the sensor chip 200 in this modified example of the second embodiment differs from the second embodiment in that it further includes a format processing unit 252 and an external communication interface 262.
- the digital processing unit 242 generates a read control signal and performs various digital processes on the SL as necessary.
- the digital processing unit 242 outputs the processed SL to the format processing unit 252.
- the format processing unit 252 generates a communication frame that stores the SL and the like. This format processing unit 252 supplies the generated communication frame to the external communication interface 262.
- the external communication interface 262 transmits communication frames from the format processing unit 252 to the DSP circuit 120, etc.
- each of the first to fourth modifications of the first embodiment can be applied to the modification of the second embodiment.
- the sensor chip 200 further outputs SL to the outside, so that a circuit external to the sensor chip 200 can use the data.
- In the first embodiment described above, the EVS 300 is used as the sensor that generates the PL, but a photon measurement circuit that counts photons can be used instead of the EVS 300.
- the light detection device 100 in this third embodiment differs from the first embodiment in that a photon measurement circuit is used instead of the EVS 300.
- FIG. 19 is a block diagram showing an example of the configuration of a sensor chip 200 in a third embodiment of the present technology.
- the sensor chip 200 in this third embodiment differs from the first embodiment in that a photon measurement circuit 600 is provided instead of the EVS 300.
- the photon measurement circuit 600 is an example of a sensor described in the claims.
- FIG. 20 is a block diagram showing an example of the configuration of a photon measurement circuit 600 in the third embodiment of the present technology.
- This photon measurement circuit 600 includes a drive unit 610, a pixel array unit 620, a timing control circuit 640, and a readout processing unit 650.
- [Pixel array unit 620] In the pixel array unit 620, a plurality of pixels 630 are arranged in a two-dimensional lattice pattern.
- the functions of the drive unit 610, pixel array unit 620, timing control circuit 640, and readout processing unit 650 are similar to those of the drive unit 310, pixel array unit 320, timing control circuit 330, and line scanner 340.
- FIG. 21 is a circuit diagram showing an example of a configuration of a pixel 630 in the third embodiment of the present technology.
- This pixel 630 includes a quench resistor 631, a SPAD (Single-Photon Avalanche Diode) 632, an inverter 633, and a photon counter 634.
- the quench resistor 631 and the SPAD 632 are connected in series.
- the inverter 633 inverts the voltage signal at the connection point of the quench resistor 631 and the SPAD 632, and supplies it to the photon counter 634 as a pulse signal.
- the photon counter 634 counts the number of pulses in the pulse signal, reads out pixel data indicating the count value, and supplies it to the readout processing unit 650.
- each pixel data in the PL is a bit string of 2 or more bits indicating the count value.
- a conversion circuit that converts the bit string for each pixel into 1 bit is inserted before the SNN circuit 510.
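The 1-bit conversion inserted before the SNN circuit 510 can be sketched as simple thresholding; the threshold value is an assumption for illustration, as the patent does not fix the conversion rule:

```python
import numpy as np

def to_spikes(counts, threshold=1):
    """Sketch of the conversion circuit in front of the SNN circuit 510:
    each pixel's multi-bit count value is reduced to a single bit by
    thresholding (the threshold value is an assumption)."""
    return (np.asarray(counts) >= threshold).astype(np.uint8)

bits = to_spikes([0, 3, 1, 0, 7])  # photon counts -> 1-bit spike data
```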
- circuit configuration of pixel 630 is not limited to the example shown in the figure, as long as it is capable of counting photons.
- the photon measurement circuit 600 is provided in place of the EVS 300, so that processing delays and increases in power consumption can be suppressed in the stages downstream of the photon measurement circuit 600.
- In the first embodiment described above, the EVS 300 is used as the sensor that generates the PL, but a CIS (CMOS Image Sensor) can be used instead of the EVS 300.
- the photodetector 100 in the fourth embodiment differs from the first embodiment in that a CIS is used instead of the EVS 300.
- FIG. 22 is a block diagram showing an example of the configuration of a sensor chip 200 in a fourth embodiment of the present technology.
- the sensor chip 200 in this fourth embodiment differs from the first embodiment in that a CIS 700 is provided instead of an EVS 300.
- the CIS 700 is an example of a sensor described in the claims.
- FIG. 23 is a block diagram showing an example of the configuration of a CIS 700 in the fourth embodiment of the present technology.
- This CIS 700 includes a vertical scanning circuit 710, a timing control circuit 720, a DAC (Digital to Analog Converter) 730, a pixel array section 740, a column ADC 760, and a horizontal transfer scanning circuit 770.
- Pixels 750 are arranged in a two-dimensional grid in the pixel array section 740.
- the vertical scanning circuit 710 sequentially selects and drives the rows, outputting analog pixel signals to the column ADC 760.
- the timing control circuit 720 generates a horizontal synchronization signal from the vertical synchronization signal and supplies it to the horizontal transfer scanning circuit 770.
- the DAC 730 generates a predetermined reference signal and supplies it to the column ADC 760.
- a sawtooth ramp signal is used as the reference signal.
- the column ADC 760 has an ADC for each column and performs AD (Analog to Digital) conversion on the pixel signals of each column.
- the column ADC 760 generates a PL under the control of the horizontal transfer scanning circuit 770 and outputs it to the FIFO memory 211.
- the horizontal transfer scanning circuit 770 controls the column ADC 760 to output pixel data in sequence.
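The single-slope conversion performed by each column ADC against the ramp reference from the DAC 730 can be illustrated as counting clock cycles until the ramp crosses the pixel level. This is a simplified sketch; the step size, bit depth, and integer-millivolt representation are assumptions for illustration:

```python
def single_slope_adc(pixel_mv, ramp_step_mv=1, num_bits=10):
    """Count clock cycles until the ramp reference crosses the pixel level;
    the final count is the digital pixel value. Integer millivolts keep
    the sketch exact, and the counter saturates at the full-scale code."""
    count, ramp = 0, 0
    max_count = (1 << num_bits) - 1
    while ramp < pixel_mv and count < max_count:
        ramp += ramp_step_mv
        count += 1
    return count

code = single_slope_adc(250)  # a 250 mV pixel signal yields a count of 250
```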
- each pixel data in the PL is a bit string of 2 or more bits that indicates the gradation value of that pixel.
- a conversion circuit that converts the bit string for each pixel to 1 bit is inserted before the SNN circuit 510.
- FIG. 24 is a circuit diagram showing an example of the configuration of a pixel 750 in the fourth embodiment of the present technology. This pixel 750 includes a photodiode 751, a transfer transistor 752, a reset transistor 753, a floating diffusion layer 754, an amplifier transistor 755, and a selection transistor 756.
- the photodiode 751 converts incident light into electricity to generate an electric charge.
- the transfer transistor 752 transfers the electric charge from the photodiode 751 to the floating diffusion layer 754 in accordance with a transfer signal TRG from the vertical scanning circuit 710.
- the reset transistor 753 extracts charge from the floating diffusion layer 754 to initialize it in accordance with a reset signal RST from the vertical scanning circuit 710.
- the floating diffusion layer 754 accumulates charge and generates a voltage according to the amount of charge.
- the amplification transistor 755 amplifies the voltage of the floating diffusion layer 754.
- the selection transistor 756 outputs the amplified voltage signal as a pixel signal according to the selection signal SEL from the vertical scanning circuit 710.
- vertical signal lines 759 are wired for each column in the pixel array section 740, and the pixel signals of the pixels 750 in a column are output to the column ADC 760 via the vertical signal line 759 of that column.
- circuit configuration of pixel 750 is not limited to the configuration illustrated in the figure, as long as it is capable of generating an analog pixel signal.
- the technology according to the present disclosure can be applied to various products.
- the technology according to the present disclosure may be realized as a device mounted on any type of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
- FIG. 25 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile object control system to which the technology disclosed herein can be applied.
- the vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
- the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside vehicle information detection unit 12030, an inside vehicle information detection unit 12040, and an integrated control unit 12050.
- Also shown as functional components of the integrated control unit 12050 are a microcomputer 12051, an audio/video output unit 12052, and an in-vehicle network I/F (interface) 12053.
- the drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs.
- the drive system control unit 12010 functions as a control device for a drive force generating device for generating the drive force of the vehicle, such as an internal combustion engine or a drive motor, a drive force transmission mechanism for transmitting the drive force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, and a braking device for generating a braking force for the vehicle.
- the body system control unit 12020 controls the operation of various devices installed in the vehicle body according to various programs.
- the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, tail lamps, brake lamps, turn signals, and fog lamps.
- radio waves or signals from various switches transmitted from a portable device that replaces a key can be input to the body system control unit 12020.
- the body system control unit 12020 accepts the input of these radio waves or signals and controls the vehicle's door lock device, power window device, lamps, etc.
- the outside-vehicle information detection unit 12030 detects information outside the vehicle equipped with the vehicle control system 12000.
- the image capturing unit 12031 is connected to the outside-vehicle information detection unit 12030.
- the outside-vehicle information detection unit 12030 causes the image capturing unit 12031 to capture images outside the vehicle and receives the captured images.
- the outside-vehicle information detection unit 12030 may perform object detection processing or distance detection processing for people, cars, obstacles, signs, or characters on the road surface based on the received images.
- the imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal according to the amount of light received.
- the imaging unit 12031 can output the electrical signal as an image, or as distance measurement information.
- the light received by the imaging unit 12031 may be visible light, or may be invisible light such as infrared light.
- the in-vehicle information detection unit 12040 detects information inside the vehicle.
- a driver state detection unit 12041 that detects the state of the driver is connected.
- the driver state detection unit 12041 includes, for example, a camera that captures an image of the driver, and the in-vehicle information detection unit 12040 may calculate the driver's degree of fatigue or concentration based on the detection information input from the driver state detection unit 12041, or may determine whether the driver is dozing off.
- the microcomputer 12051 can calculate control target values for the driving force generating device, steering mechanism, or braking device based on information inside and outside the vehicle acquired by the outside-vehicle information detection unit 12030 or the inside-vehicle information detection unit 12040, and output control commands to the drive system control unit 12010.
- the microcomputer 12051 can perform cooperative control aimed at realizing the functions of an Advanced Driver Assistance System (ADAS), including vehicle collision avoidance or impact mitigation, following driving based on the distance between vehicles, maintaining vehicle speed, vehicle collision warning, or vehicle lane departure warning.
- the microcomputer 12051 can also control the driving force generating device, steering mechanism, braking device, etc. based on information about the surroundings of the vehicle acquired by the outside vehicle information detection unit 12030 or the inside vehicle information detection unit 12040, thereby performing cooperative control aimed at automatic driving, which allows the vehicle to travel autonomously without relying on the driver's operation.
- the microcomputer 12051 can also output control commands to the body system control unit 12020 based on information outside the vehicle acquired by the outside-vehicle information detection unit 12030. For example, the microcomputer 12051 can control the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the outside-vehicle information detection unit 12030, and perform cooperative control aimed at preventing glare, such as switching high beams to low beams.
- the audio/image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying the occupants of the vehicle or the outside of the vehicle of information.
- an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are exemplified as output devices.
- the display unit 12062 may include, for example, at least one of an on-board display and a head-up display.
- FIG. 26 shows an example of the installation position of the imaging unit 12031.
- the imaging unit 12031 includes imaging units 12101, 12102, 12103, 12104, and 12105.
- the imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at the front nose, side mirrors, rear bumper, back door, and upper part of the windshield inside the vehicle cabin of the vehicle 12100.
- the imaging unit 12101 provided at the front nose and the imaging unit 12105 provided at the upper part of the windshield inside the vehicle cabin mainly acquire images of the front of the vehicle 12100.
- the imaging units 12102 and 12103 provided at the side mirrors mainly acquire images of the sides of the vehicle 12100.
- the imaging unit 12104 provided at the rear bumper or back door mainly acquires images of the rear of the vehicle 12100.
- the imaging unit 12105 provided at the upper part of the windshield inside the vehicle cabin is mainly used to detect leading vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, etc.
- FIG. 26 shows an example of the imaging ranges of the imaging units 12101 to 12104.
- Imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose
- imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively
- imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or back door.
- an overhead image of the vehicle 12100 viewed from above is obtained by superimposing the image data captured by the imaging units 12101 to 12104.
- At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
- at least one of the imaging units 12101 to 12104 may be a stereo camera consisting of multiple imaging elements, or an imaging element having pixels for detecting phase differences.
- the microcomputer 12051 can obtain the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the change in this distance over time (i.e., the relative speed with respect to the vehicle 12100) based on the distance information obtained from the imaging units 12101 to 12104, and can extract as the preceding vehicle, in particular, the closest three-dimensional object on the path of the vehicle 12100 that is traveling in approximately the same direction as the vehicle 12100 at a predetermined speed (e.g., 0 km/h or faster). Furthermore, the microcomputer 12051 can set in advance the inter-vehicle distance to be maintained from the preceding vehicle, and perform automatic braking control (including follow-up stop control) and automatic acceleration control (including follow-up start control). In this way, cooperative control can be performed for the purpose of automatic driving, in which the vehicle travels autonomously without relying on the driver's operation.
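The relative-speed computation described above (change of the measured distance over time) reduces to simple arithmetic; the numbers below are illustrative, not from the document:

```python
def relative_speed_kmh(dist_t0_m, dist_t1_m, dt_s):
    """Relative speed as the change of the measured distance over time,
    converted from m/s to km/h; negative values mean the object is closing."""
    return (dist_t1_m - dist_t0_m) / dt_s * 3.6

v = relative_speed_kmh(50.0, 49.0, 0.1)  # closing at 10 m/s, about -36 km/h
```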
- the microcomputer 12051 classifies three-dimensional object data into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects based on the distance information obtained from the imaging units 12101 to 12104, extracts the data, and can use it to automatically avoid obstacles.
- the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult to see.
- the microcomputer 12051 determines the collision risk, which indicates the risk of collision with each obstacle, and when the collision risk is equal to or exceeds a set value and there is a possibility of a collision, it can provide driving assistance for collision avoidance by outputting an alarm to the driver via the audio speaker 12061 or the display unit 12062, or by forcibly decelerating or steering to avoid a collision via the drive system control unit 12010.
- At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
- the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the captured image of the imaging units 12101 to 12104. The recognition of such a pedestrian is performed, for example, by a procedure of extracting feature points in the captured image of the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points that indicate the contour of an object to determine whether or not it is a pedestrian.
- the audio/image output unit 12052 controls the display unit 12062 to superimpose a rectangular contour line for emphasis on the recognized pedestrian.
- the audio/image output unit 12052 may also control the display unit 12062 to display an icon or the like indicating a pedestrian at a desired position.
- the technology disclosed herein can be applied to, for example, the imaging unit 12031.
- the light detection device 100 in FIG. 1 can be applied to the imaging unit 12031.
- the present technology can also be configured as follows.
- (1) A light detection device comprising: a sensor that reads out sensor data in which a plurality of pixel data are arranged from a pixel array unit; a neural network circuit that processes the sensor data based on a neural network model and outputs line data in which a plurality of processing results are arranged; and a read control unit that generates a read control signal indicating a pixel group to be read out in the pixel array unit based on the line data.
- (2) The light detection device according to (1), wherein the line data includes first line data and second line data, the neural network circuit outputs the first line data and the second line data in parallel, and the read control unit compares the first line data with the second line data and generates the read control signal based on a result of the comparison.
- (3) The photodetection device according to (2), wherein the neural network model is a spiking neural network model, each of the first line data and the second line data includes a plurality of bit strings, and each of the plurality of bit strings includes a plurality of bits that indicate spike detection results in chronological order.
- (4) The photodetection device according to (2), wherein the neural network model is a spiking neural network model, and each of the first line data and the second line data includes a plurality of bit strings indicating state values of membrane potential in chronological order.
- (5) The optical detection device according to any one of (1) to (4), wherein the sensor outputs identification information together with the sensor data, the device further comprising a conversion unit that converts at least one of the identification information and the sensor data and supplies the converted data to the neural network circuit.
- (6) The photodetection device according to any one of (1) to (5), further comprising: a first FIFO (First In, First Out) memory that holds the sensor data in a first-in, first-out manner; and a second FIFO memory that holds the read control signal in a first-in, first-out manner, wherein the neural network circuit reads the sensor data from the first FIFO memory, and the sensor reads the read control signal from the second FIFO memory.
- (7) The light detection device according to any one of (1) to (5), further comprising: a first FIFO memory that holds the sensor data in a first-in, first-out manner; and a second FIFO memory that holds the line data in a first-in, first-out manner, wherein the neural network circuit reads the sensor data from the first FIFO memory, and the read control unit reads the line data from the second FIFO memory.
- (8) The light detection device according to (7), further comprising: a digital processing unit that reads the sensor data from the first FIFO memory and processes the sensor data; and a first format processing unit that generates a communication frame in which the sensor data is stored.
- (9) further comprising a second format processing unit that generates a communication frame storing the line data;
- (13) The photodetector according to any one of (1) to (12), wherein the sensor, the neural network circuit, and the readout control unit are distributed across multiple stacked chips.
- (14) A method for controlling a light detection device, comprising: a readout procedure in which a sensor reads out sensor data, in which a plurality of pixel data are arranged, from a pixel array unit; a procedure of processing the sensor data based on a neural network model and outputting line data in which a plurality of processing results are arranged; and a read control procedure of generating a read control signal for designating a pixel group to be read out in the pixel array unit based on the line data.
Description
1. First embodiment (example of performing readout control based on the output of the SNN circuit)
2. Second embodiment (example in which a digital processing unit performs readout control based on the output of the SNN circuit)
3. Third embodiment (example of performing readout control of a photon counting circuit based on the output of the SNN circuit)
4. Fourth embodiment (example of performing readout control of a CIS based on the output of the SNN circuit)
5. Example of application to a moving body
[光検出装置の構成例]
図1は、本技術の第1の実施の形態における光検出装置100の一構成例を示すブロック図である。この光検出装置100は、光学部110、センサーチップ200およびDSP(Digital Signal Processing)回路120を備える。さらに光検出装置100は、表示部130、操作部140、バス150、フレームメモリ160、記憶部170および電源部180を備える。光検出装置100としては、例えば、デジタルスチルカメラなどのデジタルカメラの他、スマートフォンやパーソナルコンピュータ、車載カメラ等が想定される。
図2は、本技術の第1の実施の形態におけるセンサーチップ200の一構成例を示すブロック図である。このセンサーチップ200は、単一の半導体チップであり、EVS300およびSNNプロセッサ500を備える。さらに、センサーチップ200は、FIFOメモリ211および212と、テストパターン生成部220と、デジタル処理部241と、フォーマット処理部251と、外部通信インターフェース261とを備える。
Fig. 3 is a block diagram showing a configuration example of the EVS 300 according to the first embodiment of the present technology. The EVS 300 includes a drive unit 310, a pixel array unit 320, a timing control circuit 330, and a line scanner 340. In the pixel array unit 320, a plurality of pixels 400 are arranged in a two-dimensional lattice.
Fig. 4 is a circuit diagram showing a configuration example of the pixel 400 according to the first embodiment of the present technology. The pixel 400 includes a pixel circuit 410, a buffer 420, a differentiating circuit 430, and a quantizer 440.
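The pixel chain of Fig. 4 (photoreceptor circuit, buffer, differentiating circuit 430, quantizer 440) can be illustrated with a small behavioral model. This is only a sketch: the logarithmic response, the threshold value, and the reset rule are illustrative assumptions, not values taken from the specification.

```python
import math

def evs_pixel_events(intensities, threshold=0.2):
    """Behavioral sketch of an EVS pixel: the differentiating circuit
    tracks the change in log photocurrent since the last event, and the
    quantizer emits a +1 (ON) or -1 (OFF) event when it crosses a
    threshold, after which the reference level is reset."""
    events = []
    ref = math.log(intensities[0])  # reference level after the last event
    for t, i in enumerate(intensities[1:], start=1):
        diff = math.log(i) - ref
        if diff >= threshold:
            events.append((t, +1))  # ON event: brightness increased
            ref = math.log(i)
        elif diff <= -threshold:
            events.append((t, -1))  # OFF event: brightness decreased
            ref = math.log(i)
    return events

print(evs_pixel_events([1.0, 1.0, 1.5, 1.5, 0.9]))  # → [(2, 1), (4, -1)]
```

Only changes in brightness produce output, which is why the sensor data PL is sparse line data of events rather than a full intensity frame.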
Fig. 5 is a block diagram showing a configuration example of the SNN processor 500 according to the first embodiment of the present technology. The SNN processor 500 includes an SNN circuit 510 and a readout control unit 550.
Fig. 9 is a block diagram showing a configuration example of the test pattern generation unit 220 according to the first embodiment of the present technology. The test pattern generation unit 220 includes a test pattern supply unit 221 and a switch 222.
Fig. 10 is a flowchart showing an example of the operation of the light detection device 100 according to the first embodiment of the present technology. This operation starts, for example, when a predetermined application for capturing image data is executed.
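The feedback loop implied by Figs. 2, 5, and 10 — the EVS reads a line (PL), the SNN circuit turns it into line data (SL), and the readout control unit derives a read control signal selecting the pixel group to read next — can be sketched as follows. The function names and the "re-read only lines whose result changed" rule are assumptions chosen for illustration, not the patented comparison itself.

```python
def readout_loop(sensor_lines, snn_process, num_lines):
    """Sketch of SNN-driven readout control: each iteration consumes one
    line of sensor data, computes line data with the SNN model, and
    updates a per-line read mask (the read control signal) by comparing
    the new line data with the previous one."""
    prev_line_data = None
    read_mask = [True] * num_lines       # initially read every line
    masks = []
    for pl in sensor_lines:              # PL: sensor data from the EVS
        sl = snn_process(pl)             # SL: line data from the SNN circuit
        if prev_line_data is not None:
            # flag only the pixel groups whose processing result changed
            read_mask = [a != b for a, b in zip(sl, prev_line_data)]
        prev_line_data = sl
        masks.append(list(read_mask))
    return masks
```

With an identity `snn_process`, `readout_loop([[0, 1], [0, 1], [1, 1]], lambda pl: pl, 2)` keeps reading both lines first, then masks out the unchanged ones.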
In the first embodiment described above, only PL was input to the SNN processor 500; however, line identification information (such as a line number) can also be input so that the SNN processor 500 can identify the line to which PL corresponds. The light detection device 100 according to the first modification of the first embodiment differs from the first embodiment in that the EVS 300 inputs the line identification information and PL to the SNN processor 500.
In the first embodiment described above, the SNN circuit 510 output SL; however, instead of SL, state values of the membrane potential can be output in time-series order. The light detection device 100 according to the second modification of the first embodiment differs from the first embodiment in that the SNN circuit 510 outputs membrane potential state values in time-series order.
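A membrane-potential output of this kind arises naturally from a leaky integrate-and-fire (LIF) update of the sort an LIF unit could implement. The sketch below is a minimal model; the decay factor, threshold, and reset-to-zero rule are illustrative assumptions.

```python
def lif_trace(inputs, decay=0.5, threshold=1.0):
    """Leaky integrate-and-fire sketch: the membrane potential leaks by
    `decay` each time step, accumulates the input current, and is reset
    to zero after a spike. Returns the potential state values in
    time-series order together with the spike bits."""
    v = 0.0
    potentials, spikes = [], []
    for x in inputs:
        v = v * decay + x          # leak, then integrate the input
        fired = v >= threshold
        spikes.append(1 if fired else 0)
        potentials.append(v)       # state value before any reset
        if fired:
            v = 0.0                # reset the membrane potential
    return potentials, spikes
```

The `spikes` list corresponds to the spike-bit output of the first embodiment; the `potentials` list corresponds to the time-series membrane potential state values of this second modification.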
In the first embodiment described above, the circuits such as the EVS 300 were arranged on a single semiconductor chip; however, this configuration can make it difficult to increase the number of pixels. The light detection device 100 according to the third modification of the first embodiment differs from the first embodiment in that the circuits are distributed across two stacked semiconductor chips.
In the first embodiment described above, the circuits such as the EVS 300 were arranged on a single semiconductor chip; however, this configuration can make it difficult to increase the number of pixels. The light detection device 100 according to the fourth modification of the first embodiment differs from the first embodiment in that the circuits are distributed across three stacked semiconductor chips.
In the first embodiment described above, the SNN processor 500 generated the read control signal; however, this configuration requires the readout control unit 550 to be added inside the SNN processor 500. The light detection device 100 according to the second embodiment differs from the first embodiment in that a downstream digital processing unit generates the read control signal.
In the second embodiment described above, the sensor chip 200 did not externally output SL or its processing result; however, these can also be output externally. The sensor chip 200 according to the modification of the second embodiment differs from the second embodiment in that it externally outputs SL and the like.
In the first embodiment described above, the EVS 300 was used as the sensor that generates PL; however, a photon counting circuit that counts photons can be used instead of the EVS 300. The light detection device 100 according to the third embodiment differs from the first embodiment in that a photon counting circuit is used instead of the EVS 300.
…, the second embodiment, and the modification of the second embodiment can each also be applied to this third embodiment.
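In this third embodiment the sensor data is an array of per-pixel photon counts: each SPAD avalanche is shaped into a digital pulse and tallied by a photon counter (elements 631 to 634 in the reference list). A toy model of one exposure window, with all numbers illustrative:

```python
def count_photons(pulse_trains):
    """Sketch of a photon counting sensor line: each inner list is the
    stream of detected photon pulses (1 = avalanche pulse) for one pixel
    during an exposure window; the sensor data is the array of counts."""
    return [sum(pulses) for pulses in pulse_trains]

# one exposure window, three pixels
print(count_photons([[1, 0, 1, 1], [0, 0, 0, 0], [1, 1, 1, 1]]))  # → [3, 0, 4]
```

The resulting count array plays the same role as PL from the EVS: it is the line data that the SNN processor consumes to decide which pixel groups to read next.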
In the first embodiment described above, the EVS 300 was used as the sensor that generates PL; however, a CIS can be used instead of the EVS 300. The light detection device 100 according to the fourth embodiment differs from the first embodiment in that a CIS is used instead of the EVS 300.
…, the second embodiment, and the modification of the second embodiment can each also be applied to this fourth embodiment.
The technology according to the present disclosure (the present technology) can be applied to a variety of products. For example, the technology according to the present disclosure may be realized as a device mounted on any type of mobile body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
(1) A light detection device comprising:
a sensor that reads out, from a pixel array unit, sensor data in which a plurality of pieces of pixel data are arrayed;
a neural network circuit that processes the sensor data based on a neural network model and outputs line data in which a plurality of processing results are arrayed; and
a readout control unit that generates a read control signal designating a pixel group to be read out in the pixel array unit based on the line data.
(2) The light detection device according to (1), wherein
the line data includes first line data and second line data,
the neural network circuit outputs the first line data and the second line data in parallel, and
the readout control unit compares the first line data with the second line data and generates the read control signal based on the comparison result.
(3) The light detection device according to (2), wherein
the neural network model is a spiking neural network model,
each of the first line data and the second line data includes a plurality of bit strings, and
each of the plurality of bit strings includes a plurality of bits indicating spike detection results in time-series order.
(4) The light detection device according to (2), wherein
the neural network model is a spiking neural network model, and
each of the first line data and the second line data includes a plurality of bit strings indicating membrane potential state values in time-series order.
(5) The light detection device according to any one of (1) to (4), further comprising a conversion unit that converts at least one of identification information and the sensor data and supplies the result to the neural network circuit,
wherein the sensor outputs the identification information together with the sensor data.
(6) The light detection device according to any one of (1) to (5), further comprising:
a first FIFO (First In, First Out) memory that holds the sensor data in a first-in, first-out manner; and
a second FIFO memory that holds the read control signal in a first-in, first-out manner,
wherein the neural network circuit reads the sensor data from the first FIFO memory, and
the sensor reads the read control signal from the second FIFO memory.
(7) The light detection device according to any one of (1) to (5), further comprising:
a first FIFO memory that holds the sensor data in a first-in, first-out manner; and
a second FIFO memory that holds the line data in a first-in, first-out manner,
wherein the neural network circuit reads the sensor data from the first FIFO memory, and
the readout control unit reads the line data from the second FIFO memory.
(8) The light detection device according to (7), further comprising:
a digital processing unit that reads the sensor data from the first FIFO memory and processes the sensor data; and
a first format processing unit that generates a communication frame storing the sensor data.
(9) The light detection device according to (8), further comprising a second format processing unit that generates a communication frame storing the line data,
wherein the readout control unit outputs the line data to the second format processing unit.
(10) The light detection device according to any one of (1) to (9), wherein the sensor is an EVS (Event-based Vision Sensor).
(11) The light detection device according to any one of (1) to (9), wherein the sensor is a photon counting circuit that counts photons.
(12) The light detection device according to any one of (1) to (9), wherein the sensor is a CIS (CMOS Image Sensor).
(13) The light detection device according to any one of (1) to (12), wherein the sensor, the neural network circuit, and the readout control unit are distributed across a plurality of stacked chips.
(14) A method of controlling a light detection device, comprising:
a procedure in which a sensor reads out, from a pixel array unit, sensor data in which a plurality of pieces of pixel data are arrayed;
a procedure of processing the sensor data based on a neural network model and outputting line data in which a plurality of processing results are arrayed; and
a readout control procedure of generating a read control signal designating a pixel group to be read out in the pixel array unit based on the line data.
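Points (2) and (3) above describe comparing first line data and second line data made of spike bit strings. One plausible reading of that comparison — flagging for readout the pixel groups whose spike trains differ — can be sketched as follows; the mismatch rule is an assumption for illustration, not the claimed comparison itself.

```python
def read_control_signal(first_line_data, second_line_data):
    """Sketch of the readout control of points (2)/(3): each element of a
    line data is a bit string of spike detection results in time-series
    order; a pixel group is flagged for readout where the two line data
    disagree."""
    return [
        b1 != b2   # mismatching spike trains -> read this pixel group
        for b1, b2 in zip(first_line_data, second_line_data)
    ]

# two line data output in parallel by the neural network circuit
first = ["0101", "0011", "1110"]
second = ["0101", "0111", "1110"]
print(read_control_signal(first, second))  # → [False, True, False]
```

The resulting Boolean array is one concrete form a read control signal designating the pixel groups to be read out could take.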
110 Optical unit
120 DSP circuit
130 Display unit
140 Operation unit
150 Bus
160 Frame memory
170 Storage unit
180 Power supply unit
200 Sensor chip
201 Pixel chip
202, 203 Circuit chips
211, 212, 581 to 585 FIFO memories
220, 230 Test pattern generation units
221 Test pattern supply unit
222 Switch
241, 242 Digital processing units
251, 252 Format processing units
261, 262 External communication interfaces
300 EVS
310, 610 Drive units
320, 620, 740 Pixel array units
330, 640, 720 Timing control circuits
340 Line scanner
400, 630, 750 Pixels
410 Pixel circuit
411, 751 Photodiodes
412, 413, 435, 442, 444 nMOS transistors
414, 421, 422, 432, 434, 441, 443 pMOS transistors
420 Buffer
430 Differentiating circuit
431, 433 Capacitors
440 Quantizer
500 SNN processor
505 Conversion unit
510 SNN circuit
520 Input layer
530 Intermediate layer
540 Output layer
541-1 to 541-k, 542-1 to 542-k Neurons
550 Readout control unit
560 Input/output interface
570 Multi-core array
580 Router
586 Arbiter
590 Core
591 Core router
592 Neuron I/O
593 Multiply-accumulate unit
594 Work memory
595 Membrane potential memory
596 LIF unit
600 Photon counting circuit
631 Quench resistor
632 SPAD
633 Inverter
634 Photon counter
650 Readout processing unit
700 CIS
710 Vertical scanning circuit
730 DAC
752 Transfer transistor
753 Reset transistor
754 Floating diffusion layer
755 Amplification transistor
756 Selection transistor
760 Column ADC
770 Horizontal transfer scanning circuit
12031 Imaging unit
Claims (14)
- A light detection device comprising:
a sensor that reads out, from a pixel array unit, sensor data in which a plurality of pieces of pixel data are arrayed;
a neural network circuit that processes the sensor data based on a neural network model and outputs line data in which a plurality of processing results are arrayed; and
a readout control unit that generates a read control signal designating a pixel group to be read out in the pixel array unit based on the line data. - The light detection device according to claim 1, wherein the line data includes first line data and second line data,
the neural network circuit outputs the first line data and the second line data in parallel, and
the readout control unit compares the first line data with the second line data and generates the read control signal based on the comparison result. - The light detection device according to claim 2, wherein the neural network model is a spiking neural network model,
each of the first line data and the second line data includes a plurality of bit strings, and
each of the plurality of bit strings includes a plurality of bits indicating spike detection results in time-series order. - The light detection device according to claim 2, wherein the neural network model is a spiking neural network model, and
each of the first line data and the second line data includes a plurality of bit strings indicating membrane potential state values in time-series order. - The light detection device according to claim 1, further comprising a conversion unit that converts at least one of identification information and the sensor data and supplies the result to the neural network circuit,
wherein the sensor outputs the identification information together with the sensor data. - The light detection device according to claim 1, further comprising:
a first FIFO (First In, First Out) memory that holds the sensor data in a first-in, first-out manner; and
a second FIFO memory that holds the read control signal in a first-in, first-out manner,
wherein the neural network circuit reads the sensor data from the first FIFO memory, and
the sensor reads the read control signal from the second FIFO memory. - The light detection device according to claim 1, further comprising:
a first FIFO memory that holds the sensor data in a first-in, first-out manner; and
a second FIFO memory that holds the line data in a first-in, first-out manner,
wherein the neural network circuit reads the sensor data from the first FIFO memory, and
the readout control unit reads the line data from the second FIFO memory. - The light detection device according to claim 7, further comprising:
a digital processing unit that reads the sensor data from the first FIFO memory and processes the sensor data; and
a first format processing unit that generates a communication frame storing the sensor data. - The light detection device according to claim 8, further comprising a second format processing unit that generates a communication frame storing the line data,
wherein the readout control unit outputs the line data to the second format processing unit. - The light detection device according to claim 1, wherein the sensor is an EVS (Event-based Vision Sensor). - The light detection device according to claim 1, wherein the sensor is a photon counting circuit that counts photons. - The light detection device according to claim 1, wherein the sensor is a CIS (CMOS Image Sensor). - The light detection device according to claim 1, wherein the sensor, the neural network circuit, and the readout control unit are distributed across a plurality of stacked chips. - A method of controlling a light detection device, comprising:
a procedure in which a sensor reads out, from a pixel array unit, sensor data in which a plurality of pieces of pixel data are arrayed;
a procedure of processing the sensor data based on a neural network model and outputting line data in which a plurality of processing results are arrayed; and
a readout control procedure of generating a read control signal designating a pixel group to be read out in the pixel array unit based on the line data.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2024565637A JPWO2024135094A1 (ja) | 2022-12-23 | 2023-10-27 | |
| KR1020257022985A KR20250126027A (ko) | 2022-12-23 | 2023-10-27 | 광 검출 장치, 및 광 검출 장치의 제어 방법 |
| EP23906467.8A EP4642045A1 (en) | 2022-12-23 | 2023-10-27 | Photodetector device and photodetector device control method |
| CN202380086589.9A CN120380773A (zh) | 2022-12-23 | 2023-10-27 | 光检测装置及光检测装置的控制方法 |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2022-206201 | 2022-12-23 | ||
| JP2022206201 | 2022-12-23 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024135094A1 true WO2024135094A1 (ja) | 2024-06-27 |
Family
ID=91588498
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2023/038835 Ceased WO2024135094A1 (ja) | 2022-12-23 | 2023-10-27 | 光検出装置、および、光検出装置の制御方法 |
Country Status (6)
| Country | Link |
|---|---|
| EP (1) | EP4642045A1 (ja) |
| JP (1) | JPWO2024135094A1 (ja) |
| KR (1) | KR20250126027A (ja) |
| CN (1) | CN120380773A (ja) |
| TW (1) | TW202433738A (ja) |
| WO (1) | WO2024135094A1 (ja) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2021064882A (ja) * | 2019-10-15 | 2021-04-22 | キヤノン株式会社 | 認識装置、認識方法 |
| WO2021210389A1 (ja) * | 2020-04-14 | 2021-10-21 | ソニーグループ株式会社 | 物体認識システム及び電子機器 |
| WO2022209253A1 (ja) * | 2021-04-02 | 2022-10-06 | ソニーセミコンダクタソリューションズ株式会社 | センサ装置および半導体装置 |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11037968B2 (en) | 2019-04-05 | 2021-06-15 | Waymo Llc | Image sensor architecture |
-
2023
- 2023-10-27 EP EP23906467.8A patent/EP4642045A1/en active Pending
- 2023-10-27 KR KR1020257022985A patent/KR20250126027A/ko active Pending
- 2023-10-27 CN CN202380086589.9A patent/CN120380773A/zh active Pending
- 2023-10-27 JP JP2024565637A patent/JPWO2024135094A1/ja active Pending
- 2023-10-27 WO PCT/JP2023/038835 patent/WO2024135094A1/ja not_active Ceased
- 2023-11-16 TW TW112144206A patent/TW202433738A/zh unknown
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP4642045A1 |
Also Published As
| Publication number | Publication date |
|---|---|
| EP4642045A1 (en) | 2025-10-29 |
| TW202433738A (zh) | 2024-08-16 |
| CN120380773A (zh) | 2025-07-25 |
| KR20250126027A (ko) | 2025-08-22 |
| JPWO2024135094A1 (ja) | 2024-06-27 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN111615823B (zh) | 固态成像元件、成像装置以及固态成像元件的控制方法 | |
| US11582406B2 (en) | Solid-state image sensor and imaging device | |
| JP2020072317A (ja) | センサ及び制御方法 | |
| US11523079B2 (en) | Solid-state imaging element and imaging device | |
| WO2020170861A1 (ja) | イベント信号検出センサ及び制御方法 | |
| WO2020116185A1 (ja) | 固体撮像装置、信号処理チップ、および、電子機器 | |
| US11937001B2 (en) | Sensor and control method | |
| US20240236519A1 (en) | Imaging device, electronic device, and light detecting method | |
| WO2021117350A1 (ja) | 固体撮像素子、および、撮像装置 | |
| CN116057950A (zh) | 成像装置和成像方法 | |
| WO2020105301A1 (ja) | 固体撮像素子、および、撮像装置 | |
| WO2024135094A1 (ja) | 光検出装置、および、光検出装置の制御方法 | |
| WO2024135095A1 (ja) | 光検出装置、および、光検出装置の制御方法 | |
| JP2024090345A (ja) | 光検出装置、および、光検出装置の制御方法 | |
| WO2025062888A1 (ja) | 光検出素子及びシステム | |
| WO2024262161A1 (ja) | 固体撮像素子、および、撮像装置 | |
| WO2024199692A1 (en) | Sensor device and method for operating a sensor device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23906467 Country of ref document: EP Kind code of ref document: A1 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2024565637 Country of ref document: JP |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 202380086589.9 Country of ref document: CN |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 1020257022985 Country of ref document: KR |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2023906467 Country of ref document: EP |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| WWP | Wipo information: published in national office |
Ref document number: 202380086589.9 Country of ref document: CN |
|
| WWP | Wipo information: published in national office |
Ref document number: 1020257022985 Country of ref document: KR |
|
| WWP | Wipo information: published in national office |
Ref document number: 2023906467 Country of ref document: EP |