
WO2019176667A1 - Detection device and detection method - Google Patents

Detection device and detection method

Info

Publication number
WO2019176667A1
WO2019176667A1 (PCT/JP2019/008745; JP2019008745W)
Authority
WO
WIPO (PCT)
Prior art keywords
light
pixel
obstacle
waveform
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2019/008745
Other languages
English (en)
Japanese (ja)
Inventor
Yukiko Yanagawa (柳川 由紀子)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Omron Corp
Original Assignee
Omron Corp
Omron Tateisi Electronics Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Omron Corp, Omron Tateisi Electronics Co filed Critical Omron Corp
Publication of WO2019176667A1
Anticipated expiration: Critical
Current legal status: Ceased


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00: Measuring distances in line of sight; Optical rangefinders
    • G01C3/02: Details
    • G01C3/06: Use of electric means to obtain final indication
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02: Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06: Systems determining position data of a target
    • G01S17/08: Systems determining position data of a target for measuring distance only
    • G01S17/10: Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/93: Lidar systems specially adapted for specific applications for anti-collision purposes

Definitions

  • The present disclosure relates to a detection device and a detection method for detecting a height difference existing on a road surface.
  • Patent Document 1 discloses a step detection device that detects a step existing on a road surface.
  • The step detection device includes a laser radar and a microcomputer that measures the height at a step detection position based on distance-measurement data acquired in two positional relationships in which the distance from the laser radar to the step detection position differs.
  • When a reflector is present on the road surface, the step detection device determines that the reflector exists, excludes the height of the virtual image produced by the reflector, and then detects the step. This prevents a step from being erroneously detected when a reflector is present on the road surface.
  • Patent Document 2 discloses an obstacle detection device that detects an obstacle.
  • The obstacle detection device includes a transmission/reception unit that transmits a pulse wave and receives a reflected wave from the obstacle, and a microcomputer that detects the number of local maximum points in the amplitude of the received signal corresponding to the reflected wave.
  • The obstacle detection device determines whether the obstacle is higher than the transmission/reception unit based on the number of local maximum points. At this time, the obstacle detection device measures the distance to the obstacle based on the time difference between the start of transmission and the detection of the received signal, and determines, based on the measured distance, whether the detected local maximum points are correct.
  • In Patent Document 1, since the height is calculated based on the distance-measurement data, the height cannot be calculated accurately if the distance-measurement data are not correct.
  • In Patent Document 2, since whether the detection of the local maximum points is correct is judged based on the measured distance, the height of the obstacle cannot be determined accurately if the measured distance is not correct.
  • Conventional techniques such as Patent Document 1 and Patent Document 2 thus share the problem that an error occurs in the height determination when the distance cannot be measured accurately. For example, the prior art could not accurately detect the presence or absence of a height difference so small that its distance cannot be measured accurately.
  • An object of the present disclosure is to provide a detection device and a detection method that accurately detect the presence or absence of a height difference.
  • The detection device includes a light projecting unit that projects light, a light receiving unit that receives light incident from a predetermined angle region, and a controller that generates a received light waveform indicating the amount of received light according to the elapsed time after light is projected into the predetermined angle region and calculates a distance corresponding to the flight time of the light based on the received light waveform. The controller detects the presence or absence of a height difference within the predetermined angle region based on distortion of the received light waveform.
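The distance calculation above follows the standard time-of-flight relation: the measured flight time covers the round trip from the sensor to the target and back, so the one-way distance is half the flight time multiplied by the speed of light. A minimal sketch (the function name and units are illustrative, not from the patent):

```python
# Time-of-flight distance: d = c * t / 2, since the measured flight time
# covers the round trip from the sensor to the target and back.
C = 299_792_458.0  # speed of light in m/s

def distance_from_flight_time(t_seconds: float) -> float:
    """One-way distance in metres for a measured round-trip flight time."""
    return C * t_seconds / 2.0

# A pulse returning after 100 ns corresponds to roughly 15 m.
print(distance_from_flight_time(100e-9))
```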
  • The detection method includes a step of projecting light by a light projecting unit, a step of receiving light incident from a predetermined angle region by a light receiving unit, and a step in which a control unit generates a received light waveform indicating the amount of received light according to the elapsed time after light is projected into the predetermined angle region, calculates a distance corresponding to the flight time of the light, and detects the presence or absence of a height difference within the predetermined angle region based on distortion of the received light waveform.
  • With the detection apparatus and the detection method according to the present disclosure, the presence or absence of a height difference can be detected accurately. For example, it is possible to detect the presence or absence of a small obstacle that fits within the vertical angle of view of a single pixel.
  • FIG. 1: Diagram for explaining an application example of the detection apparatus according to the present disclosure
  • FIG. 2: Block diagram illustrating the configuration of the detection apparatus according to the first and second embodiments
  • FIG. 3: Diagram for explaining scanning by the sensor
  • FIG. 4A: Example of the distance image when an obstacle is on the road near the vehicle
  • FIG. 4B: Example of the distance image when an obstacle is on the road far from the vehicle
  • FIG. 5A: Schematic example of light projection from the sensor onto the road surface when no obstacle is present within the vertical angle of view of one pixel
  • FIG. 5B: Example of the received light waveform for one pixel in FIG. 5A
  • FIG. 6A: Schematic example of light projection from the sensor onto an obstacle and the road surface when an obstacle is present within the vertical angle of view of one pixel
  • FIG. 6B: Example of the received light waveform for one pixel in FIG. 6A
  • FIG. 7: Diagram for explaining the adjacent pixels to be compared
  • FIG. 8: Diagram for explaining the comparison of received light waveforms of horizontally adjacent pixels
  • FIG. 9: Flowchart showing an example of distance-image generation processing by the detection apparatus
  • FIG. 10: Flowchart showing an example of waveform analysis processing by the detection apparatus
  • FIG. 11: Flowchart showing an example of obstacle determination processing by the detection apparatus
  • FIG. 12A: Schematic example of pixels exhibiting a difference equal to or greater than the threshold relative to an adjacent pixel
  • FIG. 12B: Schematic example of the pixels containing the obstacle identified from the boundary pixels
  • FIG. 13: Schematic diagram of the neural network according to the second embodiment
  • FIG. 14: Flowchart illustrating an example of waveform analysis processing according to the second embodiment
  • FIG. 1 is a diagram for describing an application example of the detection apparatus 100 according to the present disclosure.
  • The detection device 100 is applicable, for example, to in-vehicle use.
  • The detection device 100 is mounted on a vehicle 200.
  • The detection device 100 according to the present embodiment is a LIDAR (Light Detection and Ranging, or Laser Imaging Detection and Ranging) device.
  • The detection device 100 detects the distance and direction to a detection target in the traveling direction of the vehicle 200.
  • The detection target is, for example, the road surface and an obstacle 300 that forms a step on the road.
  • The obstacle 300 is, for example, a curb or a fallen object.
  • The detection device 100 projects light in the traveling direction of the vehicle 200 and receives the reflected light from the detection target.
  • The detection device 100 measures the distance to the detection target in the traveling direction of the vehicle based on the time difference from light projection to light reception, and generates a distance image (also referred to as a frame image) based on the measured distances.
  • The detection apparatus 100 outputs, for example, information indicating the distance and direction to the obstacle 300 on the road (for example, a distance image) to the vehicle drive device 210.
  • The vehicle 200 is, for example, a self-driving vehicle and includes a vehicle drive device 210 for performing automatic driving.
  • The vehicle drive device 210 includes, for example, a steering mechanism that drives the vehicle 200 by setting the traveling direction so as to avoid the obstacle 300 on the road. Because the detection device 100 detects the obstacle 300, the vehicle drive device 210 can perform automatic driving while avoiding it.
  • The detection device 100 of the present disclosure aims to accurately detect an obstacle far from the vehicle 200. Specifically, it detects the presence or absence of a small height difference that fits within one pixel of the distance image.
  • Embodiment 1: The configuration and operation of the detection apparatus 100 according to Embodiment 1 are described below.
  • FIG. 2 is a block diagram illustrating the configuration of the detection apparatus 100.
  • FIG. 3 is a diagram for explaining scanning by the sensor 10.
  • The detection apparatus 100 includes a sensor 10, a control unit 20, and a storage unit 30.
  • The sensor 10 includes a light projecting unit 11 that projects light to the outside, a light receiving unit 12 that receives light from the outside, and a scanning unit 13.
  • The light projecting unit 11 emits a light beam to the outside under the control of the control unit 20.
  • The light projecting unit 11 includes, for example, a light source composed of one or more light source elements and a light source drive circuit that drives the light source in pulses.
  • The light source element is, for example, an LD (semiconductor laser) that emits laser light.
  • The light source element may instead be an LED or the like.
  • The light source elements are arranged in a line along the vertical direction Y shown in FIG. 3, and the light projecting unit 11 projects light toward the light projection region R11.
  • The light receiving unit 12 includes a plurality of light receiving elements. On receiving light, each light receiving element generates a light reception signal corresponding to the amount of received light.
  • The plurality of light receiving elements are arranged in a line along the vertical direction Y.
  • Each light receiving element corresponds, for example, to one pixel of the distance image and separately receives light incident from the range corresponding to the vertical angle of view of one pixel (an example of a predetermined angle region). That is, the light receiving unit 12 has a light receiving region corresponding to a plurality of pixels arranged along the vertical direction Y of the distance image, and generates a light reception signal for each pixel.
  • The light receiving element is composed of, for example, a SPAD (single-photon avalanche photodiode).
  • The light receiving element may instead be composed of a PD (photodiode) or an APD (avalanche photodiode).
  • The scanning unit 13 includes, for example, a mirror, a rotation mechanism that rotates the mirror around a rotation axis along the vertical direction Y, and a scan drive circuit that drives the rotation mechanism.
  • The scan drive circuit rotates the mirror under the control of the control unit 20. The scanning unit 13 thereby changes the light projection direction in small increments at regular time intervals, shifting the path along which the light travels step by step.
  • The scanning direction is the horizontal direction X shown in FIG. 3.
  • The scanning unit 13 shifts the light projection region R11 along the horizontal direction X.
  • The control unit 20 can be realized by semiconductor elements or the like.
  • The control unit 20 can be configured by, for example, a microcomputer, CPU, MPU, GPU, DSP, FPGA, or ASIC.
  • The functions of the control unit 20 may be implemented in hardware alone or by a combination of hardware and software.
  • The control unit 20 implements predetermined functions by reading the data and programs stored in the storage unit 30 and performing various arithmetic processes.
  • The control unit 20 includes a distance image generation unit 21 and an obstacle detection unit 22 as functional components.
  • The distance image generation unit 21 performs distance measurement while scanning the projection plane R1, which corresponds to the angle of view of the distance image, along the horizontal direction X, and generates a distance image.
  • The resolution of the distance image, that is, the angle of view per pixel, is, for example, 1.0 to 1.6 degrees in the horizontal direction X and 0.3 to 1.2 degrees in the vertical direction Y.
  • The distance image indicates the distance in the depth direction Z for each pixel arranged in the horizontal direction X and the vertical direction Y.
  • The distance image generation unit 21 outputs the generated distance image, for example, to the vehicle drive device 210.
  • The distance image generation unit 21 controls the timing of light projection by the light projecting unit 11.
  • The distance image generation unit 21 includes a light reception waveform generation unit 21a and a distance calculation unit 21b.
  • The light reception waveform generation unit 21a generates, for each pixel, received light waveform data indicating the amount of light received according to the elapsed time since light projection, based on the light projection timing and the light reception signal obtained from the light receiving unit 12.
  • The distance calculation unit 21b calculates a distance for each pixel based on the received light waveform. For example, the distance calculation unit 21b measures, from the received light waveform, the flight time of the light from when it is projected by the light projecting unit 11 until it is reflected and received by the light receiving unit 12. From the measured flight time, the distance calculation unit 21b calculates the distance to the detection target (for example, the road surface or an obstacle) that reflected the light. The distance calculation unit 21b then generates a distance image based on the distance measured for each pixel.
  • The obstacle detection unit 22 detects the presence or absence of a height difference in each pixel, that is, the presence or absence of an obstacle, based on distortion of the received light waveform.
  • The obstacle detection unit 22 includes a waveform analysis unit 22a and an obstacle determination unit 22b.
  • The waveform analysis unit 22a analyzes, for each pixel, the received light waveform generated by the light reception waveform generation unit 21a. Specifically, in the present embodiment, the waveform analysis unit 22a compares the difference between the received light waveforms of pixels adjacent in the horizontal direction X (an example of distortion of the received light waveform) with a threshold, and determines whether the adjacent pixels correspond to a boundary between an obstacle and the road surface.
  • Here, the "difference between received light waveforms" refers to the difference in the amount of received light at each elapsed time.
  • The obstacle determination unit 22b determines, based on the analysis result of the waveform analysis unit 22a, whether each pixel constituting the distance image includes an obstacle. For example, if the analysis result contains a pixel corresponding to a boundary, the obstacle determination unit 22b determines that an obstacle exists and identifies the pixels including the obstacle based on the boundary. When it determines that an obstacle exists, the obstacle determination unit 22b outputs information based on the determination result to the vehicle drive device 210; for example, it identifies the position or direction of the obstacle from the pixels including the obstacle and outputs information indicating that position or direction to the vehicle drive device 210.
  • The storage unit 30 is a storage medium that stores the programs and data necessary for realizing the functions of the detection apparatus 100.
  • The storage unit 30 can be realized by, for example, a hard disk (HDD), SSD, RAM, DRAM, ferroelectric memory, flash memory, magnetic disk, or a combination of these.
  • The storage unit 30 may temporarily store various kinds of information.
  • The storage unit 30 may be configured to function, for example, as a work area of the control unit 20.
  • The detection apparatus 100 may further include a communication unit with circuitry that communicates with external devices in accordance with a predetermined communication standard.
  • The predetermined communication standard includes, for example, LAN, Wi-Fi (registered trademark), Bluetooth (registered trademark), USB, and HDMI (registered trademark).
  • FIG. 4A shows an example of a distance image V1 when an obstacle 2 is on the road 1 near the sensor 10.
  • The size of the obstacle 2 shown in FIG. 4A in the vertical direction Y exceeds the length dy corresponding to the vertical angle of view of one pixel.
  • The obstacle 2 is closer to the sensor 10 than the road 1 by its height.
  • Therefore, the flight time of the light measured for the pixel P1 containing the obstacle 2 is shorter than that measured for the pixel P2 without the obstacle 2.
  • The distance to the obstacle 2 can thus be calculated by measuring the flight time of the light at the pixel P1, and the obstacle 2 appears in the distance image V1.
  • FIG. 4B shows an example of a distance image V2 when an obstacle 3 is on the road 1 far from the sensor 10.
  • The size of the obstacle 3 shown in FIG. 4B in the vertical direction Y is smaller than the length dy corresponding to the vertical angle of view of one pixel.
  • In this case, the obstacle 3 may not appear in the distance image V2 (the obstacle 3 indicated by the broken line in FIG. 4B does not appear in the distance image V2).
  • That is, the flight time of the light measured for the pixel P4 containing the obstacle 3 may be the same as that measured for the pixel P3 without the obstacle 3.
  • In that case, the distance to the surface of the road 1 is measured from the flight time of the light at the pixel P4, and the presence of the obstacle 3 is not detected.
  • With reference to FIG. 5A and FIG. 5B, the reason why an obstacle 3 whose size in the vertical direction Y is smaller than the length dy corresponding to the vertical angle of view of one pixel does not appear in the distance image V2 is described in detail.
  • FIG. 5A schematically shows an example of light projection from the sensor 10 onto the road surface when there is no obstacle within the range of the vertical angle of view θ of one pixel (an example of a predetermined angle region; the region corresponding to the pixel P3 in FIG. 4B).
  • FIG. 5B shows an example of the received light waveform for the pixel P3 corresponding to FIG. 5A.
  • The horizontal axis in FIG. 5B is time, and the vertical axis is the amount of received light.
  • FIG. 5B shows the amount of light received according to the elapsed time since the light was projected onto the region corresponding to the pixel P3.
  • The amount of received light reaches its maximum at the time t1 corresponding to the shortest distance dz1 from the sensor 10 to the road surface in the region corresponding to the pixel P3.
  • FIG. 6A schematically shows an example of light projection from the sensor 10 onto the obstacle and the road surface when the obstacle 3 is present within the range of the vertical angle of view θ of one pixel (an example of a predetermined angle region; the region corresponding to the pixel P4 in FIG. 4B).
  • FIG. 6B shows an example of the received light waveform for the pixel P4 corresponding to FIG. 6A.
  • The horizontal axis in FIG. 6B is time, and the vertical axis is the amount of received light.
  • FIG. 6B shows the amount of light received according to the elapsed time since the light was projected onto the region corresponding to the pixel P4.
  • The pixel P4 shown in FIG. 6A is on the same horizontal line as the pixel P3 shown in FIG. 5A.
  • The obstacle 3 is present on the road surface.
  • The size of the obstacle 3 is smaller than the range of the vertical angle of view θ of one pixel.
  • Even so, the amount of received light reaches its maximum at the same time t1 as when the obstacle 3 is absent (FIG. 5B). That is, the amount of light reflected from the road surface at the shortest distance dz1 from the sensor 10 in the region corresponding to the pixel P4 is larger than the amount of light reflected from the obstacle 3.
  • There are various methods for calculating the distance. For example, if the distance is calculated from the time t1 at which the amount of received light is maximum, the same distance value is obtained as when the obstacle 3 is absent.
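The peak-time method just mentioned can be sketched as follows for a sampled waveform. The sample period, array sizes, and pulse amplitudes below are illustrative assumptions, not values from the patent:

```python
import numpy as np

C = 299_792_458.0     # speed of light in m/s
SAMPLE_PERIOD = 1e-9  # assumed 1 ns sampling of the received light waveform

def distance_from_peak(waveform: np.ndarray) -> float:
    """Distance from the time t1 at which the amount of received light is maximum."""
    t1 = np.argmax(waveform) * SAMPLE_PERIOD
    return C * t1 / 2.0

# A waveform whose strongest return is at sample 100 (t1 = 100 ns) yields the
# road-surface distance, even when a weaker obstacle echo is also present:
# the peak method ignores the smaller peak, as the text describes.
wave = np.zeros(256)
wave[100] = 5.0  # strong road-surface return at t1
wave[60] = 2.0   # weaker obstacle return, ignored by the peak method
print(distance_from_peak(wave))
```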
  • In this case, the distance value for the pixel P4 is the distance from the sensor 10 to the road surface; the obstacle 3 within the range of the pixel P4 is not detected and does not appear in the distance image.
  • The area corresponding to the vertical angle of view θ of one pixel is narrower close to the sensor 10 and wider far from it. An obstacle farther from the sensor 10 is therefore more likely to fit within the vertical angle of view of one pixel, which makes distant obstacles difficult to detect.
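The geometry can be made concrete: a pixel with vertical angle of view θ covers a vertical span of roughly 2·d·tan(θ/2) at distance d. Using the 0.3 to 1.2 degree per-pixel range quoted earlier (the specific distances and the 0.6 degree value below are illustrative):

```python
import math

def pixel_footprint_m(distance_m: float, vertical_fov_deg: float) -> float:
    """Approximate vertical span covered by one pixel at a given distance."""
    return 2.0 * distance_m * math.tan(math.radians(vertical_fov_deg) / 2.0)

# With a 0.6-degree pixel, the footprint grows from about 5 cm at 5 m to
# about 84 cm at 80 m, so a 10 cm curb spans several pixels up close but
# fits entirely inside one pixel far away.
for d in (5.0, 20.0, 80.0):
    print(d, pixel_footprint_m(d, 0.6))
```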
  • The detection device 100 of the present disclosure aims to detect such a small obstacle that fits within the range of the vertical angle of view θ of one pixel. As shown in FIGS. 5B and 6B, the shape of the received light waveform differs depending on whether the obstacle 3 is present. Therefore, the detection apparatus 100 according to the present embodiment compares the received light waveforms of two pixels adjacent in the horizontal direction X.
  • FIG. 7 is a diagram for explaining adjacent pixels to be compared.
  • The detection device 100 compares the received light waveforms of pixels adjacent in the horizontal direction X: the pixel P11 with the pixel P12, the pixel P12 with the pixel P13, the pixel P13 with the pixel P14, and so on.
  • FIG. 8A shows an example in which the received light waveform W11 of the pixel P11 is compared with the received light waveform W12 of the adjacent pixel P12.
  • FIG. 8B illustrates a waveform indicating the absolute value of the difference between the received light waveforms W11 and W12.
  • As described above, the shape of the received light waveform differs depending on whether the obstacle 3 is present. Therefore, in this embodiment, whether a pixel includes the obstacle 3 is determined based on this difference in waveform shape.
  • As shown in FIG. 8B, when the absolute value of the difference between the received light waveforms is equal to or greater than a predetermined threshold, the adjacent pixels are determined to indicate a boundary portion. For example, the difference in the amount of received light is calculated at each identical elapsed time since light projection, and it is determined whether a difference equal to or greater than the threshold exists.
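The comparison can be sketched as follows for sampled waveforms. The Gaussian pulse shapes and the threshold value are illustrative assumptions; only the sample-by-sample absolute difference against a threshold comes from the text:

```python
import numpy as np

def is_boundary(wave_a, wave_b, threshold=0.5):
    """Flag a boundary if any same-time sample difference reaches the threshold."""
    diff = np.abs(np.asarray(wave_a) - np.asarray(wave_b))
    return bool(np.max(diff) >= threshold)

t = np.arange(200)
# Road-only pixel: a single road-surface return.
road_only = np.exp(-0.5 * ((t - 100) / 5.0) ** 2)
# Adjacent pixel containing an obstacle: an additional earlier return.
with_obstacle = road_only + 0.8 * np.exp(-0.5 * ((t - 60) / 3.0) ** 2)

print(is_boundary(road_only, with_obstacle))  # the extra echo exceeds the threshold
print(is_boundary(road_only, road_only))      # identical waveforms: no boundary
```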
  • The sign of the difference changes according to the reflectance. Since reflectance varies with material, if the obstacle 3 is white, for example, its reflectance is higher than that of the road 1, and if the obstacle 3 is black, its reflectance is lower than that of the road 1. When the obstacle 3 contains a material with a higher reflectance than the road 1, the amount of light reflected from the obstacle 3 is larger than the amount reflected from the road 1 at the same position in the horizontal direction X.
  • In that case, the difference obtained by subtracting the received light amount of the pixel P11 (road 1 only) from that of the pixel P12 (including the obstacle 3) is positive, and the difference obtained by subtracting the received light amount of the pixel P15 (including the obstacle 3) from that of the pixel P16 (road 1 only) is negative.
  • FIG. 5B, FIG. 6B, and FIG. 8A illustrate the case where the obstacle 3 contains a material with a higher reflectance than the road 1.
  • Conversely, when the obstacle 3 contains a material with a lower reflectance than the road 1, the amount of light reflected from the obstacle 3 is smaller than the amount reflected from the road 1 at the same position in the horizontal direction X.
  • In that case, the difference obtained by subtracting the received light amount of the pixel P11 (road 1 only) from that of the pixel P12 (including the obstacle 3) is negative, and the difference obtained by subtracting the received light amount of the pixel P15 (including the obstacle 3) from that of the pixel P16 (road 1 only) is positive.
  • In either case, the boundary between the obstacle 3 and the road 1 can be identified by comparing the absolute value of the difference between the received light waveforms with a predetermined threshold.
  • FIG. 9 is a flowchart illustrating the operation of the distance image generation unit 21.
  • The distance image generation unit 21 causes the light projecting unit 11 to project light at a predetermined light projection timing (S101).
  • The light reception waveform generation unit 21a acquires from the light receiving unit 12 a light reception signal corresponding to the amount of received light (S102). Based on the light projection timing and the light reception signal, the light reception waveform generation unit 21a generates received light waveform data indicating the amount of received light according to the elapsed time since light projection (S103).
  • The distance calculation unit 21b calculates the distance corresponding to the flight time of the light based on the received light waveform (S104), for example from the time at which the amount of received light is largest. The generation of the received light waveform in step S103 and the calculation of the distance in step S104 are performed for each pixel. The distance image generation unit 21 then determines whether the distance calculation for one column of pixels is complete (S105); if not, the waveform generation and distance calculation are repeated. When the distance calculation for one column is complete, the distance image generation unit 21 controls the scanning unit 13 to shift the light projection region R11 in the horizontal direction X, the scanning direction (S106), and returns to step S101. When the distances for one frame have been calculated, the distance image generation unit 21 generates a distance image (frame image) for the frame from the per-pixel distances.
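The flow S101 to S106 can be sketched as a nested loop over columns and rows. The sensor interface below is a hypothetical stand-in for the hardware, not the patent's API; the fake sensor simply encodes each target distance as the peak sample of a waveform:

```python
import numpy as np

C = 299_792_458.0     # speed of light in m/s
SAMPLE_PERIOD = 1e-9  # assumed 1 ns sampling

def distance_from(waveform):
    """S104: distance from the time of the maximum amount of received light."""
    return C * np.argmax(waveform) * SAMPLE_PERIOD / 2.0

class FakeSensor:
    """Hypothetical stand-in: returns a waveform whose peak encodes the distance."""
    def __init__(self, peak_samples):
        self.peaks = peak_samples  # peaks[column][row]
        self.col = 0
    def project_light(self, row):      # S101: trigger a pulse
        pass
    def read_light_signal(self, row):  # S102: sampled light reception signal
        wave = np.zeros(256)
        wave[self.peaks[self.col][row]] = 1.0
        return wave
    def shift_scan(self):              # S106: shift the projection region R11
        self.col += 1

def generate_distance_frame(sensor, n_columns, n_rows):
    frame = []
    for _ in range(n_columns):
        column = []
        for row in range(n_rows):
            sensor.project_light(row)
            waveform = np.asarray(sensor.read_light_signal(row))  # S103
            column.append(distance_from(waveform))                # S104
        frame.append(column)  # S105: one column finished
        sensor.shift_scan()
    return frame  # one frame of per-pixel distances

frame = generate_distance_frame(FakeSensor([[100, 120], [100, 100]]), 2, 2)
```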
  • FIG. 10 is a flowchart illustrating the operation of the waveform analysis unit 22a.
  • FIG. 11 is a flowchart illustrating the operation of the obstacle determination unit 22b.
  • FIG. 12A schematically illustrates, by hatching, an example of pixels exhibiting a difference equal to or greater than the threshold relative to an adjacent pixel.
  • FIG. 12B schematically shows, by hatching, the region of the obstacle 3 identified based on the pixels P12 and P16 in which the difference occurred.
  • FIG. 10 shows the operation for one frame.
  • The waveform analysis unit 22a acquires received light waveform data from the light reception waveform generation unit 21a (S201).
  • The waveform analysis unit 22a calculates the difference between the received light waveforms of pixels adjacent in the horizontal direction X (S202).
  • The waveform analysis unit 22a determines whether the absolute value of the difference is equal to or greater than the threshold (S203). If so, the waveform analysis unit 22a determines that the adjacent pixels form a boundary between the road surface and an obstacle (S204). If the absolute value of the difference is less than the threshold, the waveform analysis unit 22a determines that the adjacent pixels are not a boundary portion (S205).
  • The waveform analysis unit 22a then determines whether the analysis of all pixels in the frame is complete (S206); if not, it returns to step S201.
  • The operation shown in FIG. 11 is performed after the operation of the waveform analysis unit 22a shown in FIG. 10.
  • The obstacle determination unit 22b determines whether the analysis result of the waveform analysis unit 22a contains a pixel determined to be a boundary in the frame (S301). If so, the obstacle region is identified based on the boundary (S302). If there is no pixel determined to be a boundary, it is determined that there is no obstacle in the frame (S303).
  • The obstacle determination unit 22b outputs the determination result. For example, when an obstacle exists, it outputs information indicating the position or direction to the vehicle drive device 210 based on the coordinates of the pixels identified as the obstacle.
  • the hatching of the pixel P12 illustrated in FIG. 12A indicates that the light reception waveform of the pixel P12 is different from the light reception waveform of the pixel P11.
  • the pixel P11 and the pixel P12 are determined to correspond to the boundary between the road surface and the obstacle, that is, to the pixel without the obstacle and the pixel with the obstacle, respectively.
  • the hatching of the pixel P16 illustrated in FIG. 12A indicates that a difference occurs in the light reception waveform of the pixel P16 with respect to the light reception waveform of the pixel P15.
  • since the difference between the received light waveforms of the adjacent pixels P15 and P16 is greater than or equal to the threshold value, in step S204 the pixel P15 and the pixel P16 are determined to correspond to the boundary between the obstacle and the road surface, that is, to the pixel with the obstacle and the pixel without the obstacle, respectively. In step S302, based on the pixels P12 and P16 in which a difference from the adjacent pixel occurred, the region from the pixel P12 to the pixel P15 is identified as the region where the obstacle 3 exists, as shown in FIG. 12B.
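The region identification of step S302 — recovering the obstacle span P12..P15 from the difference pixels P12 and P16 — might look like the sketch below. It assumes a single obstacle per scan row; pixel indices stand in for the patent's pixel names (index 0 corresponds to P11).

```python
def obstacle_region(flags):
    """flags: per-adjacent-pair boundary flags, where pair i covers pixels
    i and i+1. Return the (start, end) pixel indices of the obstacle
    region, or None if no boundary pair was found (S301/S303).
    Assumes one obstacle bounded by two boundary pairs, as in FIG. 12B."""
    idx = [i for i, f in enumerate(flags) if f]
    if not idx:
        return None          # S303: no obstacle in the frame
    start = idx[0] + 1       # right pixel of the first boundary pair
    end = idx[-1]            # left pixel of the last boundary pair
    return (start, end)      # S302

# Flags for P11..P16 with boundaries at P11/P12 and P15/P16:
print(obstacle_region([True, False, False, False, True]))
# → (1, 4), i.e. the region from P12 to P15
```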
  • the detection apparatus 100 includes a light projecting unit 11, a light receiving unit 12, and a scanning unit 13.
  • the light projecting unit 11 projects light.
  • the light receiving unit 12 receives light incident from a predetermined angle region.
  • the control unit 20 calculates a distance corresponding to the flight time of light based on a received light waveform indicating an amount of received light according to an elapsed time after light projection in a predetermined angle region. Further, the control unit 20 detects the presence / absence of a height difference within a predetermined angle region based on the distortion of the received light waveform.
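The distance corresponding to the flight time of light can be illustrated with a minimal time-of-flight computation. This is a sketch only: the peak-bin method and the time-bin width are assumptions for the example — the patent does not specify how the control unit 20 extracts the flight time from the received light waveform.

```python
C = 299_792_458.0  # speed of light, m/s

def distance_from_waveform(waveform, bin_width_s):
    """waveform: received-light amount per elapsed-time bin since light
    projection. Take the bin with the peak amount as the round-trip
    flight time and return the corresponding distance in meters."""
    peak_bin = max(range(len(waveform)), key=waveform.__getitem__)
    t_flight = peak_bin * bin_width_s   # round-trip time of flight
    return C * t_flight / 2.0           # halve for the one-way distance

# A peak in bin 100 with 1 ns bins -> 100 ns round trip -> about 15 m.
wf = [0.0] * 200
wf[100] = 1.0
print(round(distance_from_waveform(wf, 1e-9), 3))  # → 14.99
```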
  • the light receiving unit 12 separately receives light incident from a plurality of predetermined angle regions corresponding to a plurality of pixels.
  • the predetermined angle region is a range corresponding to the vertical angle of view θ for one pixel.
  • the control unit 20 calculates the distance for each pixel based on the light reception waveform for each pixel.
  • the control unit 20 detects whether there is a height difference for each pixel based on the distortion of the received light waveform.
  • the distortion of the light reception waveform is a difference between the light reception waveforms of pixels adjacent in the horizontal direction.
  • the detection apparatus 100 detects a pixel having a difference in height, for example, a boundary portion between a road surface and an obstacle, based on the difference in received light waveform for each pixel. Therefore, even an obstacle small enough to fit within the vertical angle of view θ of one pixel can be detected.
  • the detection device 100 is mounted on the vehicle 200 and detects the presence or absence of an obstacle that forms a step on the road as the presence or absence of a height difference. Thereby, for example, even an obstacle far from the vehicle 200 can be detected with high accuracy.
  • the difference between the light reception waveforms of pixels located at the same position in two different frame images may be calculated.
  • the difference between the received light waveforms of the pixels at the same position (coordinates) in the previous frame image and the current frame image may be calculated.
  • when no obstacle appears between the frames, the received light waveforms have the same shape. Therefore, by comparing the light reception waveforms of pixels at the same position in different frame images, the presence or absence of an obstacle can be detected from the difference in the light reception waveforms.
  • Embodiment 2. In the first embodiment, whether or not a pixel is a boundary portion is determined by comparing the difference between the received light waveforms with a threshold value. In the second embodiment, machine learning is used to determine whether the difference between the received light waveforms indicates a boundary portion. In the present embodiment, for example, deep learning is used.
  • FIG. 13 schematically shows an example of a neural network.
  • the boundary detection network that detects whether or not the difference between the received light waveforms indicates the boundary is constructed by a neural network having a multilayer structure used for deep learning.
  • the boundary detection network includes, in order from the input, a first fully connected layer L1 that is an input layer, a second fully connected layer L2 that is an intermediate layer, and an output layer L3.
  • the second fully connected layer L2 serves as a single intermediate layer here, but two or more intermediate layers may be included.
  • Each layer L1 to L3 includes a plurality of nodes. For example, the received light amount corresponding to each elapsed time since light projection for one pixel is input to the nodes IN1, IN2, IN3, ... of the first fully connected layer L1, which is the input layer.
  • the number of nodes of the second fully connected layer L2 can be set as appropriate according to the embodiment. The nodes OUT1 and OUT2 of the output layer L3 output a result indicating whether or not the input corresponds to the boundary portion between the obstacle and the road surface.
  • the boundary determination network is trained on difference data of received light waveforms that indicate a boundary and difference data of received light waveforms that do not.
  • the learned boundary determination network is stored in the storage unit 30, for example.
  • the waveform analysis unit 22a uses this boundary determination network to determine whether or not the difference between the received light waveforms indicates the boundary.
  • FIG. 14 is a flowchart showing the operation of the waveform analysis unit 22a in the second embodiment.
  • steps S401, S402, and S404 are the same as steps S201, S202, and S206 of the first embodiment.
  • the waveform analyzer 22a inputs the difference between the received light waveforms of adjacent pixels to a learned boundary determination network (neural network) and acquires a determination result as to whether or not the boundary is present (S403).
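A forward pass through a network shaped like FIG. 13 — input layer L1, one hidden (fully connected) layer L2, and a two-node output layer L3 — can be sketched as below. The weights are illustrative toy values, not learned parameters; in practice the network is trained on labeled waveform-difference data as described above, and the activation and output functions are assumptions.

```python
import math

def relu(v):
    return [max(0.0, x) for x in v]

def dense(v, weights, bias):
    # One fully connected layer: each output node is a weighted sum + bias.
    return [sum(w * x for w, x in zip(row, v)) + b
            for row, b in zip(weights, bias)]

def softmax(v):
    m = max(v)
    e = [math.exp(x - m) for x in v]
    s = sum(e)
    return [x / s for x in e]

def is_boundary(diff, w1, b1, w2, b2):
    """Forward pass L1 -> L2 -> L3 over a waveform-difference vector;
    True if the 'boundary' output node outscores the 'not boundary' node."""
    hidden = relu(dense(diff, w1, b1))
    out = softmax(dense(hidden, w2, b2))
    return out[0] > out[1]

# Toy weights: the first hidden unit responds to a net positive difference.
w1 = [[1.0, 1.0, 1.0, 1.0], [-1.0, -1.0, -1.0, -1.0]]
b1 = [0.0, 0.0]
w2 = [[2.0, 2.0], [-2.0, -2.0]]  # rows: boundary node, non-boundary node
b2 = [0.0, 0.0]
print(is_boundary([4.0, -4.0, 1.0, 0.0], w1, b1, w2, b2))  # → True
```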
  • instead of training the neural network on the difference of the received light waveforms, an obstacle determination network configured as a neural network as shown in FIG. 13 may be constructed, for example by using the received light waveform of a pixel including the obstacle and the received light waveform of a pixel on the road surface.
  • the obstacle detection unit 22 may output information indicating the position or orientation of the obstacle to the vehicle drive device 210 based on the coordinates of the pixel determined to be an obstacle.
  • the control unit 20 may detect the presence or absence of a height difference for each pixel using a neural network that has learned the presence or absence of a height difference such as an obstacle and a road surface.
  • in the first and second embodiments, the obstacle detection unit 22 determines whether an obstacle is included in a pixel based on the distortion of the received light waveform alone. In the third embodiment, the distance value is additionally used to determine whether or not the object is an obstacle.
  • FIG. 15 is a block diagram illustrating the configuration of the detection apparatus 100 according to the third embodiment.
  • the obstacle determination unit 22b determines whether or not a pixel includes an obstacle based on the boundary determination result by the waveform analysis unit 22a and the distance value calculated by the distance calculation unit 21b. For example, it may verify whether the determination of the boundary portion is correct by referring to the distance values of the pixel determined as the boundary portion and the surrounding pixels. As a result, the presence or absence of a small obstacle that falls within the vertical angle of view of one pixel can be detected more accurately.
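The distance-based verification of the third embodiment might be sketched as follows. The distance-jump threshold and the sample distance values are illustrative assumptions; the patent only states that the distance values of the boundary pixel and its surroundings are referred to.

```python
def verify_boundary(distances, pair_index, min_jump_m):
    """pair_index i refers to adjacent pixels i and i+1 flagged as a
    boundary by the waveform analysis. Confirm the boundary only if their
    calculated distances also differ by at least min_jump_m meters."""
    return abs(distances[pair_index + 1] - distances[pair_index]) >= min_jump_m

# An obstacle rising from the road surface returns echoes slightly nearer
# than the surrounding road pixels.
dists = [20.0, 19.4, 19.4, 19.4, 19.4, 20.0]   # per-pixel distances, P11..P16
print(verify_boundary(dists, 0, 0.5))  # → True: 0.6 m jump at P11/P12
print(verify_boundary(dists, 1, 0.5))  # → False: P12/P13 on the same obstacle
```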
  • the scanning unit 13 may include a mechanism that changes the light projection direction in the vertical direction Y, and may scan in the vertical direction Y.
  • the scanning unit 13 may be omitted.
  • when the light source elements of the light projecting unit 11 and the light receiving elements of the light receiving unit 12 are arranged in a two-dimensional array, distance measurement similar to that of the first embodiment can be performed without using the scanning unit 13.
  • in the above embodiments, the detection apparatus 100 scans the projection plane R1 corresponding to a plurality of pixels, calculates the distance of each pixel, and generates a distance image; however, light projection and light reception may be performed in a region corresponding to only one pixel.
  • the detection apparatus 100 may include, for example, one light source element and one light receiving element, and may determine the presence or absence of an obstacle in the pixel from the light reception waveform of the region corresponding to that single pixel by the above-described obstacle determination network.
  • the detection device 100 detects an obstacle such as a curb that forms a step on the road, but the detection target is not limited to an obstacle on the road.
  • with the detection device 100 of the present disclosure, an object having a height difference can be detected. For example, a step caused by a depression in the road can be detected.
  • the detection apparatus 100 according to the present disclosure is applicable not only to in-vehicle use but also to various other applications.
  • the detection apparatus 100 may be installed in a factory and used to measure the distance to a component. In this case, for example, whether or not the component has a scratch can be determined by detecting the presence or absence of a height difference in the pixel with the detection device 100.
  • the distance calculation unit 21b is not limited to the distance image, and may generate information indicating the distance in various formats.
  • the distance calculation unit 21b may generate three-dimensional point group data.
  • the obstacle detection unit 22 may include the light reception waveform generation unit 21a.
  • the distance image generation unit 21 and the obstacle detection unit 22 may each include a light reception waveform generation unit 21a.
  • the detection apparatus 100 includes the distance calculation unit 21b, but the distance calculation unit 21b may not be provided.
  • the detection apparatus 100 may include the sensor 10, the light reception waveform generation unit 21a, and the obstacle detection unit 22, and may detect the presence or absence of a height difference in the pixel from the distortion of the light reception waveform.
  • the detection device includes a light projecting unit (11) that projects light, a light receiving unit (12) that receives light incident from a predetermined angle region, and a control unit (20) that calculates a distance corresponding to the flight time of light based on a received light waveform indicating the amount of received light according to the elapsed time since light projection in the predetermined angle region. The control unit (20) detects the presence or absence of a height difference in the predetermined angle region based on the distortion of the received light waveform.
  • the light receiving unit (12) separately receives light incident from a plurality of predetermined angle regions corresponding to a plurality of pixels, and the control unit (20) calculates the distance for each pixel based on the light reception waveform for each pixel and detects the presence or absence of a height difference for each pixel based on the distortion of the light reception waveform.
  • the distortion of the light reception waveform is a difference between the light reception waveforms of pixels adjacent in the horizontal direction.
  • control unit detects the presence or absence of a height difference for each pixel using a neural network that has learned the presence or absence of a height difference.
  • control unit detects the presence or absence of a height difference for each pixel based on the distortion of the light reception waveform and the distance.
  • in the detection device according to the second aspect, the control unit generates a frame image including the plurality of pixels according to the distance for each pixel, and the distortion of the received light waveform is the difference between the received light waveforms of the pixels at the same position in two different frame images.
  • the detection device is mounted on a vehicle, and the presence or absence of the height difference is the presence or absence of a step or an obstacle on the road.
  • the detection method includes a step of projecting light by the light projecting unit (11) (S101), a step of receiving light incident from a predetermined angle region by the light receiving unit (12) (S102), and a step of calculating, by the control unit (20), a distance corresponding to the flight time of light based on the received light waveform and detecting the presence or absence of a height difference in the predetermined angle region based on the distortion of the received light waveform.
  • the detection device of the present disclosure can be applied to, for example, an autonomous driving vehicle, a self-propelled robot, and AGV (Automated Guided Vehicle).


Abstract

The present invention relates to a detection device (100) provided with a light projecting unit (11) that projects light, a light receiving unit (12) that receives light incident from a prescribed angular region, and a control unit (20) that calculates a distance corresponding to a time of flight of light on the basis of a received-light waveform indicating a quantity of received light corresponding to the time elapsed since light projection in the prescribed angular region, the control unit (20) detecting the presence or absence of a height difference in the prescribed angular region on the basis of distortion of the received-light waveform.
PCT/JP2019/008745 2018-03-14 2019-03-06 Dispositif et procédé de détection Ceased WO2019176667A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018047291A JP6812997B2 (ja) 2018-03-14 2018-03-14 検出装置及び検出方法
JP2018-047291 2018-03-14

Publications (1)

Publication Number Publication Date
WO2019176667A1 true WO2019176667A1 (fr) 2019-09-19

Family

ID=67908174

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/008745 Ceased WO2019176667A1 (fr) 2018-03-14 2019-03-06 Dispositif et procédé de détection

Country Status (2)

Country Link
JP (1) JP6812997B2 (fr)
WO (1) WO2019176667A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003042757A (ja) * 2001-07-31 2003-02-13 Omron Corp 車両用測距装置
EP1286178A2 (fr) * 2001-08-23 2003-02-26 IBEO Automobile Sensor GmbH Procédé de détection optique du sol
JP2016044969A (ja) * 2014-08-19 2016-04-04 株式会社デンソー 車載レーダ装置
JP2017032329A (ja) * 2015-07-30 2017-02-09 シャープ株式会社 障害物判定装置、移動体、及び障害物判定方法

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003042757A (ja) * 2001-07-31 2003-02-13 Omron Corp 車両用測距装置
EP1286178A2 (fr) * 2001-08-23 2003-02-26 IBEO Automobile Sensor GmbH Procédé de détection optique du sol
JP2016044969A (ja) * 2014-08-19 2016-04-04 株式会社デンソー 車載レーダ装置
JP2017032329A (ja) * 2015-07-30 2017-02-09 シャープ株式会社 障害物判定装置、移動体、及び障害物判定方法

Also Published As

Publication number Publication date
JP6812997B2 (ja) 2021-01-13
JP2019158686A (ja) 2019-09-19

Similar Documents

Publication Publication Date Title
CN111742241B (zh) 光测距装置
EP3187895B1 (fr) Système de radar de vol à résolution variable
US9207074B2 (en) Distance measurement apparatus, and distance measurement method
US10698092B2 (en) Angle calibration in light detection and ranging system
US9134117B2 (en) Distance measuring system and distance measuring method
JP2015179078A (ja) 視差演算システム及び距離測定装置
CN108845332B (zh) 基于tof模组的深度信息测量方法及装置
CN109444916A (zh) 一种无人驾驶可行驶区域确定装置及方法
JP2018155709A (ja) 位置姿勢推定装置および位置姿勢推定方法、運転支援装置
JP6186863B2 (ja) 測距装置及びプログラム
JP2016065842A (ja) 障害物判定装置および障害物判定方法
US20250271556A1 (en) Object detection device and object detection method
CN118786358A (zh) 用于具有自适应光晕校正的固态LiDAR的系统和方法
JP4691701B2 (ja) 人数検出装置及び方法
US11921216B2 (en) Electronic apparatus and method for controlling thereof
US20240264286A1 (en) Control method and apparatus, lidar, and terminal device
CN110554398B (zh) 一种激光雷达及探测方法
JP6812997B2 (ja) 検出装置及び検出方法
US12146764B2 (en) Ranging method and range finder
JP2008180646A (ja) 形状測定装置および形状測定方法
US20220319025A1 (en) Output control device, distance measuring device comprising the same, output control method, and output control program
US20250306209A1 (en) Lidar sensor for vehicles
US20230066857A1 (en) Dynamic laser emission control in light detection and ranging (lidar) systems
CN109959939A (zh) 基于激光扫描的对象跟踪方法及装置
WO2023181308A1 (fr) Système, procédé et programme informatique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19767059

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19767059

Country of ref document: EP

Kind code of ref document: A1