TWI876338B - Lidar system and crosstalk reduction method thereof - Google Patents
- Publication number
- TWI876338B (application TW112117043A)
- Authority
- TW
- Taiwan
- Prior art keywords
- reflected light
- distance value
- light signal
- subframe
- microcontroller
- Prior art date
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
- G01S17/10—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/483—Details of pulse systems
- G01S7/486—Receivers
- G01S7/487—Extracting wanted echo signals, e.g. pulse detection
- G01S7/4876—Extracting wanted echo signals, e.g. pulse detection by removing unwanted signals
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Electromagnetism (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
Description
The present invention relates to a lidar system, and in particular to a lidar system capable of eliminating external light interference.
In recent years, LiDAR (Light Detection and Ranging) technology has been widely used in autonomous/semi-autonomous driving and safety warning systems for automobiles. The main components of a LiDAR include sensors (such as direct time-of-flight (D-ToF) sensors), a laser light source, a scanner, and a data processor. Current lidar scanning methods take many forms: projecting small blocks of light spots with an optical phased array (OPA) or a diffractive optical element (DOE); scanning a large block in a serpentine back-and-forth or oblique pattern with a micro-electro-mechanical-system (MEMS) micromirror or a polygon mirror; or projecting a line beam generated by a DOE, a linear array of light sources, or multi-reflection beam expansion, and sweeping it laterally across a large block by mechanical rotation. Through these scanning methods, the sensor receives the reflected light signals.
However, laser sensing performed with the above scanning methods has a small aspect ratio (screen ratio), so the reflected light signal must be received continuously at a relatively high frequency. If the sensor also receives light from other sources (such as crosstalk from interfering light sources or ambient background light), the data processor is prone to misjudging the distance, compromising driving safety. There is therefore an urgent need for a lidar system that can effectively filter out the interfering and background light sources in the received light signal, so as to judge distance correctly and maintain driving safety. There is likewise an urgent need for an external-light-interference elimination method that enables a lidar system to do the same.
The main purpose of the present invention is to provide a lidar system that can effectively filter out interfering and background light sources in the received light signal, so as to judge distance correctly and maintain driving safety.
To achieve the foregoing object, the present invention provides a lidar system comprising: a microcontroller; a laser light source coupled to the microcontroller; a lens module; and a receiver coupled to the microcontroller. The lens module includes a laser beam-splitter module and a receiver lens module. The laser beam-splitter module receives laser light emitted by the laser light source and diffracts it into multiple diffracted beams, which are emitted toward a target. The receiver lens module receives the reflected light signal produced when the diffracted light strikes the target, and passes that signal to the receiver. The laser light source emits a pulse signal with a cycle time. The microcontroller controls the receiver to open during a sensor shutter time and close during a reset time within each cycle time. One frame of the lidar system comprises a plurality of subframes, each captured during one cycle time. Each subframe is an environmental image comprising a plurality of sampling blocks; each sampling block comprises a plurality of pixels, and each reflected light signal yields a distance value at each pixel according to its time of flight. The microcontroller averages the distance values of those pixels to obtain an average distance value representing the sampling block. The microcontroller then compares the average distance values of the subframes of the frame on the same sampling block, eliminates subframes with abnormal average distance values, and fuses the remaining subframes with similar average distance values into a final distance value for the frame.
To achieve the foregoing object, the present invention also provides an external-light-interference elimination method applicable to the above lidar system. The method includes: sampling a plurality of sampling blocks in an environmental image, each sampling block comprising a plurality of pixels, wherein the total number of pixels in the sampling blocks is no more than 10% of the number of pixels in the environmental image and the number of sampling blocks is at least five; for the same sampling block, acquiring a reflected light signal during each sensor shutter time of the plurality of subframes of a frame, and calculating from the time of flight of the reflected light signal the distance value represented by each pixel in the sampling block; averaging the distance values of all pixels in the sampling block to obtain the average distance value of the sampling block; among the subframes, for the sampling blocks at the same positions, eliminating subframes whose average distance values differ significantly; and fusing the distance values of the environmental images of the subframes that were not eliminated into the final distance value of the frame.
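The per-pixel distance computation and per-block averaging described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function names are my own.

```python
C = 3.0e8  # speed of light, m/s


def pixel_distance(tof_s):
    """Distance value of one pixel from its round-trip time of flight: d = c * t / 2."""
    return C * tof_s / 2.0


def block_average_distance(tofs_s):
    """Average distance value of a sampling block, from the ToFs of its pixels."""
    distances = [pixel_distance(t) for t in tofs_s]
    return sum(distances) / len(distances)


# A 2 us round trip corresponds to 300 m.
print(pixel_distance(2e-6))                          # 300.0
print(block_average_distance([2e-6, 1.9e-6, 2.1e-6]))
```

The average distance value of each block, computed this way per subframe, is what the microcontroller later compares across subframes.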
The effect of the present invention is that, by projecting a large-area light spot pattern through a diffractive optical element, a large-area image can be acquired after only one or a few pulse scans, without back-and-forth scanning, thereby greatly increasing the frame rate and effectively eliminating the impact of external light interference.
100: LiDAR system
101: Microcontroller
102: Laser light source
104: Laser light
106: Lens module
108: Receiver lens module
110: Laser beam-splitter module
112: Receiver
120: Target
122: Image field
124: Field of view
126: Reflected light
202: Concave lens
204: Convex lens
206: Diffractive optical element
208: Concave mirror
210: Collimator lens set
302: Laser light
304: Diffractive optical element
306a, 306b, 306c: Point clouds
502: Collimator lens set
5021: Concave lens
5022: Convex lens
504: Diffractive optical element
506: Laser light
508: Diffracted light
512: Collimator lens set
5121: Concave lens
5122: Convex lens
514: Diffractive optical element
516: Laser light
518: Diffracted light
520: Concave mirror
PW: Pulse width
T: Cycle time
SS: Sensor shutter time
R: Reset time
Ts: Start time
Tl: End time
801, 802, 803, 804: Situations
900: Method
902, 904, 906, 908, 910, 912, 914, 916: Steps
1000: Method
1002, 1004, 1006, 1008, 1010: Steps
A, B, C, D, E: Sampling blocks
FIG. 1 is a schematic diagram of the lidar system of the present invention; FIG. 2 is a schematic diagram of the internal structure of some components shown in FIG. 1; FIG. 3 shows the operation of a diffractive optical element; FIG. 4 shows the operation of the present invention at different distances; FIG. 5A shows a collimator lens set configuration according to the present invention; FIG. 5B shows another collimator lens set configuration according to the present invention; FIG. 6 is an example timing diagram according to the present invention; FIG. 7 is another example timing diagram according to the present invention; FIG. 8 shows example timing diagrams of the present invention under different situations; FIG. 9 is a flowchart of an external-light-interference elimination method of the present invention; FIG. 10A is another example timing diagram of the present invention; FIG. 10B is a flowchart of another external-light-interference elimination method of the present invention; FIG. 11A is a real environmental image; FIG. 11B, FIG. 11C, and FIG. 11D are sampling examples of FIG. 11A; and FIG. 12A, FIG. 12B, FIG. 12C, FIG. 12D, FIG. 12E, and FIG. 12F show the sampling situations of different subframes within the same frame.
The following describes embodiments of the present invention in more detail with reference to the drawings and reference numerals, so that those skilled in the art can practice the invention after studying this specification.
The present invention provides a lidar system with an external-light-interference elimination function, and an external-light-interference elimination method for that system. By projecting a large-area light spot pattern through a diffractive optical element (DOE), a large-area image can be acquired after one or a few pulse scans, without back-and-forth scanning, thereby greatly increasing the frame rate and effectively eliminating the impact of external light interference.
Referring to FIG. 1, the present invention provides a lidar system 100 including a microcontroller (MCU) 101, a laser light source (TX) 102, a lens module 106, and a receiver (RX) 112. The lens module 106 includes a receiver lens module 108 and a laser beam-splitter module 110. The laser light source 102 and the receiver 112 are coupled to the microcontroller 101.
To measure the distance between the target 120 and the lidar system 100, the microcontroller 101 first controls the laser light source 102 to emit laser light 104. The laser beam-splitter module 110 then scatters the laser light 104 into a plurality of light spots distributed within a field of image (FOI) 122 that fully covers the target 120. Upon striking the target 120, the light spots are reflected as a plurality of reflected beams 126 distributed within a field of view (FOV) 124. The receiver lens module 108 receives the reflected light 126 and passes the reflected light signal to the receiver 112, which forwards the received signal to the microcontroller 101 for subsequent image analysis.
Referring to FIG. 2, the receiver lens module 108 of FIG. 1 includes a lens module composed of at least one concave lens 202 and at least one convex lens 204, which together form a condensing lens set that focuses the reflected light 126 of FIG. 1 so as to deliver the optical signal to the receiver 112. The laser beam-splitter module 110 of FIG. 1 includes a diffractive optical element (DOE) 206, a concave mirror 208, and a collimator lens set 210. The operation of the laser beam-splitter module 110 is detailed below.
Referring to FIG. 3, when laser light 302 strikes the diffractive optical element 304, the element diffracts it into thousands to tens of thousands of light spots. These spots form point clouds 306a, 306b, and 306c at different distances: point cloud 306a, closest to the diffractive optical element 304, has the densest spots and the smallest coverage area, while point cloud 306c, farthest from the element, has the sparsest spots and the largest coverage area. The diffractive optical element 304 may be, for example, the HCPDOE™ from Tyrafos, but the present invention is not limited thereto.
Since the point cloud coverage area shown in FIG. 3 is proportional to the square of the distance, the coverage area expands rapidly at longer distances, reducing the light energy per unit area and thus the reflected light intensity. Greatly increasing the intensity of the laser light 302, however, would shorten device lifetime and could harm human eyes. Therefore, referring to FIG. 4, a variable-focal-length lens module composed of at least one concave lens 202 and at least one convex lens 204 can adjust the field-of-view size according to the detection range, so that the light energy per unit area is roughly equal at different distances (for example 15, 40, 100, 200, and 300 meters), preventing insufficient reflected light intensity at long range. Alternatively, a plurality of fixed-focal-length lens modules may be used, each including at least one concave lens 202 and at least one convex lens 204, switching between them according to the detection range to adjust the field-of-view size.
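The inverse-square relationship above can be made concrete with a small sketch. The cone model here (a uniform beam of half-angle θ illuminating a spot of radius d·tan θ at distance d) and both function names are my own simplifying assumptions, not the patent's optical design; the sketch only illustrates why the divergence angle must shrink as the detection range grows.

```python
import math


def energy_density(power_w, distance_m, half_angle_rad):
    """Power per unit area when a cone of half-angle theta illuminates
    a circular spot of radius d * tan(theta) at distance d."""
    radius = distance_m * math.tan(half_angle_rad)
    return power_w / (math.pi * radius ** 2)


def half_angle_for_density(power_w, distance_m, target_density):
    """Divergence half-angle needed to hold a target density at a given range."""
    radius = math.sqrt(power_w / (math.pi * target_density))
    return math.atan(radius / distance_m)


# Doubling the distance at a fixed angle quarters the density, so holding
# the density constant from 15 m out to 300 m requires a ~20x smaller
# spot radius angle (in the small-angle regime).
```

This is the trade the adjustable lens module or collimator lens set resolves: narrowing the field of image at long range instead of raising the laser power.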
One way to achieve the configuration shown in FIG. 4 is to use a collimator lens set to confine the coverage area of the diffracted light within a certain range. By varying the focal length, the collimator lens set adjusts the divergence angle of the outgoing light, tuning the field-of-image range of the projected light spots according to the detection range to achieve the effect shown in FIG. 4. A plurality of fixed-focal-length collimator lens sets may be used and switched according to the detection range, or a variable-focal-length collimator lens set may be used and adjusted according to the detection range. Referring to FIG. 5A, one configuration places the collimator lens set 502 directly in front of the diffractive optical element 504, with the mirror surface of the collimator lens set 502 perpendicular to the incident direction of the laser light 506. As shown in FIG. 5A, the collimator lens set 502 converges the diffracted light 508 emitted by the diffractive optical element 504 into roughly parallel beams, so that the light energy per unit area of the diffracted light 508 remains roughly equal at different distances. In one embodiment, the collimator lens set 502 includes a concave lens 5021 and a convex lens 5022, whose spacing is adjustable to control the divergence angle.
Referring to FIG. 5B, another configuration places the collimator lens set 512 in front of a concave mirror 520, which collects the diffracted light emitted by the diffractive optical element 514. As shown in FIG. 5B, the diffractive optical element 514 diffracts the laser light 516 into multiple diffracted beams 518, which are first reflected and converged by the concave mirror 520 and then incident on the collimator lens set 512. The collimator lens set 512 then converges the diffracted beams 518 a second time into roughly parallel beams, so that the light energy per unit area of the diffracted light 518 remains roughly equal at different distances. In one embodiment, the collimator lens set 512 includes a concave lens 5121 and a convex lens 5122, whose spacing is adjustable to control the divergence angle. Compared with the configuration of FIG. 5A, this configuration collects diffracted light over a larger angle, increasing the projected light energy per unit area without increasing the laser intensity.
In an autonomous-driving scenario, the interference signals that the lidar system 100 may receive while the vehicle is moving include scanning lasers from vehicles in the opposite lane ahead, forward-directed pulse lasers from vehicles in the opposite lane ahead, scanning lasers from vehicles in the same lane ahead, and rear-directed pulse lasers from vehicles in the same lane ahead. These interference signals must be excluded by appropriate means so that distance can be measured correctly and driving safety maintained.
When the laser light source 102 of FIG. 1 emits a pulse signal, the microcontroller 101 can, to exclude interference signals, enable or disable the receiver 112 according to the detection range, so that the receiver 112 receives only reflected light signals from within that range. For example, if the object to be measured is 300 meters away, the time from the laser light source 102 emitting the pulse to the receiver 112 receiving the reflected light signal is 2 μs (from d = ct/2, where d is the distance, c is the speed of light 3×10⁸ m/s, and t is the round-trip time in seconds). Therefore, within one cycle time, the receiver 112 can be enabled in synchronization with the laser light source 102 for a sensing time of 2 μs and disabled for the remainder, preventing reception of interference signals. Referring to FIG. 6, the laser light source (TX) emits a pulse signal of pulse width PW with cycle time T. The receiver (RX) opens during the sensor shutter time SS and closes during the reset time R within each cycle time T, where T = SS + R. The sensor shutter time SS and reset time R are determined by the detection range. In one embodiment, for a detection range of 300 meters, the sensor shutter time SS is 2 μs, the reset time R is 2 μs, the cycle time T is 4 μs, and the pulse width PW is 100 ns. The receiver (RX) then receives reflected light signals from 0 to 300 meters, and the theoretical frame rate (number of scans) can reach 1/T = 2.5×10⁵ f/s.
Referring to FIG. 7, in addition to the upper limit of the detection range, a lower limit can also be set for the receiver (RX) by adjusting the sensor shutter time SS. In FIG. 7, the start time Ts of the sensor shutter window is determined by the lower limit of the detection range, and the end time Tl by the upper limit. In one embodiment, for a detection range of 90 to 300 meters, the start time Ts is 600 ns, the end time Tl is 2 μs, the sensor shutter time SS is 1400 ns, the reset time R is 2 μs, the cycle time T is 4 μs, and the pulse width PW is 100 ns. The receiver (RX) then receives reflected light signals from 90 to 300 meters, and the theoretical frame rate is 1/T = 2.5×10⁵ f/s.
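The shutter window of FIG. 6 and FIG. 7 follows directly from the round-trip formula: the shutter opens at the round-trip time of the nearest range of interest and closes at that of the farthest. A minimal sketch (the function name is my own):

```python
C = 3.0e8  # speed of light, m/s


def shutter_window(range_min_m, range_max_m):
    """Return (Ts, Tl) in seconds for a detection range [range_min, range_max]:
    the shutter opens at Ts = 2 * range_min / c and closes at Tl = 2 * range_max / c."""
    ts = 2.0 * range_min_m / C
    tl = 2.0 * range_max_m / C
    return ts, tl


# 0-300 m window of FIG. 6: opens immediately, closes after 2 us.
print(shutter_window(0, 300))
# 90-300 m window of FIG. 7: opens at 600 ns, closes at 2 us (SS = 1400 ns).
print(shutter_window(90, 300))
```

Both embodiments in the text check out: (0 m, 300 m) gives (0, 2 μs), and (90 m, 300 m) gives (600 ns, 2 μs), i.e. a 1400 ns shutter time.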
To remove interference signals received while the receiver (RX) is enabled, in an environmental image comprising a plurality of sampling blocks, the microcontroller 101 can compare how the same sampling block receives signals across adjacent subframes and discard outliers. Referring to FIG. 8, in one embodiment each frame includes three subframes, each captured during one cycle time T and comprising a plurality of sampling blocks. The first subframe is computed by the microcontroller from the first reflected light signal received by the receiver (RX) during the first sensor shutter time; the second subframe from the second reflected light signal received during the second sensor shutter time; and the third subframe from the third reflected light signal received during the third sensor shutter time. Each sampling block comprises a plurality of pixels, and each reflected light signal yields a distance value at each pixel according to its time of flight (ToF). The microcontroller averages the distance values of those pixels to obtain the average distance value representing the sampling block. In the environmental image, the microcontroller compares the average distance values of each subframe of the frame over the sampling blocks at the same positions. In situation 801, the average distance values of the three subframes over the sampling blocks at the same positions are similar, indicating that the three reflected light signals come from similar distances. Situation 801 is therefore treated as normal, and the microcontroller 101 fuses the distance values represented by the reflected light signals of the three subframes to compute the final distance value of the frame; here, "fusion" may be performed by averaging, superposition, selecting one value, or other means. In situation 802, the average distance values of the reflected light signals of the first and second subframes differ over the sampling blocks at the same positions, indicating that at least one of the two subframes received an interference signal. The third subframe is then compared with the second: if their average distance values are similar, the second and third subframes are judged normal and the first abnormal. As shown in FIG. 8, in situation 802 the second and third subframes are normal and the first is abnormal. Situation 802 is still treated as normal, but the final distance value is computed using only the reflected light signals of the second and third subframes, discarding that of the first. In situation 803, the average distance values of the first and second subframes differ, and those of the second and third subframes also differ. Although the average distance values of the first and third subframes are similar, there are no two consecutive subframes with similar signals, so it cannot be determined which signal is normal and which abnormal. Situation 803 is therefore treated as abnormal and the frame is discarded. In situation 804, the average distance values of all three subframes differ; since it cannot be determined which signal is normal and which abnormal, situation 804 is treated as abnormal and the frame is discarded. This method uses few subframes and yields lower resolution (since at most two subframes are fused) but compares quickly, making it suitable for fast dynamic detection scenarios such as foreground or background detection while a car is moving.
Referring to FIG. 9, in one embodiment, method 900 implements the decision flow for the situations shown in FIG. 8. In step 902, the receiver acquires the reflected light signal of the first subframe. In step 904, the receiver acquires the reflected light signal of the second subframe. In step 906, the microcontroller checks whether the average distance values of the reflected light signals of the first and second subframes over the sampling blocks at the same positions are similar. If so, in step 908 the microcontroller fuses the distance values represented by the reflected light signals of the first and second subframes into the final distance value of the frame, and the flow ends; "fusion" may be performed by averaging, superposition, selecting one value, or other means. If not, in step 910 the receiver acquires the reflected light signal of the third subframe. In step 912, the microcontroller checks whether the average distance values of the reflected light signals of the second and third subframes over the sampling blocks at the same positions are similar. If so, in step 914 the microcontroller fuses the distance values represented by the reflected light signals of the second and third subframes into the final distance value of the frame, and the flow ends. If not, in step 916 the signals of the frame are discarded and the flow ends.
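The decision flow of method 900 can be sketched as follows. This is an illustration under stated assumptions: the patent does not specify what counts as "similar," so the `tolerance` threshold is my own parameter, and averaging is just one of the allowed fusion choices.

```python
def fuse(values):
    """One possible 'fusion': averaging (superposition or selection are also allowed)."""
    return sum(values) / len(values)


def frame_distance(d1, d2, d3, tolerance=1.0):
    """Sketch of FIG. 9: d1..d3 are the average distance values of the three
    subframes over the same sampling blocks. Returns the frame's final
    distance value, or None when the frame is discarded (step 916).
    `tolerance` (meters) is an assumed similarity threshold."""
    if abs(d1 - d2) <= tolerance:    # step 906 -> step 908
        return fuse([d1, d2])
    if abs(d2 - d3) <= tolerance:    # step 912 -> step 914
        return fuse([d2, d3])
    return None                      # step 916: discard the frame


print(frame_distance(100.0, 100.2, 100.1))  # situation 801: normal
print(frame_distance(250.0, 100.0, 100.2))  # situation 802: first subframe abnormal
print(frame_distance(100.0, 250.0, 100.0))  # situation 803: frame discarded (None)
```

Note that situations 803 and 804 both fall through to `None`, matching FIG. 8: without two consecutive similar subframes, no signal can be confirmed as normal.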
Please refer to FIG. 10A. In one embodiment, a plurality of subframes (preferably at least six) may be used in one frame to distinguish normal signals from abnormal signals more accurately. In the example shown in FIG. 10A, in an environmental image comprising a plurality of sampling blocks, on the same sampling block the reflected light signal of the fifth subframe falls at a position within the sensor shutter time SS different from that of the remaining subframes (i.e., its time of flight differs), so the average distance value of the fifth subframe differs from those of the remaining subframes. The microcontroller then eliminates the fifth subframe and fuses the average distance values of the remaining subframes into the final distance value of the frame. Because the object moves, in each sampling block the signal reflected into the sensor in each subframe does not necessarily land on the same pixel. Therefore, in this embodiment, "fusion" may include superposing, within the same sampling block, all signals appearing on different pixels of the non-eliminated subframes, and then averaging each superposed pixel.
For example, in one embodiment, Tables 1A to 1F below respectively represent the first to sixth subframes of the same sampling block within the same frame, where each small square represents a pixel: a square with a value represents the sub-distance value measured by that pixel in that subframe, while a square without a value represents no sub-distance value. For each subframe, the pixels with sub-distance values are averaged, and abnormal subframes are eliminated according to the average distance value of each subframe. As Tables 1A to 1F show, the average distance value of the sixth subframe clearly differs from those of the other subframes, so the sixth subframe is treated as an outlier and eliminated. Then, as shown in Table 1G, each pixel of the normal subframes (the first to fifth subframes, i.e., Tables 1A to 1E) is superposed, and the average of each superposed pixel with a sub-distance value is taken as the final distance value of this sampling block in this frame. During superposition, if the same pixel has a plurality of sub-distance values, they are averaged or the maximum is selected; if the pixel has no sub-distance value, the minimum of the measurement range (e.g., 0) is selected. Alternatively, if the same pixel has a plurality of sub-distance values, they are averaged or the minimum is selected; if the pixel has no sub-distance value, the maximum of the measurement range (e.g., 500 or 1000) is selected. This method uses more subframes and yields a higher resolution (because a plurality of subframes are fused), and is suitable for non-fast dynamic detection scenarios.
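The per-block superposition described above can be sketched as follows. This is a minimal illustration under assumed data structures: each subframe of a sampling block is represented as a dict mapping pixel coordinates to a sub-distance value, with valueless pixels simply absent. Pixels holding several sub-distance values are averaged here; as noted above, the patent also allows selecting the maximum (or minimum) instead.

```python
def fuse_block(subframes):
    """Superpose the non-eliminated subframes of one sampling block and
    average each pixel's collected sub-distance values."""
    stacked = {}
    for sub in subframes:
        for pixel, dist in sub.items():
            stacked.setdefault(pixel, []).append(dist)
    return {pixel: sum(vals) / len(vals) for pixel, vals in stacked.items()}

# Example: two normal subframes of the same block; because the object
# moves, the echo lands on slightly different pixels in each subframe.
sub1 = {(0, 0): 100.0, (0, 1): 102.0}
sub2 = {(0, 1): 104.0, (1, 1): 101.0}
print(fuse_block([sub1, sub2]))
# Pixel (0, 1) holds two sub-distance values and is averaged to 103.0.
```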
Please refer to FIG. 10B. Method 1000 implements the determination flow for the scenario shown in FIG. 10A. In step 1002, the microcontroller samples a plurality of sampling blocks in the environmental image, each sampling block comprising a plurality of pixels, where the total number of pixels in the sampling blocks is no more than 10% of the number of pixels in the environmental image and the number of sampling blocks is at least five. In step 1004, for the same sampling block, a reflected light signal is obtained within each sensor shutter time of a plurality of subframes of a frame, and the distance value represented by each pixel in the sampling block is calculated from the time of flight of that reflected light signal. In step 1006, the average of all pixel distance values in the sampling block is calculated to represent the average distance value of the sampling block. In step 1008, among the subframes of the sampling blocks at the same positions, the microcontroller eliminates subframes whose average distance values are significantly different (abnormal subframes). In step 1010, the microcontroller fuses the pixel distance values of the environmental image of the non-eliminated subframes (normal subframes) into the final distance value of the frame.
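Steps 1006 through 1010 can be combined into one end-to-end sketch for a single sampling block. This is an illustration under stated assumptions: `pixel_distances[s]` is assumed to hold the per-pixel distance values computed from subframe s's time of flight, and "significantly different" is interpreted here as deviating from the median block average by more than an assumed tolerance `tol`; the patent leaves the exact criterion open.

```python
from statistics import median

def block_final_distance(pixel_distances, tol=5.0):
    """pixel_distances: one dict per subframe, mapping pixel -> distance."""
    # Step 1006: average distance per subframe for this sampling block.
    averages = [sum(px.values()) / len(px) for px in pixel_distances]
    med = median(averages)
    # Step 1008: eliminate abnormal subframes (far from the median average).
    normal = [px for px, avg in zip(pixel_distances, averages)
              if abs(avg - med) <= tol]
    # Step 1010: fuse the remaining subframes pixel by pixel (by averaging).
    fused = {}
    for px in normal:
        for pixel, d in px.items():
            fused.setdefault(pixel, []).append(d)
    return {pixel: sum(v) / len(v) for pixel, v in fused.items()}

# Example: three subframes of one block; the third is disturbed and is
# eliminated before fusion.
print(block_final_distance([{(0, 0): 100.0}, {(0, 0): 102.0}, {(0, 0): 30.0}]))
```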
FIG. 11A is a real environmental image. To improve computational efficiency, it is not necessary to measure the distance of every pixel in the entire image; instead, several blocks can be sampled for distance measurement. Each sampling block comprises a plurality of pixels, for example 10×10 pixels. The number of sampled pixels should not be too large, for example no more than 10% of the total number of pixels, to improve computational efficiency. FIG. 11B shows an embodiment sampling two blocks. FIG. 11C shows an embodiment sampling five blocks. FIG. 11D shows an embodiment sampling nine blocks. The number of sampling blocks should not be fewer than five, so as to better capture the environmental information. A subframe is regarded as normal if its proportion of sampling blocks with normal distance values exceeds a specific ratio (for example 80% or 88.9%, where 80% means that when five blocks are sampled, one block is allowed to have an abnormal distance value, and 88.9% means that when nine blocks are sampled, one block is allowed to have an abnormal distance value); otherwise it is regarded as an abnormal subframe. The microcontroller fuses the distance values of the pixels of the environmental images of the plurality of subframes into the final distance value of the frame.
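The block-ratio test above can be sketched as follows. This is a minimal illustration; the function name and the boolean per-block representation are assumptions, and how a block's distance value is judged "normal" is left to the surrounding method.

```python
def subframe_is_normal(block_flags, threshold):
    """block_flags: one bool per sampling block (True = normal distance value).
    The subframe is normal when the fraction of normal blocks reaches
    the threshold."""
    return sum(block_flags) / len(block_flags) >= threshold

# Five blocks, one abnormal: 4/5 = 80%, so the subframe still counts as normal.
print(subframe_is_normal([True, True, True, True, False], 0.8))
# Nine blocks, two abnormal: 7/9 is below 88.9%, so the subframe is abnormal.
print(subframe_is_normal([True] * 7 + [False] * 2, 8 / 9))
```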
Please refer to FIG. 12A to FIG. 12F. In an embodiment in which a frame comprises six subframes, the six subframes are, in order, FIG. 12A, FIG. 12B, FIG. 12C, FIG. 12D, FIG. 12E, and FIG. 12F, where the second subframe (FIG. 12B) and the sixth subframe (FIG. 12F) suffer external light intrusion. To filter out the disturbed subframes effectively, the distance values measured by each pixel within each sampling block of each subframe can be fused into the sub-distance value of that sampling block for that subframe, and the six sub-distance values of the same sampling block over the six subframes are then compared with one another to filter out outliers. In one embodiment, outliers are filtered by calculating the mean (μ), standard deviation (σ), upper threshold, and lower threshold of the six sub-distance values of the same sampling block over the six subframes, where the upper threshold is the mean plus a number of standard deviations (μ+nσ) and the lower threshold is the mean minus a number of standard deviations (μ−nσ). The value of n is determined by experimental data and practical needs and may be an integer or a non-integer, for example (but not limited to) 1 or 1.5. In the embodiments shown in Tables 2, 3, and 4 below, n=1 is used as an example, but the invention is not limited thereto. Then, the subframes whose sub-distance values exceed the upper threshold or fall below the lower threshold are eliminated, and the remaining subframes with similar sub-distance values are fused into the final distance value of the frame.
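The μ±nσ filter described above can be sketched as follows for one sampling block. This is an illustration, not the claimed implementation: it uses the population standard deviation (the patent does not say whether the population or sample form is intended), and n=1 follows the worked examples in Tables 2 through 4.

```python
from statistics import mean, pstdev

def filter_outliers(sub_distances, n=1.0):
    """Return the indices of the subframes kept for this sampling block:
    those whose sub-distance value lies within [mu - n*sigma, mu + n*sigma]."""
    mu = mean(sub_distances)
    sigma = pstdev(sub_distances)           # population standard deviation
    lower, upper = mu - n * sigma, mu + n * sigma
    return [i for i, d in enumerate(sub_distances)
            if lower <= d <= upper]

# Six sub-distance values for one block; subframes 2 and 6 (indices 1 and 5)
# are disturbed by external light and fall below the lower threshold.
print(filter_outliers([100.0, 30.0, 101.0, 99.0, 100.0, 20.0]))
```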
Tables 2, 3, and 4 show possible sensing results. In the example shown in Table 2, in the first subframe there is no obstacle in front of sampling block A, so the distance of sampling block A is taken as the farthest distance (e.g., 500 meters). In the second subframe, external light intrudes into sampling block A, and in the sixth subframe, external light intrudes into sampling blocks A, B, C, D, and E. As Table 2 shows, the distance values of the second and sixth subframes of sampling block A fall below the lower threshold and should therefore be treated as outliers and filtered out. The distance values of the sixth subframes of sampling blocks B, C, D, and E all fall below their individual lower thresholds and should likewise be treated as outliers and filtered out.
In the example shown in Table 3, in the fourth subframe external light intrudes into sampling block A and the measured distance is very close to the normal value, while in the sixth subframe external light intrudes into sampling blocks A, B, C, D, and E. As Table 3 shows, the distance value of the fourth subframe of sampling block A exceeds the upper threshold and that of the sixth subframe falls below the lower threshold, so both should be treated as outliers and filtered out. The distance values of the sixth subframes of sampling blocks B, C, D, and E all fall below their individual lower thresholds and should likewise be treated as outliers and filtered out. Thus, even though the distances measured by sampling block A in the fourth and sixth subframes are very close to the normal values, those two subframes can still be correctly identified as outliers and filtered out.
In the example shown in Table 4, in the sixth subframe external light intrudes into sampling blocks B, C, D, and E. As Table 4 shows, the distance values of the sixth subframes of sampling blocks B, C, D, and E all fall below their individual lower thresholds and should therefore be treated as outliers and filtered out.
The above describes only preferred embodiments intended to explain the present invention and is not intended to limit the invention in any form. Any modification or variation of the present invention made within the same spirit of the invention shall still fall within the scope of protection intended by the present invention.
100: LiDAR system
101: Microcontroller
102: Laser light source
104: Laser light
106: Lens module
108: Receiver lens module
110: Laser beam splitter module
112: Receiver
120: Target
122: Image field
124: Field of view
126: Reflected light
Claims (12)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US18/230,315 US20240045032A1 (en) | 2022-08-05 | 2023-08-04 | LiDAR SYSTEM AND CROSSTALK REDUCTION METHOD THEREOF |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202263395347P | 2022-08-05 | 2022-08-05 | |
| US63/395,347 | 2022-08-05 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| TW202407379A TW202407379A (en) | 2024-02-16 |
| TWI876338B true TWI876338B (en) | 2025-03-11 |
Family
ID=89753737
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| TW112117055A TWI858694B (en) | 2022-08-05 | 2023-05-08 | Lidar system and resolusion improvement method thereof |
| TW112117043A TWI876338B (en) | 2022-08-05 | 2023-05-08 | Lidar system and crosstalk reduction method thereof |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| TW112117055A TWI858694B (en) | 2022-08-05 | 2023-05-08 | Lidar system and resolusion improvement method thereof |
Country Status (2)
| Country | Link |
|---|---|
| CN (2) | CN117518184A (en) |
| TW (2) | TWI858694B (en) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN107710015A (en) * | 2015-07-03 | 2018-02-16 | 松下知识产权经营株式会社 | Distance measuring device and distance image synthesis method |
| US20200072946A1 (en) * | 2018-08-29 | 2020-03-05 | Sense Photonics, Inc. | Glare mitigation in lidar applications |
| WO2021067377A1 (en) * | 2019-10-01 | 2021-04-08 | Sense Photonics, Inc. | Strobe based configurable 3d field of view lidar system |
| US11287517B2 (en) * | 2019-04-19 | 2022-03-29 | Sense Photonics, Inc. | Single frame distance disambiguation |
| TW202215833A (en) * | 2020-10-05 | 2022-04-16 | 美商葵欣實驗室股份有限公司 | Vision based light detection and ranging system using dynamic vision sensor |
| US20220187471A1 (en) * | 2020-08-24 | 2022-06-16 | Innoviz Technologies Ltd. | Lidar system with variable resolution multi-beam scanning |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN107567592B (en) * | 2015-04-07 | 2021-07-16 | 闪光股份有限公司 | Small lidar system |
| DE102015217908A1 (en) * | 2015-09-18 | 2017-03-23 | Robert Bosch Gmbh | lidar |
| US10761195B2 (en) * | 2016-04-22 | 2020-09-01 | OPSYS Tech Ltd. | Multi-wavelength LIDAR system |
| CN105911559A (en) * | 2016-06-02 | 2016-08-31 | 中国科学院光电研究院 | Laser radar system based on visible light-near infrared-short wave infrared bands |
| EP3343246A1 (en) * | 2016-12-30 | 2018-07-04 | Xenomatix NV | System for characterizing surroundings of a vehicle |
| CA3239810A1 (en) * | 2019-03-08 | 2020-09-17 | Leddartech Inc. | Method, system and computer readable medium for evaluating influence of an action performed by an external entity |
2023
- 2023-05-08 CN CN202310510239.2A patent/CN117518184A/en active Pending
- 2023-05-08 TW TW112117055A patent/TWI858694B/en active
- 2023-05-08 CN CN202310513142.7A patent/CN117518185A/en active Pending
- 2023-05-08 TW TW112117043A patent/TWI876338B/en active
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN107710015A (en) * | 2015-07-03 | 2018-02-16 | 松下知识产权经营株式会社 | Distance measuring device and distance image synthesis method |
| US20200072946A1 (en) * | 2018-08-29 | 2020-03-05 | Sense Photonics, Inc. | Glare mitigation in lidar applications |
| US11287517B2 (en) * | 2019-04-19 | 2022-03-29 | Sense Photonics, Inc. | Single frame distance disambiguation |
| WO2021067377A1 (en) * | 2019-10-01 | 2021-04-08 | Sense Photonics, Inc. | Strobe based configurable 3d field of view lidar system |
| US20220187471A1 (en) * | 2020-08-24 | 2022-06-16 | Innoviz Technologies Ltd. | Lidar system with variable resolution multi-beam scanning |
| TW202215833A (en) * | 2020-10-05 | 2022-04-16 | 美商葵欣實驗室股份有限公司 | Vision based light detection and ranging system using dynamic vision sensor |
Also Published As
| Publication number | Publication date |
|---|---|
| CN117518185A (en) | 2024-02-06 |
| TW202407379A (en) | 2024-02-16 |
| CN117518184A (en) | 2024-02-06 |
| TW202407380A (en) | 2024-02-16 |
| TWI858694B (en) | 2024-10-11 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10754036B2 (en) | Scanning illuminated three-dimensional imaging systems | |
| US10782392B2 (en) | Scanning optical system and light projecting and receiving apparatus | |
| EP3963368B1 (en) | Temporal jitter in a lidar system | |
| CN103975250B (en) | The spatial selectivity utilizing dynamic mask in the plane of delineation detects | |
| US20150204977A1 (en) | Object detection device and sensing apparatus | |
| BE1025547B1 (en) | System for characterizing the environment of a vehicle | |
| EP3179273A1 (en) | Light detection and ranging (lidar) imaging systems and methods | |
| KR102020037B1 (en) | Hybrid LiDAR scanner | |
| EP2824418A1 (en) | Surround sensing system | |
| JP2015178975A (en) | Object detection device and sensing device | |
| KR102650443B1 (en) | Fully waveform multi-pulse optical rangefinder instrument | |
| US12345836B2 (en) | Filtering measurement data of an active optical sensor system | |
| EP3206074B1 (en) | Scanning optical system and light projection and reception device | |
| US20210274160A1 (en) | Vehicular camera testing using a staggered target | |
| TWI876338B (en) | Lidar system and crosstalk reduction method thereof | |
| EP3428687B1 (en) | A vision system and vision method for a vehicle | |
| US20240045068A1 (en) | LiDAR SYSTEM AND RESOLUSION IMPROVEMENT METHOD THEREOF | |
| CN113447947A (en) | Device and method for generating scene data | |
| CN113597569B (en) | LiDAR system with holographic imaging optics | |
| US20240045032A1 (en) | LiDAR SYSTEM AND CROSSTALK REDUCTION METHOD THEREOF | |
| CN112924987A (en) | Laser light field visualization device and method based on InGaAs camera | |
| JP7785576B2 (en) | LiDAR device and LiDAR device control method | |
| JP2023101803A (en) | Scanning device and distance-measuring device | |
| JP2025156814A (en) | Optical and moving devices | |
| WO2025069291A1 (en) | Imaging system, electromagnetic wave irradiation system, measurement method, and program |