WO2018177026A1 - Device and method for determining a road edge - Google Patents
- Publication number
- WO2018177026A1 (PCT/CN2018/075130)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- road
- road edge
- vehicle
- curve
- target
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
Definitions
- The invention belongs to the technical field of intelligent vehicles and relates to an apparatus and a method for determining a road edge based on stationary targets beside the road edge.
- Autonomous driving is an important direction in the development of smart cars, and more and more vehicles are equipped with automatic driving systems to realize autonomous driving functions.
- An automatic driving system must be able to determine the vehicle's travelable area at any time. An important part of determining the travelable area is determining the road edge of the road currently being traveled.
- Conventionally, an automatic driving system determines the road edge from images containing lane lines collected by an image sensor (for example, a camera mounted on the vehicle); that is, the road edge is determined by processing the lane lines in images acquired in real time.
- This lane-line-based technique for determining the road edge has at least the following problems:
- when the lane line is partially missing or completely absent, the determined road edge deviates significantly from the real road edge, or cannot be determined at all;
- the technique is implemented with an image sensor, and the amount of information an image sensor captures differs between the close-range and long-range portions of an image;
- resolution near the center of the image sensor's lens differs from that at the border area of the lens, which easily leads to poor recognition of lane lines at long distances. That is, determination or detection of the road edge at a long distance (relative to the vehicle) is inaccurate.
- To address at least one aspect of the above technical problems, the present invention provides the following technical solutions.
- an apparatus for determining a road edge comprising:
- a radar detector mounted on the vehicle, capable of detecting at least stationary targets beside a road edge of the road on which the vehicle is located;
- a processing component configured to receive the stationary targets detected by the radar detector and to extract arrangement information of stationary targets that are substantially regularly arranged relative to the road, thereby obtaining road edge information based on the arrangement information.
- a method for determining a road edge, comprising: detecting stationary targets beside a road edge of the road on which the vehicle is located; selecting reference targets from the stationary targets; curve-fitting the reference targets to obtain a reference target arrangement curve; and estimating the road edge curve based on the arrangement curve.
- a vehicle provided with an automatic driving system in which any of the above-described devices for determining a road edge is installed.
- Figure 1 is a block diagram showing the construction of a device for determining a road edge in accordance with an embodiment of the present invention.
- FIG. 2 is a schematic diagram of an application scenario of the apparatus of the embodiment shown in FIG. 1 when determining a road edge.
- FIG. 3 is a flow chart of a method of determining a road edge in accordance with an embodiment of the present invention.
- The apparatus of the embodiment of the present invention and its working principle are exemplified below with reference to FIG. 1 and FIG. 2.
- A device for determining a road edge (hereinafter simply referred to as the "determining device") is mounted on the vehicle 100. The specific type of the vehicle 100 is not limited; relative to the determining device, the vehicle 100 is the host vehicle.
- The determining device can be applied to an automatic driving system installed in the vehicle 100.
- the vehicle 100 is traveling on a road 900 having corresponding road edges 901a and 901b, wherein 901a is the left road edge and 901b is the right road edge.
- The road edges 901a and 901b are not clearly identified by lane lines, or there are no corresponding lane lines in the illustrated section of road 900 to identify the road edges.
- There are various objects on both sides of the road 900 that are stationary relative to the road; they are the targets the determining device detects and are therefore also referred to as "stationary targets". By way of example, stationary targets beside the left road edge 901a of the road 900 are shown: a tree 801, a utility pole 802, and isolation piers 803 (three isolation piers 803a, 803b, and 803c are shown). It should be understood that the stationary targets beside the road edge are not limited to the object types of this example; they can also be, for example, fences or sign poles.
- The determining device primarily includes a radar detector 110 mounted on the vehicle 100, which is capable of detecting at least stationary targets on at least one side of the road 900 on which the vehicle 100 is located.
- The radar detector 110 is a millimeter wave radar mounted at the front end of the vehicle 100, capable of detecting various objects ahead on the road plane within a 90° detection angle range, including, for example, the stationary targets beside the road edge 901a shown in FIG. 2.
- During detection, the radar detector 110 emits electromagnetic waves of a certain wavelength and receives reflections from objects ahead, so the positions of various objects can be detected. In particular, objects at a long distance (for example, 40 meters or more) can be detected about as accurately as close objects (relative to the image sensor 120); the radar therefore has better long-range sensing characteristics than the image sensor 120.
- A vehicle coordinate system, that is, an XY coordinate system, may be defined in the determining device: the center of mass of the vehicle 100 is the origin O; the X axis is the longitudinal (forward) direction of the vehicle 100, with the X coordinate being the longitudinal distance from the vehicle's center of mass; and the Y axis is the lateral direction of the vehicle 100, with the Y coordinate being the lateral offset from the vehicle's center of mass.
- When the radar detector 110 detects various objects (including stationary objects), the coordinates (X, Y) of each object are substantially determined: the X coordinate represents the object's longitudinal distance from the centroid of the vehicle 100 in the vehicle coordinate system, and the Y coordinate represents its lateral offset from the centroid (i.e., the offset along the Y axis).
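As a minimal sketch of the coordinate convention above, a raw radar detection (range, azimuth) can be converted into the vehicle XY coordinate system; the mounting offset between the front-mounted radar and the vehicle's center of mass is an assumed illustrative value, not taken from the text:

```python
import math

def radar_to_vehicle_xy(rng, azimuth_rad, mount_offset_x=3.8):
    """Convert a radar detection (range in metres, azimuth in radians)
    into the vehicle coordinate system described above: X longitudinal
    (forward) from the centre of mass, Y lateral.
    mount_offset_x (distance from the centre of mass to the radar at the
    front end of the vehicle) is an assumed value."""
    x = mount_offset_x + rng * math.cos(azimuth_rad)
    y = rng * math.sin(azimuth_rad)
    return x, y
```

A target 10 m dead ahead of the radar would then appear at X = 13.8 m, Y = 0 under this assumed mounting offset.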
- The millimeter wave radar is configured to determine stationary targets, i.e., objects that are stationary relative to the road edge 901, from among the detected objects based on the Doppler effect and the speed of the host vehicle. The millimeter wave radar can therefore output information about stationary targets (for example, their coordinates in the vehicle coordinate system) in substantially real time.
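The Doppler-based classification described above can be sketched as follows: for a target that is stationary relative to the road, the radial speed the radar measures is just the host vehicle's own speed projected onto the line of sight. The sign convention (negative Doppler for closing targets) and the noise tolerance are assumptions, not specified in the text:

```python
import math

def is_stationary(doppler_speed_mps, azimuth_rad, ego_speed_mps, tol=0.5):
    """Classify a radar detection as stationary relative to the road.

    For a stationary object, the measured radial (Doppler) speed equals
    the projection of the host vehicle's speed onto the line of sight,
    here taken as -v_ego * cos(azimuth) (closing targets negative).
    `tol` (m/s) absorbs measurement noise; its value is an assumption.
    """
    expected = -ego_speed_mps * math.cos(azimuth_rad)
    return abs(doppler_speed_mps - expected) <= tol
```

A target dead ahead while driving at 20 m/s should show about -20 m/s of Doppler speed if it is stationary; anything substantially different is a moving object and is excluded.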
- When a millimeter wave radar is used, the radar detector 110 has the advantage of relatively low cost while still accurately detecting stationary targets at long distances (for example, 40 meters or more). It should be understood, however, that the radar detector 110 is not limited to a millimeter wave radar; lidar, for example, is also possible. Lidar detects various stationary targets (including distant ones) relatively more accurately, but is relatively expensive and demands more data processing capability from the subsequent processing component 130.
- The determining device further includes a processing component 130 also disposed on the vehicle 100, which may be implemented by a processing device of the automatic driving system on the vehicle 100 or may be set up independently of the automatic driving system.
- The processing component 130 can run the algorithm code stored in it and execute instructions from the automatic driving system or the vehicle; the specific hardware implementation of the processing component 130 is known and will not be described in detail herein.
- The processing component 130 mainly processes the stationary target information transmitted by the radar detector 110 to obtain a road edge curve. It is configured to receive the stationary targets detected by the radar detector 110 and to extract arrangement information of stationary targets that are substantially regularly arranged relative to the road, thereby obtaining road edge information based on the arrangement information. The road edge information can be expressed as road edge curve information.
- The specific working principle of the processing component 130 is exemplified below, taking the road edge curve of the left road edge 901a shown in FIG. 2 as an example.
- The number of stationary targets transmitted by the radar detector 110 after one scan may be on the order of several tens. Therefore, a screening unit 131 is provided in the processing component 130, which selects at least three stationary targets as reference targets from among the many stationary targets transmitted by the radar detector 110.
- From the many stationary targets, the trees 801, utility poles 802, and isolation piers 803 that are substantially regularly arranged relative to the road 900 can be selected as reference targets for the left road edge 901a; stationary objects such as other trees and poles that are not regularly arranged relative to the left road edge 901a are not selected as reference targets, i.e., they are filtered out.
- Since both sides of the road 900 generally have objects that are substantially regularly arranged relative to the road 900, such as the trees 801 and utility poles 802, whether a stationary target is substantially regularly arranged relative to the road 900 can be judged as follows: based on the current yaw rate of the vehicle 100 (which can be acquired, for example, from components such as the steering system of the vehicle 100), a predicted travel trajectory roughly corresponding to the current road curve is obtained; whether a stationary target is substantially regularly arranged relative to the road 900 is then determined substantially from whether it is regularly arranged relative to this predicted travel trajectory, and the corresponding stationary targets can be screened as reference targets.
- The regularly arranged trees 801, utility poles 802, isolation piers 803, and the like beside the left road edge 901a are determined by the screening unit 131 to be substantially regularly arranged relative to the predicted travel trajectory; therefore, at least three of them are used as reference targets. For example, three or more trees 801 are selected as reference targets, or the isolation piers 803a, 803b, and 803c are selected, or several trees 801 together with one utility pole 802 and one isolation pier 803a are selected.
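The screening step can be sketched as follows. The predicted travel trajectory is approximated here as a circular arc whose curvature follows from yaw rate and speed; a target counts as "regularly arranged" if its lateral distance from that trajectory falls inside a roadside corridor. The corridor width is an assumed illustrative value, not taken from the text:

```python
def screen_reference_targets(targets, ego_speed, yaw_rate, band=(1.0, 6.0)):
    """Select stationary targets that are roughly regularly arranged
    relative to the predicted travel trajectory (screening unit 131).

    targets: list of (X, Y) in the vehicle coordinate system.
    The trajectory is approximated as an arc with curvature
    yaw_rate / ego_speed, whose lateral offset at X is
    0.5 * curvature * X**2.  A target whose lateral distance from the
    trajectory lies inside `band` (metres; an assumed left-roadside
    corridor) is kept.  At least three reference targets are required.
    """
    curvature = yaw_rate / ego_speed if ego_speed > 1e-3 else 0.0
    refs = []
    for x, y in targets:
        lateral = y - 0.5 * curvature * x * x  # deviation from trajectory
        if band[0] <= lateral <= band[1]:
            refs.append((x, y))
    return refs if len(refs) >= 3 else []
```

On a straight road (zero yaw rate) this simply keeps targets whose lateral offset Y lies within the assumed corridor and discards, for instance, objects on the opposite side of the vehicle.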
- The processing component 130 is provided with a target curve fitting unit 132 configured to curve-fit at least three reference targets in the vehicle coordinate system to obtain a corresponding reference target arrangement curve.
- The reference target arrangement curve to be obtained is defined in advance by a quadratic function, namely the following relation (1):
- Y = C2·X² + C1·X + C0'    (1)
- where X is the independent variable, corresponding to the X coordinate in the vehicle coordinate system, defined as the longitudinal distance from the vehicle's centroid;
- Y is the dependent variable, corresponding to the Y coordinate in the vehicle coordinate system, defined as the lateral offset from the vehicle's centroid;
- C2 is the quadratic coefficient,
- C1 is the linear (first-order) coefficient, and
- C0' is the constant term.
- The coordinates of the reference targets are substituted into the quadratic relation (1), and the values of the quadratic coefficient C2, the linear coefficient C1, and the constant term C0' are calculated, thus determining relation (1), that is, the reference target arrangement curve.
- Alternatively, the turning radius of the vehicle may be calculated from the vehicle's current yaw rate, from which the quadratic coefficient C2 in relation (1) is computed; relation (1) then reduces to a linear relation in the remaining unknowns. The values of the linear coefficient C1 and the constant term C0' can then be calculated, again determining relation (1), that is, the reference target arrangement curve.
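The fitting of relation (1) can be sketched as an ordinary least-squares fit of Y = C2·X² + C1·X + C0' through the reference targets, here solved via the 3×3 normal equations so the sketch needs no external libraries; the specific solver is an implementation choice, not prescribed by the text:

```python
def fit_arrangement_curve(refs):
    """Least-squares fit of relation (1), Y = C2*X**2 + C1*X + C0',
    through the reference targets (at least three (X, Y) pairs),
    solving the 3x3 normal equations by Gaussian elimination.
    Returns (C2, C1, C0')."""
    # Build normal equations A @ c = b for c = (C2, C1, C0')
    A = [[0.0] * 3 for _ in range(3)]
    b = [0.0] * 3
    for x, y in refs:
        basis = (x * x, x, 1.0)
        for i in range(3):
            b[i] += basis[i] * y
            for j in range(3):
                A[i][j] += basis[i] * basis[j]
    # Gaussian elimination with partial pivoting
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for j in range(col, 3):
                A[r][j] -= f * A[col][j]
            b[r] -= f * b[col]
    # Back substitution
    c = [0.0] * 3
    for i in range(2, -1, -1):
        c[i] = (b[i] - sum(A[i][j] * c[j] for j in range(i + 1, 3))) / A[i][i]
    return tuple(c)
```

With exactly three reference targets the fit passes through all of them; with more targets, as the text notes is preferable, the curve is the least-squares compromise.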
- The processing component 130 is further provided with a road edge estimation unit 133 for estimating the road edge curve based on the reference target arrangement curve.
- The road edge estimation unit 133 obtains the road edge curve by deriving the following quadratic relation (2) from relation (1):
- Y = C2·X² + C1·X + C0    (2)
- where C0 = C0' ± D, and D is a distance constant of the road edge relative to the substantially regularly arranged stationary targets beside it. For example, the distance constant D of the stationary targets beside the left road edge 901a (the trees 801, utility poles 802, isolation piers 803, etc.) relative to the left road edge 901a is estimated in advance. Generally, the distances between trees 801, utility poles 802, or isolation piers 803 and the road edge 901 follow corresponding specifications, and D can be estimated from these specified values (for example, 0.5 m).
- Once relation (2) is determined, the road edge curve is determined.
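Deriving relation (2) from relation (1) can be sketched as a constant-term shift: C2 and C1 carry over unchanged, and only the offset moves by the distance constant D. The D = 0.5 m default follows the example value in the text; the sign convention (subtract D for a left-side arrangement with positive Y to the left) is an assumption:

```python
def edge_curve_from_arrangement(c2, c1, c0_prime, d=0.5, side="left"):
    """Derive relation (2), the road edge curve, from the arrangement
    curve of relation (1).  C2 and C1 are shared; the constant term is
    shifted toward the road by the distance constant D.  D = 0.5 m is
    the example value from the text; the sign convention for `side`
    (minus for left, plus for right) is an assumption."""
    c0 = c0_prime - d if side == "left" else c0_prime + d
    return c2, c1, c0
```

For instance, an arrangement curve 3.0 m to the left of the vehicle centroid yields an estimated left road edge 2.5 m to the left under the assumed 0.5 m specification distance.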
- The determining device of the above embodiment can determine road edge information based on the stationary targets on both sides of the road. It is completely independent of lane lines and is therefore very suitable for obtaining road edge information on unstructured roads (for example, roads where lane lines are blurred, disappearing, or missing). It also does not depend on an image sensor, so the problem of being unable to accurately acquire road edge information at long range is overcome, and road edge information can be obtained relatively accurately at a long distance.
- The determining device of the above embodiment can be applied to a vehicle 100 with an automatic driving system; based on the road edge curve provided by the determining device, the system can give not only the near-end travelable area but also a relatively accurate far-end travelable area.
- An image sensor 120 may further be disposed in the determining device, installed, for example, at approximately the rear-view mirror position inside the vehicle.
- The image sensor 120 may specifically be a camera or the like, which can acquire lane line image information of the lane lines 901 of the road 900 in real time (if lane lines 901 are present).
- The image information acquired by the image sensor 120 is not limited to lane line image information; it may also include image information of, for example, vehicles ahead, pedestrians, and obstacles.
- The processing component 130 in the determining device further receives the lane line image information and calculates a road edge curve of the road 900 in real time based on it; performing image processing on lane line image information and calculating a road edge curve from it are well known in the art and will not be described in detail herein. The processing component 130 may thus obtain two road edge curves from the two mechanisms, and may determine the road edge curve of the road 900 from these two curves in different scenarios.
- When lane lines of the road 900 exist and there are stationary targets along the road 900 that are substantially regularly arranged relative to it, road edge curves from both mechanisms are available: for close-range sections, the road edge curve calculated from the lane line image information can be used; for long-range sections, the road edge curve calculated from the stationary targets can be used, thus overcoming the inaccuracy of the lane-line-based road edge curve in the long-range section.
- When the lane lines of the road 900 are missing or unclear while there are stationary targets along the road 900 that are substantially regularly arranged relative to it, the road edge curve can be calculated based on the stationary targets for the missing or unclear sections.
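The scenario-dependent choice between the two mechanisms can be sketched as a simple per-range selector. The 40 m camera range is an assumed effective detection distance for illustration, since the text deliberately does not fix "close range" to a number:

```python
def select_edge_curve(lane_curve, target_curve, x, camera_range=40.0):
    """Pick which mechanism's road edge curve to use at longitudinal
    distance x: within the image sensor's effective range, prefer the
    lane-line-based curve when one exists (lane_curve is None when lane
    lines are missing or unclear); beyond that range, or when lane lines
    are unavailable, fall back to the stationary-target-based curve.
    camera_range = 40 m is an assumed effective detection distance."""
    if lane_curve is not None and x <= camera_range:
        return lane_curve
    return target_curve
```

This mirrors the two scenarios above: with lane lines present, the image-based curve serves the close range and the radar-based curve serves the far range; with lane lines missing, the radar-based curve serves everywhere.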
- FIG. 3 is a flow chart of a method of determining a road edge in accordance with an embodiment of the present invention. The method is illustrated below in conjunction with FIGS. 1 through 3.
- In step S310, stationary targets beside the road edge of the road on which the vehicle is located are detected.
- This step S310 can be implemented in a radar detector 110 such as a millimeter wave radar.
- Stationary targets on at least one side of the road 900 on which the vehicle 100 is located (for example, the side of the left road edge 901a) are detected by the radar detector 110; in particular, objects at a long distance (for example, 40 meters or more) can be detected about as accurately as close objects.
- The millimeter wave radar is configured to determine stationary targets, i.e., objects that are stationary relative to the road edge 901 (for example, the trees 801, utility poles 802, and isolation piers 803), from among the detected objects based on the Doppler effect. The millimeter wave radar can therefore output information about stationary targets (for example, their coordinates in the vehicle coordinate system) in substantially real time.
- In step S320, reference targets are selected from the stationary targets.
- This step S320 is mainly implemented in the screening unit 131 of the processing component 130.
- At least three stationary targets are selected as reference targets from the many stationary targets transmitted by the radar detector 110.
- The screening principle is based on whether the stationary targets are substantially regularly arranged relative to the road 900.
- The road 900 may correspond to a predicted travel trajectory obtained from the current yaw rate of the vehicle 100 (which can be acquired, for example, from components such as the steering system of the vehicle 100); the trees 801, utility poles 802, isolation piers 803, and the like regularly arranged beside the left road edge 901a are determined to be substantially regularly arranged relative to the predicted travel trajectory and are thereby screened as reference targets.
- The more reference targets there are, the more accurately the road edge curve can be obtained.
- In step S330, curve fitting is performed on the reference targets to obtain a reference target arrangement curve.
- This step S330 is mainly implemented in the target curve fitting unit 132 of the processing component 130.
- The reference target arrangement curve to be obtained is defined in advance by a quadratic function, namely the following relation (1):
- Y = C2·X² + C1·X + C0'    (1)
- where X is the independent variable, corresponding to the X coordinate in the vehicle coordinate system, defined as the longitudinal distance from the vehicle's centroid;
- Y is the dependent variable, corresponding to the Y coordinate in the vehicle coordinate system, defined as the lateral offset from the vehicle's centroid;
- C2 is the quadratic coefficient,
- C1 is the linear (first-order) coefficient, and
- C0' is the constant term.
- The coordinates of the reference targets are substituted into the quadratic relation (1), and the coefficients of relation (1) are calculated.
- Alternatively, the turning radius of the vehicle may be calculated from the vehicle's current yaw rate, from which the quadratic coefficient C2 in relation (1) is computed; relation (1) then reduces to a linear relation in the remaining unknowns. The values of the linear coefficient C1 and the constant term C0' can then be calculated, again determining relation (1), that is, the reference target arrangement curve.
- In step S340, the road edge curve is estimated based on the reference target arrangement curve.
- This step S340 is mainly implemented in the road edge estimation unit 133 of the processing component 130.
- The road edge curve is obtained by deriving the following quadratic relation (2) from relation (1):
- Y = C2·X² + C1·X + C0    (2)
- where C0 = C0' ± D, and D is a distance constant of the road edge relative to the substantially regularly arranged stationary targets beside it. For example, the distance constant D of the stationary targets beside the left road edge 901a (the trees 801, utility poles 802, isolation piers 803, etc.) relative to the left road edge 901a is estimated in advance. Generally, the distances between trees 801, utility poles 802, or isolation piers 803 and the road edge 901 follow corresponding specifications, and D can be estimated from these specified values (for example, 0.5 m).
- Once relation (2) is determined, the road edge curve is determined.
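Steps S310 through S340 can be tied together in one end-to-end sketch. It uses the yaw-rate shortcut described above for C2 (taking C2 ≈ yaw_rate / (2·speed) is an assumed small-angle approximation of the turning-radius computation), so only C1 and C0' remain to be fitted linearly; the screening corridor and D = 0.5 m are illustrative values:

```python
def determine_road_edge(detections, ego_speed, yaw_rate, d=0.5):
    """End-to-end sketch of steps S310-S340 for the left road edge.

    `detections` are (X, Y) coordinates of already-identified stationary
    targets (the output of step S310).  Returns (C2, C1, C0) of
    relation (2), or None if fewer than three reference targets survive
    screening.  The corridor (1 m to 6 m left of the predicted
    trajectory) and D = 0.5 m are assumed illustrative values.
    """
    # C2 from the vehicle's turning behaviour (assumed approximation)
    c2 = yaw_rate / (2.0 * ego_speed) if ego_speed > 1e-3 else 0.0
    # S320: screen targets lying in the assumed left-roadside corridor
    refs = [(x, y) for x, y in detections if 1.0 <= y - c2 * x * x <= 6.0]
    if len(refs) < 3:
        return None
    # S330: linear least-squares fit of y - C2*x^2 = C1*x + C0'
    n = len(refs)
    sx = sum(x for x, _ in refs)
    sr = sum(y - c2 * x * x for x, y in refs)
    sxx = sum(x * x for x, _ in refs)
    sxr = sum(x * (y - c2 * x * x) for x, y in refs)
    c1 = (n * sxr - sx * sr) / (n * sxx - sx * sx)
    c0_prime = (sr - c1 * sx) / n
    # S340: shift the constant term toward the road by D (relation (2))
    return c2, c1, c0_prime - d
```

On a straight road with three roadside targets 3 m to the left, this yields a left road edge curve roughly 2.5 m to the left of the vehicle centroid under the assumed D.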
- The method for determining the road edge of the embodiment shown in FIG. 3 does not depend on lane lines or an image sensor; it is very suitable for obtaining road edge information on unstructured roads (for example, roads where lane lines are blurred, disappearing, or missing), and road edge information can also be obtained relatively accurately at a long distance.
- The terms "close range" and "long range" are defined relative to the effective detection distances of the image sensor and the radar detector, respectively. The effective detection distance of the radar detector is farther than that of the image sensor; therefore, in the present application, distances less than or equal to the effective detection distance of the image sensor are defined as "close range", and distances beyond the effective detection distance of the image sensor are defined as "long range". Hence "close range" and "long range" are not fixed distance values; the effective detection distances of different types of image sensors may differ, and, with the development of image sensor technology, the effective detection distance of image sensors emerging after the filing date of this application may be even greater.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Traffic Control Systems (AREA)
- Radar Systems Or Details Thereof (AREA)
- Image Analysis (AREA)
Abstract
A device and a method for determining a road edge, relating to the technical field of smart cars. The device for determining a road edge comprises: a radar detector (110), mounted on a vehicle (100), capable of detecting at least stationary targets beside a road edge of the road on which the vehicle (100) is located; and a processing component (130), configured to receive the stationary targets detected by the radar detector (110) and to extract arrangement information of the stationary targets that are arranged approximately regularly relative to the road, so as to obtain road edge information based on the arrangement information. The device and method are applicable to acquiring road edge information on an unstructured road and can be used to acquire road edge information relatively accurately at a distance.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201710196345.2 | 2017-03-29 | ||
| CN201710196345.2A CN106991389B (zh) | 2017-03-29 | 2017-03-29 | 确定道路边沿的装置和方法 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018177026A1 true WO2018177026A1 (fr) | 2018-10-04 |
Family
ID=59412995
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2018/075130 Ceased WO2018177026A1 (fr) | 2017-03-29 | 2018-02-02 | Dispositif et procédé de détermination d'un bord de route |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN106991389B (fr) |
| WO (1) | WO2018177026A1 (fr) |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112132109A (zh) * | 2020-10-10 | 2020-12-25 | 北京百度网讯科技有限公司 | 车道线处理和车道定位方法、装置、设备及存储介质 |
| CN113525368A (zh) * | 2021-06-23 | 2021-10-22 | 清华大学 | 车辆的车道保持紧急控制策略与安全控制方法及装置 |
| CN113762011A (zh) * | 2020-11-25 | 2021-12-07 | 北京京东乾石科技有限公司 | 路牙检测方法、装置、设备和存储介质 |
| CN113879312A (zh) * | 2021-11-01 | 2022-01-04 | 无锡威孚高科技集团股份有限公司 | 基于多传感器融合的前向目标选择方法、装置和存储介质 |
| CN114067120A (zh) * | 2022-01-17 | 2022-02-18 | 腾讯科技(深圳)有限公司 | 基于增强现实的导航铺路方法、装置、计算机可读介质 |
| CN114475614A (zh) * | 2022-03-21 | 2022-05-13 | 中国第一汽车股份有限公司 | 一种危险目标的筛选方法、装置、介质及设备 |
| CN114872712A (zh) * | 2022-06-29 | 2022-08-09 | 小米汽车科技有限公司 | 静态车辆检测方法、装置、设备、车辆及存储介质 |
Families Citing this family (18)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106991389B (zh) * | 2017-03-29 | 2021-04-27 | 蔚来(安徽)控股有限公司 | 确定道路边沿的装置和方法 |
| CN109895694B (zh) * | 2017-12-08 | 2020-10-20 | 郑州宇通客车股份有限公司 | 一种车道偏离预警方法、装置及车辆 |
| CN108572642B (zh) * | 2017-12-15 | 2022-02-18 | 蔚来(安徽)控股有限公司 | 一种自动驾驶系统及其横向控制方法 |
| CN108573272B (zh) * | 2017-12-15 | 2021-10-29 | 蔚来(安徽)控股有限公司 | 车道拟合方法 |
| CN108693517B (zh) * | 2018-05-22 | 2020-10-09 | 森思泰克河北科技有限公司 | 车辆定位方法、装置和雷达 |
| US11035943B2 (en) * | 2018-07-19 | 2021-06-15 | Aptiv Technologies Limited | Radar based tracking of slow moving objects |
| CN109254289B (zh) * | 2018-11-01 | 2021-07-06 | 百度在线网络技术(北京)有限公司 | 道路护栏的检测方法和检测设备 |
| CN110174113B (zh) * | 2019-04-28 | 2023-05-16 | 福瑞泰克智能系统有限公司 | 一种车辆行驶车道的定位方法、装置及终端 |
| CN110244696A (zh) * | 2019-06-24 | 2019-09-17 | 北京经纬恒润科技有限公司 | 车身横向控制方法及电子控制单元ecu |
| CN110320504B (zh) * | 2019-07-29 | 2021-05-18 | 浙江大学 | 一种基于激光雷达点云统计几何模型的非结构化道路检测方法 |
| WO2021062581A1 (fr) * | 2019-09-30 | 2021-04-08 | 深圳市大疆创新科技有限公司 | Procédé et appareil de reconnaissance de marquage routier |
| CN111198370B (zh) * | 2020-01-02 | 2022-07-08 | 北京百度网讯科技有限公司 | 毫米波雷达背景检测方法、装置、电子设备及存储介质 |
| CN111289980B (zh) * | 2020-03-06 | 2022-03-08 | 成都纳雷科技有限公司 | 基于车载毫米波雷达的路边静止物的检测方法及系统 |
| CN112597839B (zh) * | 2020-12-14 | 2022-07-08 | 上海宏景智驾信息科技有限公司 | 基于车载毫米波雷达的道路边界检测方法 |
| CN112949489B (zh) * | 2021-03-01 | 2023-05-12 | 成都安智杰科技有限公司 | 一种道路边界识别方法、装置、电子设备和存储介质 |
| CN113167886B (zh) * | 2021-03-02 | 2022-05-31 | 华为技术有限公司 | 目标检测方法和装置 |
| CN116067316B (zh) * | 2023-01-17 | 2025-08-19 | 武汉理工大学 | 一种Frenet坐标多源数据的坐标统一车辆位置估测系统及方法 |
| CN118212232B (zh) * | 2024-05-20 | 2024-08-02 | 安徽蔚来智驾科技有限公司 | 线性元素的检测方法、智能设备及计算机可读存储介质 |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104108392A (zh) * | 2013-04-11 | 2014-10-22 | 株式会社万都 | 车道估计装置和方法 |
| CN105404844A (zh) * | 2014-09-12 | 2016-03-16 | 广州汽车集团股份有限公司 | 一种基于多线激光雷达的道路边界检测方法 |
| CN105922991A (zh) * | 2016-05-27 | 2016-09-07 | 广州大学 | 基于生成虚拟车道线的车道偏离预警方法及系统 |
| CN106463064A (zh) * | 2014-06-19 | 2017-02-22 | 日立汽车系统株式会社 | 物体识别装置和使用该物体识别装置的车辆行驶控制装置 |
| CN106991389A (zh) * | 2017-03-29 | 2017-07-28 | 蔚来汽车有限公司 | 确定道路边沿的装置和方法 |
Family Cites Families (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6728392B1 (en) * | 2001-01-30 | 2004-04-27 | Navigation Technologies Corp. | Shape comparison using a rotational variation metric and applications thereof |
| JP2008164831A (ja) * | 2006-12-27 | 2008-07-17 | Aisin Aw Co Ltd | 地図情報生成システム |
| CN102275587B (zh) * | 2011-06-07 | 2015-12-09 | 长安大学 | 一种后方车辆碰撞危险性监测装置及其监测方法 |
| CN104002809B (zh) * | 2014-05-28 | 2016-08-24 | 长安大学 | 一种车辆岔口路段检测装置及检测方法 |
| CN106476689A (zh) * | 2015-08-27 | 2017-03-08 | 长城汽车股份有限公司 | 一种用于车辆的道路限宽提醒设备和方法 |
| CN105711588B (zh) * | 2016-01-20 | 2018-05-11 | 奇瑞汽车股份有限公司 | 一种车道保持辅助系统和车道保持辅助方法 |
| CN106326850A (zh) * | 2016-08-18 | 2017-01-11 | 宁波傲视智绘光电科技有限公司 | 快速车道线检测方法 |
- 2017-03-29: CN CN201710196345.2A patent/CN106991389B/zh active Active
- 2018-02-02: WO PCT/CN2018/075130 patent/WO2018177026A1/fr not_active Ceased
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN104108392A (zh) * | 2013-04-11 | 2014-10-22 | 株式会社万都 | 车道估计装置和方法 |
| CN106463064A (zh) * | 2014-06-19 | 2017-02-22 | 日立汽车系统株式会社 | 物体识别装置和使用该物体识别装置的车辆行驶控制装置 |
| CN105404844A (zh) * | 2014-09-12 | 2016-03-16 | 广州汽车集团股份有限公司 | 一种基于多线激光雷达的道路边界检测方法 |
| CN105922991A (zh) * | 2016-05-27 | 2016-09-07 | 广州大学 | 基于生成虚拟车道线的车道偏离预警方法及系统 |
| CN106991389A (zh) * | 2017-03-29 | 2017-07-28 | 蔚来汽车有限公司 | 确定道路边沿的装置和方法 |
Cited By (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112132109A (zh) * | 2020-10-10 | 2020-12-25 | 北京百度网讯科技有限公司 | 车道线处理和车道定位方法、装置、设备及存储介质 |
| CN112132109B (zh) * | 2020-10-10 | 2024-09-06 | 阿波罗智联(北京)科技有限公司 | 车道线处理和车道定位方法、装置、设备及存储介质 |
| CN113762011A (zh) * | 2020-11-25 | 2021-12-07 | 北京京东乾石科技有限公司 | 路牙检测方法、装置、设备和存储介质 |
| CN113525368A (zh) * | 2021-06-23 | 2021-10-22 | 清华大学 | 车辆的车道保持紧急控制策略与安全控制方法及装置 |
| CN113879312A (zh) * | 2021-11-01 | 2022-01-04 | 无锡威孚高科技集团股份有限公司 | 基于多传感器融合的前向目标选择方法、装置和存储介质 |
| CN113879312B (zh) * | 2021-11-01 | 2023-02-28 | 无锡威孚高科技集团股份有限公司 | 基于多传感器融合的前向目标选择方法、装置和存储介质 |
| CN114067120A (zh) * | 2022-01-17 | 2022-02-18 | 腾讯科技(深圳)有限公司 | 基于增强现实的导航铺路方法、装置、计算机可读介质 |
| CN114475614A (zh) * | 2022-03-21 | 2022-05-13 | 中国第一汽车股份有限公司 | 一种危险目标的筛选方法、装置、介质及设备 |
| CN114872712A (zh) * | 2022-06-29 | 2022-08-09 | 小米汽车科技有限公司 | 静态车辆检测方法、装置、设备、车辆及存储介质 |
Also Published As
| Publication number | Publication date |
|---|---|
| CN106991389B (zh) | 2021-04-27 |
| CN106991389A (zh) | 2017-07-28 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2018177026A1 (fr) | Dispositif et procédé de détermination d'un bord de route | |
| EP3007099B1 (fr) | Système de reconnaissance d'image pour véhicule et procédé correspondant | |
| US9083856B2 (en) | Vehicle speed measurement method and system utilizing a single image capturing unit | |
| CN109212531B (zh) | 确定目标车辆取向的方法 | |
| CN107646114B (zh) | 用于估计车道的方法 | |
| CN110705458B (zh) | 边界检测方法及装置 | |
| CN103176185B (zh) | 用于检测道路障碍物的方法及系统 | |
| EP3910533B1 (fr) | Procédé, appareil, dispositif électronique et support de stockage pour la surveillance d'un dispositif d'acquisition d'images | |
| RU2764708C1 (ru) | Способы и системы для обработки данных лидарных датчиков | |
| JP6450294B2 (ja) | 物体検出装置、物体検出方法、及びプログラム | |
| WO2022067647A1 (fr) | Procédé et appareil pour déterminer des éléments de chaussée | |
| US11151729B2 (en) | Mobile entity position estimation device and position estimation method | |
| WO2020029706A1 (fr) | Procédé et appareil d'élimination de ligne de voie fictive | |
| CN108280840B (zh) | 一种基于三维激光雷达的道路实时分割方法 | |
| CN106681353A (zh) | 基于双目视觉与光流融合的无人机避障方法及系统 | |
| CN109101939B (zh) | 车辆运动状态的确定方法、系统、终端及可读存储介质 | |
| Shunsuke et al. | GNSS/INS/on-board camera integration for vehicle self-localization in urban canyon | |
| Sehestedt et al. | Robust lane detection in urban environments | |
| CN114155511A (zh) | 一种用于自动驾驶汽车在公共道路的环境信息采集方法 | |
| CN114972427A (zh) | 一种基于单目视觉的目标跟踪方法、终端设备及存储介质 | |
| CN115902839A (zh) | 港口激光雷达标定方法及装置、存储介质及电子设备 | |
| Hussain et al. | Multiple objects tracking using radar for autonomous driving | |
| US20250029401A1 (en) | Image processing device | |
| Wang et al. | Road edge detection based on improved RANSAC and 2D LIDAR Data | |
| CN108416305B (zh) | 连续型道路分割物的位姿估计方法、装置及终端 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18778319 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 18778319 Country of ref document: EP Kind code of ref document: A1 |