
WO2018177026A1 - Apparatus and method for determining a road edge - Google Patents

Apparatus and method for determining a road edge

Info

Publication number
WO2018177026A1
WO2018177026A1 · PCT/CN2018/075130 · CN2018075130W
Authority
WO
WIPO (PCT)
Prior art keywords
road
road edge
vehicle
curve
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2018/075130
Other languages
English (en)
French (fr)
Inventor
胡传远
付晶玮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NIO Nextev Ltd
Original Assignee
NIO Nextev Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NIO Nextev Ltd filed Critical NIO Nextev Ltd
Publication of WO2018177026A1 publication Critical patent/WO2018177026A1/zh
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road

Definitions

  • The invention belongs to the technical field of intelligent automobiles and relates to an apparatus and a method for determining a road edge based on stationary targets beside the road edge.
  • Autonomous driving (including driver assistance) is an important direction in the development of smart cars, and automatic driving systems are being applied in more and more vehicles to realize automatic driving functions.
  • Generally, an automatic driving system needs to determine the travelable area of the vehicle at any time. In determining the travelable area, an important aspect is determining the road edge of the road currently being traveled.
  • At present, an automatic driving system usually determines the road edge from images containing lane lines collected by an image sensor (for example, a camera mounted on the vehicle), the road edge being determined by image processing of the lane lines in the images acquired in real time.
  • This technique of determining the road edge has at least one of the following problems:
  • First, it must rely on the lane lines of the road; for roads where the lane lines are blurred, partially missing, or completely absent, it is difficult to determine the road edge, or the determined road edge deviates largely from the real road edge.
  • Second, the technique is implemented based on an image sensor, and in practice the amount of information carried by the image sensor differs between the close-range and long-range parts of the image.
  • Generally, in terms of the actual physical distance represented between two pixels in the image, the distance near the center of the image sensor lens is smaller than in the boundary region of the lens, which easily leads to poor recognition of lane lines at long distances. That is, the determination or detection of the road edge at a long distance (relative to the vehicle) is inaccurate.
  • To solve at least one aspect of the above technical problems or other technical problems, the present invention provides the following technical solutions.
  • An apparatus for determining a road edge, comprising:
  • a radar detector mounted on the vehicle, which is capable of detecting at least stationary targets beside the road edge of the road on which the vehicle is located; and
  • a processing component configured to: receive the stationary targets detected by the radar detector and extract arrangement information of the stationary targets that are substantially regularly arranged relative to the road, thereby obtaining road edge information based on the arrangement information.
  • A method for determining a road edge, comprising the steps of: (a) detecting stationary targets beside the road edge of the road on which the vehicle is located; and (b) extracting arrangement information of the stationary targets that are substantially regularly arranged relative to the road, and obtaining road edge information based on the arrangement information.
  • A vehicle is provided with an automatic driving system in which any of the above-described apparatuses for determining a road edge is provided.
  • Figure 1 is a block diagram showing the construction of a device for determining a road edge in accordance with an embodiment of the present invention.
  • FIG. 2 is a schematic diagram of an application scenario of the apparatus of the embodiment shown in FIG. 1 when determining a road edge.
  • FIG. 3 is a flow chart of a method of determining a road edge in accordance with an embodiment of the present invention.
  • FIG. 1 is a schematic view showing the structure of a device for determining a road edge according to an embodiment of the present invention.
  • FIG. 2 is a schematic view showing an application scenario of the device of the embodiment shown in FIG. 1 when determining a road edge.
  • The device of the embodiment of the present invention and its working principle are exemplified below with reference to FIG. 1 and FIG. 2.
  • A device for determining a road edge (hereinafter simply referred to as the "determining device") is mounted on the vehicle 100. The specific type of the vehicle 100 is not limited; relative to the determining device, the vehicle 100 is the host vehicle of the determining device.
  • The determining device can be applied to an automatic driving system installed in the vehicle 100.
  • The vehicle 100 is traveling on a road 900 having corresponding road edges 901a and 901b, where 901a is the left road edge and 901b is the right road edge.
  • The road edges 901a and 901b are not clearly identified by lane lines, or, in the illustrated section of road 900, no corresponding lane lines exist to identify the road edges.
  • There are various stationary (stationary relative to the road) objects on both sides of the road 900, which are the targets the determining device detects and are therefore also referred to as "stationary targets". By way of example, stationary targets beside the left road edge 901a of the road 900 are shown, for example trees 801, utility poles 802, and isolation piers 803 (three isolation piers 803a, 803b, and 803c are shown). It should be understood that the stationary targets beside the road edge are not limited to the object types of this example; they may also be, for example, fences, sign poles, and the like.
  • The determining device primarily includes a radar detector 110 mounted on the vehicle 100 that is capable of detecting stationary targets on at least one side of the road 900 on which the vehicle 100 is located.
  • In one embodiment, the radar detector 110 is a millimeter-wave radar mounted at the front end of the vehicle 100, which is capable of detecting various objects ahead in the road plane within a 90° detection angle range, including, for example, the stationary targets beside the road edge 901a shown in FIG. 2.
  • During detection, the radar detector 110 emits electromagnetic waves of a certain wavelength and receives reflections from objects ahead, so the positions of various objects can be detected; in particular, objects at a long distance (for example, 40 meters or more) can be detected about as accurately as close objects (relative to the image sensor 120), so the radar detector has better long-range detection characteristics than the image sensor 120.
  • A vehicle coordinate system, that is, an XY coordinate system, may be defined in advance in the determining device, where the center of mass of the vehicle 100 is the origin O, the X axis is defined as the forward longitudinal direction of the vehicle 100, the X coordinate is defined as the offset of the distance from the vehicle's center of mass in the longitudinal direction, the Y axis is defined as the lateral direction of the vehicle 100, and the Y coordinate is defined as the offset of the distance from the vehicle's center of mass in the lateral direction.
  • When the radar detector 110 detects various objects (including stationary objects), the coordinates (X, Y) of an object are substantially determined, where the X coordinate represents the offset of the distance of the object from the centroid of the vehicle 100 in the longitudinal direction in the vehicle coordinate system (i.e., the offset along the X axis), and the Y coordinate represents the offset of the distance of the object from the centroid of the vehicle 100 in the lateral direction (i.e., the offset along the Y axis).
  • The millimeter-wave radar is configured to determine stationary targets, that is, objects that are stationary relative to the road edge 901, from among the various detected objects based on the Doppler effect and the speed of the host vehicle. Therefore, the millimeter-wave radar can output information about the stationary targets (for example, their coordinates in the vehicle coordinate system) substantially in real time.
  • When a millimeter-wave radar is used, the radar detector 110 has the advantages of relatively low cost and accurate detection of stationary targets at long distances (for example, 40 meters or more). It should be understood, however, that the radar detector 110 is not limited to a millimeter-wave radar; it may also be, for example, a lidar, which detects various stationary targets (including distant stationary targets) relatively more accurately, but is relatively expensive and requires more data-processing capability from the subsequent processing component 130.
  • The determining device further includes a processing component 130, which is also disposed on the vehicle 100; it may be implemented by a processing device of the automatic driving system on the vehicle 100, or by a processor set up independently of the automatic driving system.
  • The processing component 130 can run the algorithm code stored therein and execute instructions from the automatic driving system or the vehicle; the specific hardware implementation of the processing component 130 is known and will not be described in detail herein.
  • The processing component 130 mainly performs data processing on the stationary-target information transmitted by the radar detector 110 to obtain a road edge curve. It is configured to: receive the stationary targets detected by the radar detector 110 and extract arrangement information of the stationary targets that are substantially regularly arranged relative to the road, thereby obtaining road edge information based on that arrangement information. The road edge information may be expressed as road edge curve information.
  • The specific working principle of the processing component 130 is exemplified below by taking the derivation of the road edge curve of the left road edge 901a shown in FIG. 2 as an example.
  • The number of stationary targets transmitted by the radar detector 110 after one scan may be on the order of several tens. Therefore, a corresponding screening unit 131 is provided in the processing component 130, which is able to select at least three stationary targets as reference targets from among the many stationary targets transmitted by the radar detector 110.
  • The trees 801, utility poles 802, and isolation piers 803 that are substantially regularly arranged relative to the road 900 can be selected from the many stationary targets as reference targets for the left road edge 901a, whereas stationary targets such as other trees and utility poles that are not regularly arranged relative to the left road edge 901a may not be selected as reference targets, or may be filtered out.
  • Both sides of the road 900 will generally have objects that are substantially regularly arranged relative to the road 900, such as the trees 801 and utility poles 802. When determining whether a stationary target is substantially regularly arranged relative to the road 900, a predicted travel trajectory, which roughly corresponds to the current road curve, may be obtained based on the current yaw rate of the vehicle 100 (which can be acquired, for example, from components such as the steering system of the vehicle 100); whether a stationary target is substantially regularly arranged relative to the road 900 can then be judged roughly by whether it is substantially regularly arranged relative to the predicted travel trajectory, so that the corresponding stationary targets can be screened out as reference targets.
  • The regularly arranged trees 801, utility poles 802, isolation piers 803, and the like beside the left road edge 901a are determined by the screening unit 131 to be substantially regularly arranged relative to the predicted travel trajectory; therefore, at least three of them are used as reference targets, for example, three or more trees 801, or the isolation piers 803a, 803b, and 803c, or several trees 801 together with one utility pole 802 and one isolation pier 803a.
  • The processing component 130 is provided with a target curve fitting unit 132 configured to curve-fit three or more reference targets in the vehicle coordinate system to obtain a corresponding reference target arrangement curve.
  • The reference target arrangement curve to be obtained is defined in advance by a quadratic function, namely the following relation (1):
  • Y = C2×X² + C1×X + C0′     (1)
  • where X is the independent variable, corresponding to the X coordinate in the vehicle coordinate system, the X coordinate being defined as the offset of the distance from the vehicle's center of mass in the longitudinal direction; Y is the dependent variable, corresponding to the Y coordinate in the vehicle coordinate system, the Y coordinate being defined as the offset of the distance from the vehicle's center of mass in the lateral direction; C2 is the quadratic coefficient, C1 is the linear coefficient, and C0′ is the constant term.
  • The coordinates of the plurality of reference targets are substituted into the quadratic relation (1), and the values of the quadratic coefficient C2, the linear coefficient C1, and the constant term C0′ in relation (1) are calculated, thereby obtaining relation (1), that is, determining the reference target arrangement curve.
  • Alternatively, the turning radius of the vehicle may also be calculated from the current yaw rate of the vehicle, from which the quadratic coefficient C2 in relation (1) is obtained; relation (1) is then reduced to a linear relation, and based on the coordinates of the plurality of reference targets, the values of the linear coefficient C1 and the constant term C0′ can be further calculated, thereby obtaining relation (1), that is, determining the reference target arrangement curve.
  • The processing component 130 is provided with a road edge estimation unit 133 for estimating a road edge curve based on the reference target arrangement curve.
  • The road edge estimation unit 133 obtains the corresponding road edge curve by deriving the following quadratic relation (2) from the quadratic relation (1):
  • Y = C2×X² + C1×X + C0     (2)
  • where C0 is the constant term, C0 = C0′ + D, and D is the distance constant of the road edge relative to the stationary targets substantially regularly arranged beside it. For example, the distance constant D of the stationary targets beside the left road edge 901a (the trees 801, utility poles 802, isolation piers 803, etc.) relative to the left road edge 901a is estimated in advance; generally, the distances of trees 801, utility poles 802, and isolation piers 803 from the road edge 901 are each subject to corresponding specifications, and the distance constant D (for example, 0.5 m) can be estimated based on these specified values.
  • Thus, the quadratic relation (2) is determined, that is, the road edge curve is determined.
  • The determining device of the above embodiment can determine road edge information based on the stationary targets on both sides of the road, entirely independently of lane lines; it is therefore very suitable for obtaining road edge information on unstructured roads (for example, roads where the lane lines are blurred, disappearing, or missing). Moreover, it does not depend on an image sensor, so it is free of the problem that road edge information cannot be obtained accurately for distant road sections, and road edge information can be obtained relatively accurately at long distances as well.
  • The determining device of the above embodiment can be applied to a vehicle 100 having an automatic driving system; based on the road edge curve provided by the determining device, the automatic driving system can give not only the near-end travelable area but also a relatively accurate far-end travelable area.
  • An image sensor 120 may further be provided in the determining device; it may be mounted, for example, at approximately the rear-view mirror position inside the vehicle.
  • The image sensor 120 may specifically be a camera or the like, which can acquire in real time the lane line image information of the lane lines 901 of the road 900 (if lane lines 901 are present).
  • In practice, the image information acquired by the image sensor 120 is not limited to lane line image information; it also includes, for example, image information about vehicles ahead, pedestrians, obstacles, and so on.
  • The processing component 130 in the determining device further receives the lane line image information and also calculates a road edge curve of the road 900 in real time based on the lane line image information; performing image processing on the lane line image information and calculating a road edge curve from it are well known in the art and will not be described in detail herein. The processing component 130 may thus obtain two road edge curves produced by the two mechanisms, and it may determine the road edge curve of the road 900 from these two curves in different scenarios.
  • In one scenario, the lane lines of the road 900 exist and there are stationary targets along the road 900 that are substantially regularly arranged relative to the road 900. Based on the above two mechanisms or two road edge curves, for the close-range part of the road the road edge curve calculated from the lane line image information can be used, while for the long-range part of the road the road edge curve calculated from the stationary targets is used, thereby overcoming the problem that the road edge curve calculated from lane line image information is inaccurate in the long-range segment.
  • In another scenario, the lane lines of the road 900 are missing or unclear in some sections, and there are stationary targets along the road 900 that are substantially regularly arranged relative to the road 900; for the sections where the lane lines are missing or unclear, the road edge curve calculated from the stationary targets can be used.
  • FIG. 3 is a flow chart of a method of determining a road edge in accordance with an embodiment of the present invention. The method is illustrated below in conjunction with FIGS. 1 through 3.
  • In step S310, stationary targets beside the road edge of the road on which the vehicle is located are detected.
  • This step S310 can be implemented in a radar detector 110 such as a millimeter-wave radar.
  • The radar detector 110 detects stationary targets on at least one side of the road 900 on which the vehicle 100 is located (for example, the side of the left road edge 901a); in particular, objects at a long distance (for example, 40 meters or more) can be detected about as accurately as close objects.
  • The millimeter-wave radar is configured to determine stationary targets, that is, objects that are stationary relative to the road edge 901, including, for example, the trees 801, utility poles 802, and isolation piers 803, from among the various detected objects based on the Doppler effect; therefore, the millimeter-wave radar can output information about the stationary targets (for example, their coordinates in the vehicle coordinate system) substantially in real time.
  • In step S320, reference targets are selected from the stationary targets.
  • This step S320 is mainly implemented in the screening unit 131 of the processing component 130.
  • In this step, at least three stationary targets are selected as reference targets from among the many stationary targets transmitted by the radar detector 110.
  • The screening principle is whether a stationary target is substantially regularly arranged relative to the road 900.
  • Specifically, the road 900 may be represented by a predicted travel trajectory, which may be obtained based on the current yaw rate of the vehicle 100 (acquired, for example, from components such as the steering system of the vehicle 100); the regularly arranged trees 801, utility poles 802, isolation piers 803, and the like beside the left road edge 901a are determined to be substantially regularly arranged relative to the predicted travel trajectory and are thus screened out as reference targets.
  • The larger the number of reference targets, the more favorable it is for subsequently obtaining the road edge curve accurately.
  • In step S330, curve fitting is performed on the reference targets to obtain a reference target arrangement curve.
  • This step S330 is mainly implemented in the target curve fitting unit 132 of the processing component 130.
  • In one embodiment, the reference target arrangement curve to be obtained is defined in advance by a quadratic function, namely the following relation (1):
  • Y = C2×X² + C1×X + C0′     (1)
  • where X is the independent variable, corresponding to the X coordinate in the vehicle coordinate system, the X coordinate being defined as the offset of the distance from the vehicle's center of mass in the longitudinal direction; Y is the dependent variable, corresponding to the Y coordinate in the vehicle coordinate system, the Y coordinate being defined as the offset of the distance from the vehicle's center of mass in the lateral direction; C2 is the quadratic coefficient, C1 is the linear coefficient, and C0′ is the constant term.
  • The coordinates of the plurality of reference targets are substituted into the quadratic relation (1), and the values of the quadratic coefficient C2, the linear coefficient C1, and the constant term C0′ in relation (1) are calculated, thereby obtaining relation (1), that is, determining the reference target arrangement curve.
  • Alternatively, the turning radius of the vehicle may also be calculated from the current yaw rate of the vehicle, from which the quadratic coefficient C2 in relation (1) is obtained; relation (1) is then reduced to a linear relation, and based on the coordinates of the plurality of reference targets, the values of the linear coefficient C1 and the constant term C0′ can be further calculated, thereby obtaining relation (1), that is, determining the reference target arrangement curve.
  • In step S340, the road edge curve is estimated based on the reference target arrangement curve.
  • This step S340 is mainly implemented in the road edge estimation unit 133 of the processing component 130.
  • In one embodiment, the corresponding road edge curve is obtained by deriving the following quadratic relation (2) from the quadratic relation (1):
  • Y = C2×X² + C1×X + C0     (2)
  • where C0 is the constant term, C0 = C0′ + D, and D is the distance constant of the road edge relative to the stationary targets substantially regularly arranged beside it. For example, the distance constant D of the stationary targets beside the left road edge 901a (the trees 801, utility poles 802, isolation piers 803, etc.) relative to the left road edge 901a is estimated in advance; generally, the distances of trees 801, utility poles 802, and isolation piers 803 from the road edge 901 are each subject to corresponding specifications, and the distance constant D (for example, 0.5 m) can be estimated based on these specified values.
  • Thus, the quadratic relation (2) is determined, that is, the road edge curve is determined.
  • The method for determining the road edge of the embodiment shown in FIG. 3 depends neither on lane lines nor on an image sensor; it is therefore very suitable for obtaining road edge information on unstructured roads (for example, roads where the lane lines are blurred, disappearing, or missing), and road edge information can also be obtained relatively accurately at long distances.
  • Herein, the terms "short distance" and "long distance" roughly correspond to the effective detection range of the image sensor and the effective detection range of the radar detector, respectively. Generally, the effective detection range of the radar detector is farther than that of the image sensor; therefore, the range of distances less than or equal to the effective detection range of the image sensor is defined as the "short distance" of this application, and the range of distances beyond the effective detection range of the image sensor is defined as the "long distance" of this application.
  • It should be understood that "short distance" and "long distance" are not divided by a fixed distance value; for example, the effective detection ranges of different models of image sensor may differ, and, as image sensor technology develops, the effective detection range of image sensors emerging after the filing date of this application may also be farther.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Image Analysis (AREA)

Abstract

An apparatus and a method for determining a road edge, belonging to the technical field of intelligent automobiles. The apparatus for determining a road edge comprises: a radar detector (110) mounted on a vehicle (100), which is capable of detecting at least stationary targets beside the road edge of the road on which the vehicle (100) is located; and a processing component (130) configured to: receive the stationary targets detected by the radar detector (110) and extract arrangement information of the stationary targets that are substantially regularly arranged relative to the road, thereby obtaining road edge information based on the arrangement information. The apparatus and method are very suitable for obtaining road edge information on unstructured roads, and road edge information can also be obtained relatively accurately at long distances.

Description

Apparatus and method for determining a road edge

Technical field
The present invention belongs to the technical field of intelligent automobiles and relates to an apparatus and a method for determining a road edge based on stationary targets beside the road edge.
Background art
Autonomous driving (including driver assistance) is an important direction in the development of smart cars, and automatic driving systems are being applied in more and more vehicles to realize automatic driving functions. Generally, an automatic driving system needs to determine the travelable area of the vehicle at any time; in determining the travelable area, an important aspect is determining the road edge of the road currently being traveled.
At present, an automatic driving system usually determines the road edge from images containing lane lines collected by an image sensor (for example, a camera mounted on the vehicle), the road edge being determined by image processing of the lane lines in the images acquired in real time. This technique of determining the road edge has at least one of the following problems:
First, it must rely on the lane lines of the road; for roads where the lane lines are blurred, partially missing, or completely absent, it is difficult to determine the road edge, or the determined road edge deviates largely from the real road edge.
Second, this technique of determining the road edge is implemented based on an image sensor; in practice, however, the amount of information carried by the image sensor differs between the close-range and long-range parts of the image. Generally, in terms of the actual physical distance represented between two pixels in the image, the distance near the center of the image sensor lens is smaller than in the boundary region of the lens, which easily leads to poor recognition of lane lines at long distances; that is, the determination or detection of the road edge at a long distance (relative to the vehicle) is inaccurate.
Summary of the invention
To solve at least one aspect of the above technical problems or other technical problems, the present invention provides the following technical solutions.
According to one aspect of the present invention, an apparatus for determining a road edge is provided, comprising:
a radar detector mounted on the vehicle, which is capable of detecting at least stationary targets beside the road edge of the road on which the vehicle is located; and
a processing component configured to: receive the stationary targets detected by the radar detector and extract arrangement information of the stationary targets that are substantially regularly arranged relative to the road, thereby obtaining road edge information based on the arrangement information.
According to another aspect of the present invention, a method for determining a road edge is provided, characterized by comprising the steps of:
(a) detecting stationary targets beside the road edge of the road on which the vehicle is located; and
(b) extracting arrangement information of the stationary targets that are substantially regularly arranged relative to the road, and obtaining road edge information based on the arrangement information.
According to still another aspect of the present invention, a vehicle provided with an automatic driving system is provided, in which any of the above-described apparatuses for determining a road edge is provided.
The above features and operation of the present invention will become more apparent from the following description and the accompanying drawings.
Brief description of the drawings
The above and other objects and advantages of the present invention will become more complete and clear from the following detailed description taken in conjunction with the accompanying drawings, in which identical or similar elements are denoted by identical reference numerals.
FIG. 1 is a schematic structural view of an apparatus for determining a road edge according to an embodiment of the present invention.
FIG. 2 is a schematic view of an application scenario of the apparatus of the embodiment shown in FIG. 1 when determining a road edge.
FIG. 3 is a flow chart of a method for determining a road edge according to an embodiment of the present invention.
Detailed description of embodiments
The present invention will now be described more fully with reference to the accompanying drawings, in which exemplary embodiments of the invention are shown. The invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the concept of the invention to those skilled in the art. In the drawings, identical reference numerals refer to identical elements or components, and their description will therefore be omitted.
Some of the blocks shown in the drawings are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
FIG. 1 is a schematic structural view of an apparatus for determining a road edge according to an embodiment of the present invention, and FIG. 2 is a schematic view of an application scenario of the apparatus of the embodiment shown in FIG. 1 when determining a road edge. The apparatus of the embodiment of the present invention and its working principle are exemplified below with reference to FIG. 1 and FIG. 2.
As shown in FIG. 1, the apparatus for determining a road edge (hereinafter simply referred to as the "determining device") is mounted on a vehicle 100. The specific type of the vehicle 100 is not limited; relative to the determining device, the vehicle 100 is the host vehicle of the determining device. The determining device can be applied to an automatic driving system installed in the vehicle 100.
Taking FIG. 2 as an example, the vehicle 100 is traveling on a road 900 having corresponding road edges 901a and 901b, where 901a is the left road edge and 901b is the right road edge. In this application scenario, the road edges 901a and 901b are not clearly identified by lane lines, or, in the illustrated section of the road 900, no corresponding lane lines exist to identify the road edges. There are various stationary (stationary relative to the road) objects on both sides of the road 900, which are the targets detected by the determining device and are therefore also referred to as "stationary targets". By way of example, stationary targets beside the left road edge 901a of the road 900 are shown, for example trees 801, utility poles 802, and isolation piers 803 (three isolation piers 803a, 803b, and 803c are shown). It should be understood that the stationary targets beside the road edge are not limited to the object types of this example; they may also be, for example, fences, sign poles, and the like.
The determining device primarily includes a radar detector 110 mounted on the vehicle 100, which is capable of detecting stationary targets on at least one side of the road 900 on which the vehicle 100 is located. In one embodiment, the radar detector 110 is a millimeter-wave radar mounted at the front end of the vehicle 100, which is capable of detecting various objects ahead in the road plane within a 90° detection angle range, including, for example, the stationary targets beside the road edge 901a shown in FIG. 2. During detection, the radar detector 110 emits electromagnetic waves of a certain wavelength and receives reflections from objects ahead, so the positions of various objects can be detected; in particular, objects at a long distance (for example, 40 meters or more) can be detected about as accurately as close objects (relative to the image sensor 120), so the radar detector has better long-range detection characteristics than the image sensor 120.
It should be noted that a vehicle coordinate system, i.e., an XY coordinate system, may be defined in advance in the determining device, where the center of mass of the vehicle 100 is the origin O, the X axis is defined as the forward longitudinal direction of the vehicle 100, the X coordinate is defined as the offset of the distance from the vehicle's center of mass in the longitudinal direction, the Y axis is defined as the lateral direction of the vehicle 100, and the Y coordinate is defined as the offset of the distance from the vehicle's center of mass in the lateral direction. When the radar detector 110 detects various objects (including stationary objects), the coordinates (X, Y) of an object are substantially determined, where the X coordinate represents the offset of the distance of the object from the centroid of the vehicle 100 in the longitudinal direction in the vehicle coordinate system (i.e., the offset along the X axis), and the Y coordinate represents the offset of the distance of the object from the centroid of the vehicle 100 in the lateral direction (i.e., the offset along the Y axis).
The millimeter-wave radar is configured to determine stationary targets, i.e., objects that are stationary relative to the road edge 901, from among the various detected objects based on the Doppler effect and the speed of the host vehicle. Therefore, the millimeter-wave radar can output information about the stationary targets (for example, their coordinates in the vehicle coordinate system) substantially in real time.
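To make this step concrete, the following is a minimal sketch (not the patent's implementation) of how detections from a forward-looking millimeter-wave radar could be flagged as stationary and converted into the vehicle XY coordinate system described above. It assumes the radar reports, per detection, a range, an azimuth angle, and a radial (Doppler) velocity, and that the host vehicle speed is available; these inputs, the sign convention, the tolerance value, and all function and field names are illustrative assumptions rather than part of the patent.

```python
import math
from dataclasses import dataclass

@dataclass
class Detection:
    range_m: float      # radial distance to the object (m)
    azimuth_rad: float  # angle from the vehicle's X axis, left positive (rad)
    v_radial: float     # measured radial (Doppler) velocity, negative when closing (m/s)

def stationary_targets(detections, host_speed, tol=0.5):
    """Return (X, Y) vehicle-frame coordinates of detections that appear stationary.

    For an object that is stationary relative to the road, the radial velocity
    seen by a forward-moving radar is roughly -host_speed * cos(azimuth);
    detections whose measured Doppler velocity matches that expectation within
    `tol` m/s are kept as stationary targets.
    """
    targets = []
    for d in detections:
        expected = -host_speed * math.cos(d.azimuth_rad)
        if abs(d.v_radial - expected) <= tol:
            x = d.range_m * math.cos(d.azimuth_rad)  # longitudinal offset (X axis)
            y = d.range_m * math.sin(d.azimuth_rad)  # lateral offset (Y axis)
            targets.append((x, y))
    return targets
```

Applied to a scene such as FIG. 2, this would yield the list of stationary-target coordinates (trees, poles, piers) that the screening unit 131 then works on.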
When a millimeter-wave radar is used, the radar detector 110 has the advantages of relatively low cost and accurate detection of stationary targets at long distances (for example, 40 meters or more). It should be understood, however, that the radar detector 110 is not limited to a millimeter-wave radar; it may also be, for example, a lidar, which detects various stationary targets (including distant stationary targets) relatively more accurately, but is relatively expensive and places higher demands on the data-processing capability of the subsequent processing component 130.
Continuing with FIG. 1, the determining device further includes a processing component 130, which is also disposed on the vehicle 100; specifically, it may be implemented by a processing device of the automatic driving system on the vehicle 100, or by a processor set up independently of the automatic driving system. The processing component 130 can run the algorithm code stored therein and execute instructions from the automatic driving system or the vehicle; the specific hardware implementation of the processing component 130 is known and will not be described in detail herein.
The processing component 130 mainly performs data processing on the stationary-target information transmitted by the radar detector 110 to obtain a road edge curve. It is configured to: receive the stationary targets detected by the radar detector 110 and extract arrangement information of the stationary targets that are substantially regularly arranged relative to the road, thereby obtaining road edge information based on that arrangement information. The road edge information may specifically be expressed as road edge curve information. The specific working principle of the processing component 130 is exemplified below by taking the derivation of the road edge curve of the left road edge 901a shown in FIG. 2 as an example.
In one embodiment, the number of stationary targets transmitted by the radar detector 110 after one scan may be on the order of several tens. Therefore, a corresponding screening unit 131 is provided in the processing component 130, which selects at least three stationary targets as reference targets from among the many stationary targets transmitted by the radar detector 110. As shown in FIG. 2, the trees 801, utility poles 802, and isolation piers 803 that are substantially regularly arranged relative to the road 900 can be selected from the many stationary targets as reference targets for the left road edge 901a, whereas stationary targets such as other trees and utility poles that are not regularly arranged relative to the left road edge 901a may not be selected as reference targets, or may be filtered out.
The applicant has noted that both sides of the road 900 will generally have objects that are substantially regularly arranged relative to the road 900, such as the trees 801 and utility poles 802. When determining whether a stationary target is substantially regularly arranged relative to the road 900, a predicted travel trajectory may be obtained based on the current yaw rate of the vehicle 100 (which can be acquired, for example, from components such as the steering system of the vehicle 100); this predicted travel trajectory roughly corresponds to the current road curve, so whether a stationary target is substantially regularly arranged relative to the road 900 can be judged roughly by whether it is substantially regularly arranged relative to the predicted travel trajectory, and the corresponding stationary targets can thus be screened out as reference targets. It should be understood that "substantially" in "substantially regularly arranged" reflects that the stationary targets on both sides of the road 900 are not necessarily aligned strictly according to some rule relative to the road; for example, there may be tolerances on the order of several meters in how neatly they are arranged relative to the road.
As shown in FIG. 2, the regularly arranged trees 801, utility poles 802, isolation piers 803, and the like beside the left road edge 901a are determined by the screening unit 131 to be substantially regularly arranged relative to the predicted travel trajectory; therefore, at least three of them are used as reference targets, for example, three or more trees 801, or the isolation piers 803a, 803b, and 803c, or several trees 801 together with one utility pole 802 and one isolation pier 803a. The larger the number of reference targets, the more favorable it is for subsequently obtaining the road edge curve accurately.
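One way to picture this screening step is sketched below: stationary targets are filtered by their lateral offset from a predicted travel trajectory derived from the host yaw rate and speed, using a circular-arc approximation of the path. The arc model, the lateral-offset band, the spread tolerance, and the minimum count of three are illustrative assumptions consistent with the text, not a prescribed implementation.

```python
def predicted_lateral_offset(x, yaw_rate, speed):
    """Approximate lateral offset of the predicted travel trajectory at
    longitudinal distance x, modelling the path as a circular arc with
    curvature kappa = yaw_rate / speed (small-angle approximation)."""
    if speed <= 0.1:                 # avoid dividing by ~zero at standstill
        return 0.0
    kappa = yaw_rate / speed
    return 0.5 * kappa * x * x       # y ≈ x² / (2R) for a gentle arc

def screen_reference_targets(stationary_xy, yaw_rate, speed,
                             side_band=(1.0, 8.0), spread_tol=2.0):
    """Keep stationary targets lying in a roughly constant lateral band to one
    side of the predicted trajectory; they are treated as 'substantially
    regularly arranged' if their offsets vary by no more than `spread_tol`."""
    candidates, offsets = [], []
    for (x, y) in stationary_xy:
        d = y - predicted_lateral_offset(x, yaw_rate, speed)  # offset from path
        if side_band[0] <= d <= side_band[1]:                 # plausibly roadside
            candidates.append((x, y))
            offsets.append(d)
    if len(candidates) >= 3 and (max(offsets) - min(offsets)) <= spread_tol:
        return candidates            # use them as reference targets
    return []                        # not regular enough; skip this cycle
```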
In one embodiment, the processing component 130 is provided with a target curve fitting unit 132 configured to curve-fit three or more reference targets in the vehicle coordinate system to obtain a corresponding reference target arrangement curve. Specifically, the reference target arrangement curve to be obtained is defined in advance by a quadratic function, namely the following relation (1):
Y = C2×X² + C1×X + C0′     (1)
where X is the independent variable, corresponding to the X coordinate in the vehicle coordinate system, the X coordinate being defined as the offset of the distance from the vehicle's center of mass in the longitudinal direction; Y is the dependent variable, corresponding to the Y coordinate in the vehicle coordinate system, the Y coordinate being defined as the offset of the distance from the vehicle's center of mass in the lateral direction; C2 is the quadratic coefficient, C1 is the linear coefficient, and C0′ is the constant term.
Further, the coordinates of the plurality of reference targets are substituted into the quadratic relation (1), and the values of the quadratic coefficient C2, the linear coefficient C1, and the constant term C0′ in relation (1) are calculated, thereby obtaining relation (1), that is, determining the reference target arrangement curve.
In a further embodiment, the turning radius of the vehicle may also be calculated from the current yaw rate of the vehicle, from which the quadratic coefficient C2 in relation (1) is obtained; relation (1) is then reduced to a linear relation, and based on the coordinates of the plurality of reference targets, the values of the linear coefficient C1 and the constant term C0′ can be further calculated, thereby obtaining relation (1), that is, determining the reference target arrangement curve.
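The following sketch shows one way the fit of relation (1) could be carried out with ordinary least squares, including the variant in which C2 is fixed in advance from the turning radius (for a circular arc the curvature is roughly 1/R ≈ yaw rate / speed, so C2 ≈ 1/(2R)). The use of NumPy, the curvature-to-C2 conversion, and the function names are assumptions for illustration only, not the patent's specified procedure.

```python
import numpy as np

def fit_reference_curve(points):
    """Least-squares fit of Y = C2*X^2 + C1*X + C0' to the reference targets.
    `points` is a list of (X, Y) coordinates; returns (C2, C1, C0')."""
    x = np.array([p[0] for p in points])
    y = np.array([p[1] for p in points])
    c2, c1, c0p = np.polyfit(x, y, 2)          # needs at least 3 reference targets
    return c2, c1, c0p

def fit_reference_curve_fixed_c2(points, yaw_rate, speed):
    """Variant: fix C2 from the turning radius (C2 ≈ yaw_rate / (2*speed)),
    reducing relation (1) to a linear fit for C1 and C0'."""
    c2 = yaw_rate / (2.0 * speed)
    x = np.array([p[0] for p in points])
    y = np.array([p[1] for p in points]) - c2 * x**2   # remove the fixed quadratic part
    a = np.vstack([x, np.ones_like(x)]).T
    c1, c0p = np.linalg.lstsq(a, y, rcond=None)[0]
    return c2, c1, c0p
```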
In one embodiment, the processing component 130 is provided with a road edge estimation unit 133 for estimating the road edge curve based on the reference target arrangement curve. In one embodiment, the road edge estimation unit 133 obtains the corresponding road edge curve by deriving the following quadratic relation (2) from the quadratic relation (1):
Y = C2×X² + C1×X + C0     (2)
where C0 is the constant term, C0 = C0′ + D, and D is the distance constant of the road edge relative to the stationary targets substantially regularly arranged beside it. For example, the distance constant D of the stationary targets beside the left road edge 901a (the trees 801, utility poles 802, isolation piers 803, etc.) relative to the left road edge 901a is estimated in advance; generally, the distances of the trees 801, utility poles 802, and isolation piers 803 from the road edge 901 are each subject to corresponding specifications, and the distance constant D (for example, 0.5 m) can be estimated based on these specified values.
In this way, the quadratic relation (2) is determined, that is, the road edge curve is determined.
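As a small worked example (with made-up numbers, purely illustrative): if the fit of relation (1) for the left road edge gives C2 = 0.001 m⁻¹, C1 = 0.02 and C0′ = 2.5 m, and D is taken as 0.5 m, then by C0 = C0′ + D the road edge curve (2) is Y = 0.001×X² + 0.02×X + 3.0, which gives a lateral offset of about 3.0 m at the vehicle (X = 0) and about 0.001×40² + 0.02×40 + 3.0 = 5.4 m at X = 40 m.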
It should be noted that although the above examples determine the reference target arrangement curve and the road edge curve using a quadratic relation, they may also be determined based on higher-order relations (for example, cubic or quartic relations); of course, the higher the order of the relation, the larger the number of reference targets required.
The determining device of the above embodiment can determine road edge information based on the stationary targets on both sides of the road, entirely independently of lane lines; it is therefore very suitable for obtaining road edge information on unstructured roads (for example, roads where the lane lines are blurred, disappearing, or missing). Moreover, it does not depend on an image sensor, so it is free of the problem that road edge information cannot be obtained accurately for distant road sections, and road edge information can be obtained relatively accurately at long distances as well.
The determining device of the above embodiment can be applied to a vehicle 100 having an automatic driving system; based on the road edge curve provided by the determining device, the automatic driving system can give not only the near-end travelable area but also a relatively accurate far-end travelable area, the algorithm for determining the travelable area from the road edge curve not being limited herein.
Continuing with FIG. 1, in a further embodiment, an image sensor 120 may also be provided in the determining device; it may be mounted, for example, at approximately the rear-view mirror position inside the vehicle. The image sensor 120 may specifically be a camera or the like, which can acquire in real time the lane line image information of the lane lines 901 of the road 900 (if lane lines 901 are present). Of course, in practice, the image information acquired by the image sensor 120 is not limited to lane line image information; it also includes, for example, image information about vehicles ahead, pedestrians, obstacles, and so on.
In a further embodiment, the processing component 130 in the determining device also receives the above lane line image information and additionally calculates a road edge curve of the road 900 in real time based on the lane line image information; performing image processing on the lane line image information and calculating a road edge curve from it are well known in the art and will not be described in detail herein. The processing component 130 may thus obtain two road edge curves produced by the two mechanisms, and it may determine the road edge curve of the road 900 from these two curves in different scenarios.
For example, in one scenario, the lane lines of the road 900 exist and there are stationary targets along the road 900 that are substantially regularly arranged relative to the road 900. Based on the above two mechanisms or two road edge curves, for the close-range part of the road the road edge curve calculated from the lane line image information can be used, while for the long-range part of the road the road edge curve calculated from the stationary targets is used; this overcomes the problem that the road edge curve calculated from lane line image information is inaccurate in the long-range segment.
For example, in another scenario, the lane lines of the road 900 are missing or unclear in some sections, and there are stationary targets along the road 900 that are substantially regularly arranged relative to the road 900; for the sections where the lane lines are missing or unclear, the road edge curve calculated from the stationary targets can be used, thereby overcoming the problem that the road edge curve cannot be obtained, or cannot be obtained accurately, from the image sensor 120 in those sections.
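A minimal sketch of how the two curves might be combined across the two scenarios above is given below; representing each curve as a (C2, C1, C0) triple, taking the switch-over distance as the image sensor's effective detection range, and signalling a missing lane-line curve with None are all illustrative assumptions rather than the patent's specified logic.

```python
def road_edge_at(x, curve):
    """Evaluate a road edge curve given as (C2, C1, C0) at longitudinal distance x."""
    c2, c1, c0 = curve
    return c2 * x * x + c1 * x + c0

def fused_road_edge(x, lane_curve, radar_curve, camera_range=40.0):
    """Scenario 1: use the lane-line curve for the close range and the
    radar-based curve beyond the image sensor's effective detection range.
    Scenario 2: if the lane-line curve is unavailable (lane lines missing or
    unclear), fall back to the radar-based curve everywhere."""
    if lane_curve is not None and x <= camera_range:
        return road_edge_at(x, lane_curve)
    if radar_curve is not None:
        return road_edge_at(x, radar_curve)
    return None   # neither mechanism produced a curve at this distance
```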
FIG. 3 is a flow chart of a method for determining a road edge according to an embodiment of the present invention. The method of the embodiment of the present invention is illustrated below in conjunction with FIGS. 1 to 3.
First, in step S310, stationary targets beside the road edge of the road on which the vehicle is located are detected.
This step S310 can be implemented in a radar detector 110 such as a millimeter-wave radar. The radar detector 110 detects stationary targets on at least one side of the road 900 on which the vehicle 100 is located (for example, the side of the left road edge 901a); in particular, objects at a long distance (for example, 40 meters or more) can be detected about as accurately as close objects. The millimeter-wave radar is configured to determine stationary targets, i.e., objects that are stationary relative to the road edge 901, including, for example, the trees 801, utility poles 802, and isolation piers 803, from among the various detected objects based on the Doppler effect; therefore, the millimeter-wave radar can output information about the stationary targets (for example, their coordinates in the vehicle coordinate system) substantially in real time.
Next, in step S320, reference targets are selected from the stationary targets.
This step S320 is mainly implemented in the screening unit 131 of the processing component 130. In this step, at least three stationary targets are selected as reference targets from among the many stationary targets transmitted by the radar detector 110. The screening principle is whether a stationary target is substantially regularly arranged relative to the road 900; specifically, the road 900 may be represented by a predicted travel trajectory, which may be obtained based on the current yaw rate of the vehicle 100 (acquired, for example, from components such as the steering system of the vehicle 100); the regularly arranged trees 801, utility poles 802, isolation piers 803, and the like beside the left road edge 901a are determined to be substantially regularly arranged relative to the predicted travel trajectory and are thus screened out as reference targets. The larger the number of reference targets, the more favorable it is for subsequently obtaining the road edge curve accurately.
Next, in step S330, curve fitting is performed on the reference targets to obtain a reference target arrangement curve.
This step S330 is mainly implemented in the target curve fitting unit 132 of the processing component 130. In one embodiment, the reference target arrangement curve to be obtained is defined in advance by a quadratic function, namely the following relation (1):
Y = C2×X² + C1×X + C0′     (1)
where X is the independent variable, corresponding to the X coordinate in the vehicle coordinate system, the X coordinate being defined as the offset of the distance from the vehicle's center of mass in the longitudinal direction; Y is the dependent variable, corresponding to the Y coordinate in the vehicle coordinate system, the Y coordinate being defined as the offset of the distance from the vehicle's center of mass in the lateral direction; C2 is the quadratic coefficient, C1 is the linear coefficient, and C0′ is the constant term.
Further, the coordinates of the plurality of reference targets (for example, the coordinates of the trees 801, utility poles 802, and isolation piers 803 in the vehicle coordinate system) are substituted into the quadratic relation (1), and the values of the quadratic coefficient C2, the linear coefficient C1, and the constant term C0′ in relation (1) are calculated, thereby obtaining relation (1), that is, determining the reference target arrangement curve.
In a further embodiment, the turning radius of the vehicle may also be calculated from the current yaw rate of the vehicle, from which the quadratic coefficient C2 in relation (1) is obtained; relation (1) is then reduced to a linear relation, and based on the coordinates of the plurality of reference targets, the values of the linear coefficient C1 and the constant term C0′ can be further calculated, thereby obtaining relation (1), that is, determining the reference target arrangement curve.
Next, in step S340, the road edge curve is estimated based on the reference target arrangement curve.
This step S340 is mainly implemented in the road edge estimation unit 133 of the processing component 130. In one embodiment, the corresponding road edge curve is obtained by deriving the following quadratic relation (2) from the quadratic relation (1):
Y = C2×X² + C1×X + C0     (2)
where C0 is the constant term, C0 = C0′ + D, and D is the distance constant of the road edge relative to the stationary targets substantially regularly arranged beside it. For example, the distance constant D of the stationary targets beside the left road edge 901a (the trees 801, utility poles 802, isolation piers 803, etc.) relative to the left road edge 901a is estimated in advance; generally, the distances of the trees 801, utility poles 802, and isolation piers 803 from the road edge 901 are each subject to corresponding specifications, and the distance constant D (for example, 0.5 m) can be estimated based on these specified values.
In this way, the quadratic relation (2) is determined, that is, the road edge curve is determined.
The method for determining a road edge of the embodiment shown in FIG. 3 depends neither on lane lines nor on an image sensor; it is therefore very suitable for obtaining road edge information on unstructured roads (for example, roads where the lane lines are blurred, disappearing, or missing), and road edge information can also be obtained relatively accurately at long distances.
Herein, the terms "short distance" and "long distance" roughly correspond, respectively, to the effective detection range of the image sensor and the effective detection range of the radar detector. Generally, the effective detection range of the radar detector is farther than that of the image sensor; therefore, the range of distances less than or equal to the effective detection range of the image sensor is defined as the "short distance" of this application, and the range of distances beyond the effective detection range of the image sensor is defined as the "long distance" of this application. It should be understood that "short distance" and "long distance" are not divided by a fixed distance value; for example, the effective detection ranges of different models of image sensor may differ, and, as image sensor technology develops, the effective detection range of image sensors emerging after the filing date of this application may also be farther.
The above examples mainly illustrate the apparatus and method for determining a road edge of the present invention. Although only some of the embodiments of the present invention have been described, those of ordinary skill in the art will appreciate that the present invention can be implemented in many other forms without departing from its spirit and scope. Therefore, the examples and embodiments shown are to be regarded as illustrative rather than restrictive, and the present invention may cover various modifications and substitutions without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (17)

  1. An apparatus for determining a road edge, characterized by comprising:
    a radar detector mounted on a vehicle, which is capable of detecting at least stationary targets beside the road edge of the road on which the vehicle is located; and
    a processing component configured to: receive the stationary targets detected by the radar detector and extract arrangement information of the stationary targets that are substantially regularly arranged relative to the road, thereby obtaining road edge information based on the arrangement information.
  2. The apparatus according to claim 1, characterized in that the processing component comprises:
    a screening unit for screening out, from the stationary targets, three or more stationary targets that are substantially regularly arranged relative to the road as reference targets;
    a target curve fitting unit for curve-fitting the three or more reference targets in a vehicle coordinate system to obtain a corresponding reference target arrangement curve; and
    a road edge estimation unit for estimating a first road edge curve based on the reference target arrangement curve.
  3. The apparatus according to claim 2, characterized in that the screening unit is configured to: calculate a predicted travel trajectory of the vehicle based on the current yaw rate of the vehicle, and further screen out the corresponding stationary targets as reference targets based on whether the stationary targets are substantially regularly arranged relative to the predicted travel trajectory.
  4. The apparatus according to claim 2, characterized in that the reference target arrangement curve is the following quadratic relation (1):
    Y = C2×X² + C1×X + C0′     (1)
    where X is the independent variable, corresponding to the X coordinate in the vehicle coordinate system, the X coordinate being defined as the offset of the distance from the vehicle's center of mass in the longitudinal direction; Y is the dependent variable, corresponding to the Y coordinate in the vehicle coordinate system, the Y coordinate being defined as the offset of the distance from the vehicle's center of mass in the lateral direction; C2 is the quadratic coefficient, C1 is the linear coefficient, and C0′ is the constant term;
    wherein the quadratic coefficient C2, the linear coefficient C1, and the constant term C0′ in the quadratic relation (1) are calculated based on the coordinate information of the reference targets in the vehicle coordinate system;
    wherein the turning radius of the vehicle is calculated from the current yaw rate of the vehicle, thereby obtaining the quadratic coefficient C2;
    and wherein the road edge estimation unit is further configured to: derive the following quadratic relation (2) from the quadratic relation (1) as the first road edge curve:
    Y = C2×X² + C1×X + C0     (2)
    where C0 is the constant term, C0 = C0′ + D, and D is the distance constant of the road edge relative to the stationary targets substantially regularly arranged beside it.
  5. The apparatus according to claim 1 or 2, characterized in that the radar detector is a millimeter-wave radar.
  6. The apparatus according to claim 2, characterized in that the apparatus further comprises: an image sensor mounted on the vehicle for acquiring lane line image information of the road;
    wherein the processing component is further configured to: calculate a second road edge curve of the road based on the lane line image information, and determine the road edge curve of the road based on the first road edge curve and the second road edge curve.
  7. The apparatus according to claim 6, characterized in that the processing component is further configured to: use the second road edge curve as the road edge curve for the close-range part of the road, and use the first road edge curve as the road edge curve for the long-range part of the road.
  8. The apparatus according to claim 6, characterized in that the processing component is further configured to: use the first road edge curve as the road edge curve for sections of the road where the lane lines are missing or unclear.
  9. A method for determining a road edge, characterized by comprising the steps of:
    (a) detecting stationary targets beside the road edge of the road on which the vehicle is located; and
    (b) extracting arrangement information of the stationary targets that are substantially regularly arranged relative to the road, and obtaining road edge information based on the arrangement information.
  10. The method according to claim 9, characterized in that step (b) comprises the sub-steps of:
    (b1) screening out, from the stationary targets, three or more stationary targets that are substantially regularly arranged relative to the road as reference targets;
    (b2) curve-fitting the three or more reference targets in a vehicle coordinate system to obtain a corresponding reference target arrangement curve; and
    (b3) estimating a first road edge curve based on the reference target arrangement curve.
  11. The method according to claim 10, characterized in that in sub-step (b1), a predicted travel trajectory of the vehicle is calculated based on the current yaw rate of the vehicle, and the corresponding stationary targets are further screened out as reference targets based on whether the stationary targets are substantially regularly arranged relative to the predicted travel trajectory.
  12. The method according to claim 10, characterized in that in step (b2), the reference target arrangement curve is the following quadratic relation (1):
    Y = C2×X² + C1×X + C0′     (1)
    where X is the independent variable, corresponding to the X coordinate in the vehicle coordinate system, the X coordinate being defined as the offset of the distance from the vehicle's center of mass in the longitudinal direction; Y is the dependent variable, corresponding to the Y coordinate in the vehicle coordinate system, the Y coordinate being defined as the offset of the distance from the vehicle's center of mass in the lateral direction; C2 is the quadratic coefficient, C1 is the linear coefficient, and C0′ is the constant term;
    wherein the quadratic coefficient C2, the linear coefficient C1, and the constant term C0′ in the quadratic relation (1) are calculated based on the coordinate information of the reference targets in the vehicle coordinate system;
    wherein the turning radius of the vehicle is calculated based on the current yaw rate of the vehicle, thereby obtaining the quadratic coefficient C2;
    and in step (b3), the following quadratic relation (2) is derived from the quadratic relation (1) as the first road edge curve:
    Y = C2×X² + C1×X + C0     (2)
    where C0 is the constant term, C0 = C0′ + D, and D is the distance constant of the road edge relative to the stationary targets substantially regularly arranged beside it.
  13. The method according to claim 9, characterized by further comprising the steps of:
    acquiring lane line image information of the road;
    calculating a second road edge curve of the road based on the lane line image information; and
    determining the road edge curve of the road based on the first road edge curve and the second road edge curve.
  14. The method according to claim 13, characterized in that in the step of determining the road edge curve of the road, the second road edge curve is used as the road edge curve for the close-range part of the road, and the first road edge curve is used as the road edge curve for the long-range part of the road.
  15. The method according to claim 14, characterized in that in the step of determining the road edge curve of the road, the first road edge curve is used as the road edge curve for sections of the road where the lane lines are missing or unclear.
  16. An automatic driving system for a vehicle, comprising the apparatus for determining a road edge according to any one of claims 1 to 8.
  17. A vehicle provided with an automatic driving system, characterized in that the automatic driving system is provided with the apparatus for determining a road edge according to any one of claims 1 to 8.
PCT/CN2018/075130 2017-03-29 2018-02-02 Apparatus and method for determining a road edge Ceased WO2018177026A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710196345.2A CN106991389B (zh) 2017-03-29 2017-03-29 确定道路边沿的装置和方法
CN201710196345.2 2017-03-29

Publications (1)

Publication Number Publication Date
WO2018177026A1 true WO2018177026A1 (zh) 2018-10-04

Family

ID=59412995

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/075130 Ceased WO2018177026A1 (zh) 2017-03-29 2018-02-02 确定道路边沿的装置和方法

Country Status (2)

Country Link
CN (1) CN106991389B (zh)
WO (1) WO2018177026A1 (zh)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112132109A (zh) * 2020-10-10 2020-12-25 北京百度网讯科技有限公司 车道线处理和车道定位方法、装置、设备及存储介质
CN113525368A (zh) * 2021-06-23 2021-10-22 清华大学 车辆的车道保持紧急控制策略与安全控制方法及装置
CN113762011A (zh) * 2020-11-25 2021-12-07 北京京东乾石科技有限公司 路牙检测方法、装置、设备和存储介质
CN113879312A (zh) * 2021-11-01 2022-01-04 无锡威孚高科技集团股份有限公司 基于多传感器融合的前向目标选择方法、装置和存储介质
CN114067120A (zh) * 2022-01-17 2022-02-18 腾讯科技(深圳)有限公司 基于增强现实的导航铺路方法、装置、计算机可读介质
CN114475614A (zh) * 2022-03-21 2022-05-13 中国第一汽车股份有限公司 一种危险目标的筛选方法、装置、介质及设备
CN114872712A (zh) * 2022-06-29 2022-08-09 小米汽车科技有限公司 静态车辆检测方法、装置、设备、车辆及存储介质

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106991389B (zh) * 2017-03-29 2021-04-27 蔚来(安徽)控股有限公司 确定道路边沿的装置和方法
CN109895694B (zh) * 2017-12-08 2020-10-20 郑州宇通客车股份有限公司 一种车道偏离预警方法、装置及车辆
CN108572642B (zh) * 2017-12-15 2022-02-18 蔚来(安徽)控股有限公司 一种自动驾驶系统及其横向控制方法
CN108573272B (zh) * 2017-12-15 2021-10-29 蔚来(安徽)控股有限公司 车道拟合方法
CN108693517B (zh) * 2018-05-22 2020-10-09 森思泰克河北科技有限公司 车辆定位方法、装置和雷达
US11035943B2 (en) * 2018-07-19 2021-06-15 Aptiv Technologies Limited Radar based tracking of slow moving objects
CN109254289B (zh) * 2018-11-01 2021-07-06 百度在线网络技术(北京)有限公司 道路护栏的检测方法和检测设备
CN110174113B (zh) * 2019-04-28 2023-05-16 福瑞泰克智能系统有限公司 一种车辆行驶车道的定位方法、装置及终端
CN110244696A (zh) * 2019-06-24 2019-09-17 北京经纬恒润科技有限公司 车身横向控制方法及电子控制单元ecu
CN110320504B (zh) * 2019-07-29 2021-05-18 浙江大学 一种基于激光雷达点云统计几何模型的非结构化道路检测方法
WO2021062581A1 (zh) * 2019-09-30 2021-04-08 深圳市大疆创新科技有限公司 路面标识识别方法及装置
CN111198370B (zh) * 2020-01-02 2022-07-08 北京百度网讯科技有限公司 毫米波雷达背景检测方法、装置、电子设备及存储介质
CN111289980B (zh) * 2020-03-06 2022-03-08 成都纳雷科技有限公司 基于车载毫米波雷达的路边静止物的检测方法及系统
CN112597839B (zh) * 2020-12-14 2022-07-08 上海宏景智驾信息科技有限公司 基于车载毫米波雷达的道路边界检测方法
CN112949489B (zh) * 2021-03-01 2023-05-12 成都安智杰科技有限公司 一种道路边界识别方法、装置、电子设备和存储介质
CN113167886B (zh) * 2021-03-02 2022-05-31 华为技术有限公司 目标检测方法和装置
CN116067316B (zh) * 2023-01-17 2025-08-19 武汉理工大学 一种Frenet坐标多源数据的坐标统一车辆位置估测系统及方法
CN118212232B (zh) * 2024-05-20 2024-08-02 安徽蔚来智驾科技有限公司 线性元素的检测方法、智能设备及计算机可读存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104108392A (zh) * 2013-04-11 2014-10-22 株式会社万都 车道估计装置和方法
CN105404844A (zh) * 2014-09-12 2016-03-16 广州汽车集团股份有限公司 一种基于多线激光雷达的道路边界检测方法
CN105922991A (zh) * 2016-05-27 2016-09-07 广州大学 基于生成虚拟车道线的车道偏离预警方法及系统
CN106463064A (zh) * 2014-06-19 2017-02-22 日立汽车系统株式会社 物体识别装置和使用该物体识别装置的车辆行驶控制装置
CN106991389A (zh) * 2017-03-29 2017-07-28 蔚来汽车有限公司 确定道路边沿的装置和方法

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6728392B1 (en) * 2001-01-30 2004-04-27 Navigation Technologies Corp. Shape comparison using a rotational variation metric and applications thereof
JP2008164831A (ja) * 2006-12-27 2008-07-17 Aisin Aw Co Ltd 地図情報生成システム
CN102275587B (zh) * 2011-06-07 2015-12-09 长安大学 一种后方车辆碰撞危险性监测装置及其监测方法
CN104002809B (zh) * 2014-05-28 2016-08-24 长安大学 一种车辆岔口路段检测装置及检测方法
CN106476689A (zh) * 2015-08-27 2017-03-08 长城汽车股份有限公司 一种用于车辆的道路限宽提醒设备和方法
CN105711588B (zh) * 2016-01-20 2018-05-11 奇瑞汽车股份有限公司 一种车道保持辅助系统和车道保持辅助方法
CN106326850A (zh) * 2016-08-18 2017-01-11 宁波傲视智绘光电科技有限公司 快速车道线检测方法

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104108392A (zh) * 2013-04-11 2014-10-22 株式会社万都 车道估计装置和方法
CN106463064A (zh) * 2014-06-19 2017-02-22 日立汽车系统株式会社 物体识别装置和使用该物体识别装置的车辆行驶控制装置
CN105404844A (zh) * 2014-09-12 2016-03-16 广州汽车集团股份有限公司 一种基于多线激光雷达的道路边界检测方法
CN105922991A (zh) * 2016-05-27 2016-09-07 广州大学 基于生成虚拟车道线的车道偏离预警方法及系统
CN106991389A (zh) * 2017-03-29 2017-07-28 蔚来汽车有限公司 确定道路边沿的装置和方法

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112132109A (zh) * 2020-10-10 2020-12-25 北京百度网讯科技有限公司 车道线处理和车道定位方法、装置、设备及存储介质
CN112132109B (zh) * 2020-10-10 2024-09-06 阿波罗智联(北京)科技有限公司 车道线处理和车道定位方法、装置、设备及存储介质
CN113762011A (zh) * 2020-11-25 2021-12-07 北京京东乾石科技有限公司 路牙检测方法、装置、设备和存储介质
CN113525368A (zh) * 2021-06-23 2021-10-22 清华大学 车辆的车道保持紧急控制策略与安全控制方法及装置
CN113879312A (zh) * 2021-11-01 2022-01-04 无锡威孚高科技集团股份有限公司 基于多传感器融合的前向目标选择方法、装置和存储介质
CN113879312B (zh) * 2021-11-01 2023-02-28 无锡威孚高科技集团股份有限公司 基于多传感器融合的前向目标选择方法、装置和存储介质
CN114067120A (zh) * 2022-01-17 2022-02-18 腾讯科技(深圳)有限公司 基于增强现实的导航铺路方法、装置、计算机可读介质
CN114475614A (zh) * 2022-03-21 2022-05-13 中国第一汽车股份有限公司 一种危险目标的筛选方法、装置、介质及设备
CN114872712A (zh) * 2022-06-29 2022-08-09 小米汽车科技有限公司 静态车辆检测方法、装置、设备、车辆及存储介质

Also Published As

Publication number Publication date
CN106991389B (zh) 2021-04-27
CN106991389A (zh) 2017-07-28

Similar Documents

Publication Publication Date Title
WO2018177026A1 (zh) 确定道路边沿的装置和方法
EP3007099B1 (en) Image recognition system for a vehicle and corresponding method
US9083856B2 (en) Vehicle speed measurement method and system utilizing a single image capturing unit
CN109212531B (zh) 确定目标车辆取向的方法
CN107646114B (zh) 用于估计车道的方法
CN110705458B (zh) 边界检测方法及装置
CN103176185B (zh) 用于检测道路障碍物的方法及系统
EP3910533B1 (en) Method, apparatus, electronic device, and storage medium for monitoring an image acquisition device
RU2764708C1 (ru) Способы и системы для обработки данных лидарных датчиков
JP6450294B2 (ja) 物体検出装置、物体検出方法、及びプログラム
WO2022067647A1 (zh) 一种路面要素确定方法及装置
WO2020029706A1 (zh) 一种伪车道线剔除方法及装置
CN108280840B (zh) 一种基于三维激光雷达的道路实时分割方法
CN106681353A (zh) 基于双目视觉与光流融合的无人机避障方法及系统
US20200279380A1 (en) Mobile entity position estimation device and position estimation method
CN109101939B (zh) 车辆运动状态的确定方法、系统、终端及可读存储介质
Shunsuke et al. GNSS/INS/on-board camera integration for vehicle self-localization in urban canyon
Sehestedt et al. Robust lane detection in urban environments
CN114155511A (zh) 一种用于自动驾驶汽车在公共道路的环境信息采集方法
CN115902839A (zh) 港口激光雷达标定方法及装置、存储介质及电子设备
Hussain et al. Multiple objects tracking using radar for autonomous driving
US20250029401A1 (en) Image processing device
KR102831462B1 (ko) 차량의 객체 검출 장치 및 방법
Wang et al. Road edge detection based on improved RANSAC and 2D LIDAR Data
CN108416305B (zh) 连续型道路分割物的位姿估计方法、装置及终端

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18778319

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18778319

Country of ref document: EP

Kind code of ref document: A1