WO2018036892A1 - Driving assistance method and system for vehicle
- Publication number
- WO2018036892A1 (PCT/EP2017/070810)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- road
- data
- calculating
- sides
- driving assistance
- Prior art date
- Legal status
- Ceased
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/10—Path keeping
- B60W30/12—Lane keeping
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/18—Propelling the vehicle
- B60W30/18009—Propelling the vehicle related to particular drive situations
- B60W30/18163—Lane change; Overtaking manoeuvres
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3658—Lane guidance
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096708—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
- G08G1/096716—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information does not generate an automatic action on the vehicle control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/20—Static objects
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2555/00—Input parameters relating to exterior conditions, not covered by groups B60W2552/00, B60W2554/00
Abstract
The present disclosure provides a vehicle driving assistance method and system. The vehicle driving assistance method may include receiving data of standing objects on both sides of a road; calculating a road boundary of each side of the road, according to the data of standing objects on the corresponding side of the road; calculating a distance between the road boundaries of both sides of the road; and calculating one or more virtual lane markers, based on the calculated distance and a predefined lane width. The vehicle driving assistance method and system according to the present disclosure can assist the driver of an ego-vehicle in checking the ego-vehicle's travelling path, thereby improving driving safety.
Description
DRIVING ASSISTANCE METHOD AND SYSTEM FOR VEHICLE
TECHNICAL FIELD
[0001] The application generally relates to a technical field of driving assistance. Particularly, the present application relates to a vehicle driving assistance method and system for drawing a virtual lane marker.
BACKGROUND
[0002] A lane marker is a common traffic index line established on the surface of a road in order to separate traffic flows. However, a lane marker may be abraded and become invisible because of continuous exposure to sun, rain, wind, snow or ice, alternating cooling and heating, and/or abrasion from vehicles. Moreover, a lane marker may be covered by snow during heavy snowfall, or covered by dirt and/or silt. In some regions, no lane marker is established at all because of imperfect traffic facilities. In all of these situations, the driver of a vehicle cannot identify the lane marker. As a result, the vehicle may deviate from its path, increasing the risk of driving.
[0003] Therefore, a vehicle driving assistance method, device and/or system is needed that can draw a virtual lane marker and present it to the driver of an ego-vehicle in real time when a physical lane marker is invisible.
SUMMARY
[0004] A vehicle driving assistance method and system for drawing a virtual lane marker are provided in embodiments of the present disclosure.
[0005] According to an aspect of the present disclosure, a vehicle driving assistance method is provided. The vehicle driving assistance method may include: receiving data of standing objects on both sides of a road; calculating a road boundary of each side of the road, according to the data of standing objects on the corresponding side of the road; calculating a distance between the road boundaries of both sides of the road; and calculating one or more virtual lane markers, based on the calculated distance and a predefined lane width.
[0006] According to another aspect of the present disclosure, a vehicle driving assistance system is provided. The vehicle driving assistance system may include: a detection device, configured to detect data of standing objects on both sides of a road; and a data processing device, configured to perform operations including: receiving the data of standing objects on both sides of the road detected by the detection device; calculating a road boundary of each side of the road, according to the data of standing objects on the corresponding side of the road; calculating a distance between the road boundaries of both sides of the road; and calculating one or more virtual lane markers, based on the calculated distance and a predefined lane width.
[0007] The driving assistance method and system according to embodiments of the disclosure can draw one or more virtual lane markers and present them to the driver of an ego-vehicle in real time when a physical lane marker is invisible. The driver may therefore check the travelling path of the ego-vehicle against the virtual lane markers, thereby improving driving safety.
BRIEF DESCRIPTION OF THE DRAWINGS
[0008] Features, benefits and technical effects of those exemplary embodiments in the application may be better understood when reading the following detailed description in connection with the accompanying drawings, in which like reference numbers indicate like elements.
[0009] Fig. 1 describes an example scene where a vehicle driving assistance method and system provided in embodiments of the disclosure may be applied;
[0010] Fig. 2 is a simplified schematic diagram of a vehicle driving assistance system, according to an embodiment of the disclosure;
[0011] Figs. 3A-B shows simplified models of example roads, according to embodiments of the disclosure;
[0012] Fig. 4 is a simplified flow chart of a vehicle driving assistance method, according to an embodiment of the disclosure; and
[0013] Figure 5 shows a schematic structure diagram of a data processing device, according to embodiments of the disclosure.
DETAILED DESCRIPTION
[0014] A detailed description of various aspects and example embodiments of the present application is provided below. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present application. However, it will be apparent to those skilled in the art that the present application can be practiced without some of these specific details. The embodiments are merely examples, and the present application is not limited to the specific configurations and algorithms set forth in them. Rather, the present application covers any modification, replacement and improvement of elements, components and algorithms that does not depart from the scope of the present application.
[0015] References to "one embodiment", "an embodiment", "demonstrative embodiment", "various embodiments" etc., indicate that the embodiment(s) so described may include a particular feature, structure, or characteristic, but not every embodiment necessarily includes the particular feature, structure, or characteristic. Further, repeated use of the phrase "in one embodiment" does not necessarily refer to the same embodiment, although it may.
[0016] Fig. 1 describes an example scene 100 where a vehicle driving assistance method and system provided in embodiments of the disclosure may be applied. As shown in scene 100, physical lane markers are covered by snow and are invisible. However, this is only an illustrative situation. In practice, physical lane markers may become invisible for several reasons, such as abrasion resulting from continuous exposure to sun, rain, wind, snow or ice, alternating cooling and heating, and/or abrasion from vehicles; coverage by dirt and/or silt; a lack of lane markers due to imperfect traffic facilities; and/or poor visibility resulting from heavy fog, darkness and so forth.
[0017] In the above situations, the vehicle driving assistance method and system according to embodiments of the disclosure can draw one or more virtual lane markers by taking standing objects on both sides of a road as a reference, and display the virtual lane markers to the driver of the ego-vehicle. The phrase "standing objects on both sides of a road" used in the disclosure is intended to include any object on each side of the road that may be taken as a reference. In scene 100, the standing objects are trees. In some embodiments, the standing objects may also be a building or public infrastructure (such as a street lamp, a barrier and so on). In some embodiments, the standing objects may also be a certain shape of the ground (such as a curb, a ditch, a cliff, a wall, a mountain and so on).
[0018] Turning to Fig. 2, Fig. 2 is a simplified schematic diagram of a vehicle driving assistance system 200 according to an embodiment of the disclosure. The vehicle driving assistance system 200 may be applied in cars and in other kinds of vehicles, such as vehicles powered by an internal-combustion engine or an electric motor, emerging electric vehicles, and so on. As shown in Fig. 2, the vehicle driving assistance system 200 may include a detection device 210 and a data processing device 220, and may optionally include a storage device 230, a display device 240 and other devices as needed (e.g., in some embodiments, the vehicle driving assistance system 200 may also include a lane marker monitoring device (not shown)). These devices may be connected with each other.
[0019] In some embodiments, the detection device 210 may be configured to detect data of standing objects (such as a tree, a building, a street lamp, a barrier, a curb, a ditch, a cliff, a wall, a mountain, etc.) on both sides of the road on which the ego-vehicle is running, and particularly data of a boundary of the standing objects on the corresponding side of the road. The detection device 210 may include a camera, a laser scanner, an ultrasonic sensor, and/or a radar sensor, etc. In an embodiment, such sensors may be used separately or in combination.
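As one possible illustration of the detection data described above, the sketch below groups detected boundary points by road side in the ego-vehicle frame. The data structure and function names (RoadsideDetection, split_by_side) and the assumption that points are given as (x, y) coordinates in metres are made up for this example and are not prescribed by the disclosure.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class RoadsideDetection:
    """Boundary points of standing objects on one side of the road."""
    side: str                           # "left" or "right"
    points: List[Tuple[float, float]]   # (x, y) in metres, ego-vehicle frame

def split_by_side(detections: List[Tuple[float, float]]) -> Tuple[RoadsideDetection, RoadsideDetection]:
    """Group raw detections into a left-side (y > 0) set and a right-side (y < 0) set."""
    left = [p for p in detections if p[1] > 0.0]
    right = [p for p in detections if p[1] < 0.0]
    return RoadsideDetection("left", left), RoadsideDetection("right", right)
```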
[0020] The data processing device 220 may include, for example, a Central Processing Unit (CPU), a Digital Signal Processor (DSP), a processor, a microprocessor, a controller, a chip, a microchip, a circuit, a logic unit, an Integrated Circuit (IC), an Application-Specific IC (ASIC), or any other suitable multipurpose or specific processor or controller. For example, the data processing device 220 may execute one or more suitable instructions stored in the storage device 230.
[0021] In some embodiments, the data processing device 220 may be configured to receive the data of standing objects on both sides of the road detected by the detection device, for example, data of a boundary of the standing objects on the corresponding side of the road.
[0022] In some embodiments, the data processing device 220 may be configured to calculate a road boundary of each side of the road, according to the data of standing objects
on the corresponding side of the road. For example, the data processing device 220 may receive data of the right boundary of all standing objects on the left side of the road, and analyze the received data to calculate (for example, by regression analysis, the least squares method, interpolation, or other methods) a function of the left boundary of the road. Similarly, the data processing device 220 may receive data of the left boundary of all standing objects on the right side of the road, and analyze the received data to calculate (for example, by regression analysis, the least squares method, interpolation, or other methods) a function of the right boundary of the road. The function of each of the road boundaries of both sides of the road may be, for example, a linear function, a quadratic function, or a higher-order function.
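A minimal sketch of one way such a boundary function could be derived is given below, assuming the boundary data are the point lists sketched earlier and that a polynomial y = f(x) is fitted to each side by least squares. The function name and the choice of numpy.polyfit are illustrative assumptions, not the method mandated by the disclosure.

```python
import numpy as np

def fit_boundary_function(points, degree=2):
    """Fit y = f(x) to the boundary points of one side of the road by
    least squares, as one possible realisation of paragraph [0022]."""
    xs = np.array([p[0] for p in points])
    ys = np.array([p[1] for p in points])
    coeffs = np.polyfit(xs, ys, deg=degree)   # least-squares polynomial fit
    return np.poly1d(coeffs)                  # callable boundary function f(x)

# Example usage (hypothetical):
# left_boundary  = fit_boundary_function(left.points,  degree=2)   # curved left boundary
# right_boundary = fit_boundary_function(right.points, degree=1)   # straight right boundary
```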
[0023] In some embodiments, the data processing device 220 may be configured to calculate a distance between the road boundaries of both sides of the road.
[0024] The data processing device 220 may calibrate the functions of the road boundaries of both sides of the road into the same coordinate system, after the functions of the road boundaries of both sides of the road are derived. In an example, the data processing device 220 may establish an X-Y rectangular plane coordinate system by taking the position where the ego-vehicle is located as the origin, taking the direction in which the ego-vehicle travels as the direction of the X axis, and taking a direction perpendicular to the X axis as the direction of the Y axis. The data processing device 220 may then calibrate the functions of the road boundaries of both sides of the road into the established X-Y coordinate system. In other examples, the data processing device 220 may establish other types of coordinate systems, such as a polar coordinate system, a geodetic coordinate system and so on.
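A minimal sketch of such a calibration is shown below, assuming boundary points are first available in some world or sensor frame together with the ego-vehicle's position and heading in that frame; those inputs, and the function name to_ego_frame, are assumptions made for this example.

```python
import math
import numpy as np

def to_ego_frame(points_world, ego_position, ego_heading):
    """Transform 2-D points from a world/sensor frame into the ego-vehicle
    X-Y frame of paragraph [0024]: origin at the ego-vehicle, X axis along
    the direction of travel, Y axis perpendicular to X."""
    c, s = math.cos(-ego_heading), math.sin(-ego_heading)
    rotation = np.array([[c, -s],
                         [s,  c]])                       # rotate by -heading
    shifted = np.asarray(points_world, dtype=float) - np.asarray(ego_position, dtype=float)
    return shifted @ rotation.T                          # one (x, y) row per point
```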
[0025] In an example, assuming that the functions of the road boundaries of both sides of the road in the X-Y coordinate system are f1(x1, y1) and f2(x2, y2) respectively, the distance between the road boundaries of both sides of the road may be expressed as D = |f1(x1, y1) - f2(x2, y2)|.
[0026] In another example, the data processing device 220 may select a plurality of data points on the function of the road boundary of either side of the road. The number of the data points should guarantee that the derived distance has an error no more than a predetermined threshold, for example, 0.1 meter.
[0027] The data processing device 220 may be configured to perform following calculations for each of the plurality of data points. Figs. 3A-B shows simplified models of example roads 300a and 300b, according to embodiments of the disclosure. The calculations for each of the plurality of data points are described by reference to the simplified models 300a and 300b of Figs. 3A-B.
[0028] Simplified model 300a describes a situation of an ideal road, i.e., the functions of the road boundaries of both sides of the road are two parallel straight lines. A data point (marked as A) is selected on the function of the right boundary of the road. A data point may also be selected on the function of the left boundary of the road. It is not limited herein. A tangent of the function of the right boundary of the road is calculated by using the data point A as a point of tangency (in this situation, the tangent coincides with the function of the right boundary of the road). A normal of the function is calculated on the data point A. The normal intersects with the function of the left boundary of the road at a data point (marked as B). Since the functions of the road boundaries of both sides of the road are two parallel straight lines, the length of the line segment |AB| is determined as the distance between the road boundaries of both sides of the road.
[0029] Simplified model 300b describes a situation in which the functions of the road boundaries of both sides of the road are two parallel curves. A data point (marked as A) is selected on the function of the right boundary of the road. A data point may also be selected on the function of the left boundary of the road. It is not limited herein. A tangent (marked as T) of the function of the right boundary of the road is calculated by using the data point A as the point of tangency. A normal (marked as V) of the function at the data point A is calculated by using the data point A as the foot of the normal. The normal V intersects with the function of the left boundary of the road at a data point (marked as B). Since the functions of the road boundaries of both sides of the road are two parallel curves, the length of the line segment |AB| is determined as the distance between the road boundaries of both sides of the road.
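The construction above can be carried out numerically. The sketch below is one illustrative way to do it, assuming the two boundaries are np.poly1d functions in the ego-vehicle frame with the left boundary lying above the right one, and that the normal from A reaches the left boundary within t_max metres; the function name boundary_distance_at and the bisection search are assumptions of this example, not part of the disclosure.

```python
import math
import numpy as np

def boundary_distance_at(x_a, right_boundary, left_boundary, t_max=50.0, tol=1e-3):
    """Distance |AB| between the road boundaries at a sample point A on the right
    boundary, following the tangent/normal construction of paragraphs [0028]-[0029].
    Both boundaries are np.poly1d functions y = f(x)."""
    y_a = right_boundary(x_a)
    slope = np.polyder(right_boundary)(x_a)          # tangent slope at A
    nx, ny = -slope, 1.0                             # normal direction at A
    norm = math.hypot(nx, ny)
    nx, ny = nx / norm, ny / norm                    # unit normal pointing towards the left boundary

    def gap(t):
        # Signed gap between the left boundary and the point A + t * normal
        x, y = x_a + t * nx, y_a + t * ny
        return left_boundary(x) - y

    lo, hi = 0.0, t_max                              # bisection bracket along the normal
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if gap(mid) > 0.0:
            lo = mid                                 # still below the left boundary
        else:
            hi = mid
    return 0.5 * (lo + hi)                           # |AB| in metres
```

For the ideal road of Fig. 3A (two parallel straight lines) this reduces to the perpendicular distance between the lines; for the parallel curves of Fig. 3B it returns the length |AB| along the normal V.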
[0030] For situations in which the functions of the road boundaries of both sides of the road are non-parallel higher-order functions, calculus may be used to reduce the order of the functions, such that the functions may be partitioned into a number of curve segments which approximate straight line segments. The above calculation steps may be performed for each
of the curve segments. A function of the distance between the road boundaries of both sides of the road may then be fitted based on the calculations for each of the curve segments. Details of these calculus techniques are available in the related prior art and will not be repeated herein.
[0031] In some embodiments, the data processing device 220 may be configured to calculate one or more virtual lane markers, based on the calculated distance and a predefined lane width. In an example, the data processing device 220 is configured to calculate a number of lanes, by rounding down a result of dividing the distance between the road boundaries of both sides of the road by the predefined lane width, and determine the virtual lane markers, based on the number of lanes. In another example, the data processing device 220 is configured to subtract a predefined safety buffer distance (such as, 1 meter) from the distance between the road boundaries of both sides of the road, before dividing the distance by the predefined lane width.
[0032] Referring again to the simplified model 300a of Fig. 3A, in an embodiment, assuming that the distance between the road boundaries of both sides of the road is 7 meters (m) (i.e., |AB| = 7 m), a predefined safety buffer distance on each side of the road is 0.5 m, and a predefined lane width is 2.5 m, the number of lanes is calculated as [(7-1)/2.5] = 2. In some embodiments, the data processing device 220 may draw one or more lane markers, according to the distance between the road boundaries of both sides of the road (e.g., 7 m), the predefined lane width (e.g., 2.5 m), the number of lanes (e.g., 2), and/or the predefined safety buffer distance on each side of the road (e.g., 0.5 m). For example, the data processing device 220 may set a lane marker at the central line (marked as "1" in Fig. 3A) between the road boundaries of both sides of the road, set a lane marker (marked as "2" in Fig. 3A) parallel to lane marker 1 and 2.5 m to the right of lane marker 1, and set a lane marker (marked as "3" in Fig. 3A) parallel to lane marker 1 and 2.5 m to the left of lane marker 1. In other embodiments, the lane markers may be drawn in other ways, for example, by determining a set of lane markers one by one from the road boundary of either side of the road, according to the predefined safety buffer distance on each side of the road (e.g., 0.5 m) and the predefined lane width (e.g., 2.5 m).
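A minimal sketch of the lane-count and marker-placement calculation of paragraphs [0031]-[0032] is given below. It reproduces the worked example (7 m boundary distance, 0.5 m buffer on each side, 2.5 m lane width, giving 2 lanes); the function name and the choice to express markers as lateral offsets from the centre line are assumptions of this example.

```python
import math

def compute_virtual_lane_markers(boundary_distance, lane_width=2.5, buffer_each_side=0.5):
    """Number of lanes and virtual lane-marker offsets from the centre line
    between the two road boundaries (positive = right, negative = left)."""
    usable = boundary_distance - 2.0 * buffer_each_side   # subtract the safety buffers
    n_lanes = math.floor(usable / lane_width)             # round down, e.g. floor(6 / 2.5) = 2
    # n_lanes + 1 markers spaced one lane width apart, centred on the centre line
    offsets = [(-0.5 * n_lanes + i) * lane_width for i in range(n_lanes + 1)]
    return n_lanes, offsets

# Worked example from the description:
# compute_virtual_lane_markers(7.0)  ->  (2, [-2.5, 0.0, 2.5])  # markers 3, 1 and 2 of Fig. 3A
```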
[0033] Referring back to Fig.2, the storage device 230 may include one or more types of computer-readable storage media capable of storing data, including volatile memory,
non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and the like.
[0034] In some embodiments, the storage device 230 may store instructions and data for execution of the data processing device 220, for example, data detected by the detection device 210, the predefined lane width, the predefined safety buffer distance and so on. Optionally, the storage device 230 may also store data calculated by the data processing device 220, such as, data of the virtual lane markers. It should be noted that the predefined lane width, the predefined safety buffer distance and other data may be hard-coded in program instructions.
[0035] In some embodiments, the vehicle driving assistance system 200 may include the display device 240. The display device 240 may be configured to display the virtual lane markers to the driver of the ego-vehicle. The display device 240 may include a display screen mounted on the ego-vehicle (such as an instrument cluster screen, a media display screen, etc.), a mobile device carried by the driver (such as a mobile phone, a tablet, a notebook, etc.), a projection and display device (such as a projector and any component that may show content projected by the projector, for example, the windshield of the ego-vehicle), a wearable device (such as smart glasses), and so on. In an example, a view of the road with one or more virtual lane markers drawn thereon may be displayed on the instrument cluster screen of the ego-vehicle. In another example, Augmented Reality (AR) technology is applied in the vehicle driving assistance system 200, such that in the driver's view, the virtual lane markers appear like physical lane markers on the road. For example, the virtual lane markers may be projected on the windshield of the ego-vehicle in such a way that the virtual lane markers seem to be drawn on the road. Alternatively, the virtual lane markers may be displayed on smart glasses worn by the driver, such that in the driver's view, the virtual lane markers appear like physical lane markers on the road.
[0036] In an example, the driver of ego-vehicle may manually monitor lane markers on the road, and manually initiate the vehicle driving assistance system 200 by a button, a knob, an operation on a touchscreen or by voice, when the lane markers are invisible. Optionally, the vehicle driving assistance system 200 may include a lane marker monitoring device (not shown). In some embodiments, the lane marker monitoring device may be a front camera
installed on the windshield of the vehicle. The lane marker monitoring device may be configured to monitor physical lane markers on the road and automatically initiate the vehicle driving assistance system when it is determined that the physical lane markers are invisible.
[0037] Referring to Fig. 4, Fig. 4 is a simplified flow chart of a vehicle driving assistance method 400 according to an embodiment of the disclosure. As described above, the vehicle driving assistance method 400 may be manually initiated by a driver of ego-vehicle or automatically initiated by the lane marker monitoring device. The vehicle driving assistance method 400 may be performed by, for example, the vehicle driving assistance system 200 of Fig. 2.
[0038] In step 410, data of standing objects on both sides of a road (for example, data of the boundary of the standing objects that is close to the road) is received by a data processing device (e.g., the data processing device 220 of Fig 2). The data of standing objects on both sides of a road may be detected, for example, by a detection device (e.g., the detection device 210 of Fig 2).
[0039] In step 420, a road boundary of each side of the road is calculated by the data processing device (e.g., the data processing device 220 of Fig 2), according to the data of standing objects on the corresponding side of the road. For example, the data processing device may receive data of the right boundary of all standing objects on the left side of the road, and analyze the received data to calculate (for example, by regression analysis, the least squares method, interpolation, or other methods) a function of the left boundary of the road. Similarly, the data processing device may receive data of the left boundary of all standing objects on the right side of the road, and analyze the received data to calculate (for example, by regression analysis, the least squares method, interpolation, or other methods) a function of the right boundary of the road. The function of each of the road boundaries of both sides of the road may be, for example, a linear function, a quadratic function, or a higher-order function.
[0040] In step 430, a distance between the road boundaries of both sides of the road is calculated by the data processing device (e.g., the data processing device 220 of Fig 2). In some embodiments, the data processing device may calibrate the functions of the road
boundaries of both sides of the road into the same coordinate system, after the functions of the road boundaries of both sides of the road are derived. The data processing device may use several ways to calculate the distance between the road boundaries of both sides of the road, as described above.
[0041] In step 440, one or more virtual lane markers are calculated by the data processing device (e.g., the data processing device 220 of Fig 2), based on the calculated distance and a predefined lane width. In an embodiment, the data processing device is configured to calculate a number of lanes, by rounding down a result of dividing the distance between the road boundaries of both sides of the road by the predefined lane width, and determine the virtual lane markers, based on the number of lanes. In another embodiment, the data processing device is configured to subtract a predefined safety buffer distance (such as, 1 meter) from the distance between the road boundaries of both sides of the road, before dividing the distance by the predefined lane width. The predefined lane width and predefined safety buffer distance may be stored in a storage device (e.g., the storage device 230 of Fig 2), or may be hard-coded in program instructions.
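Putting steps 410 to 440 together, the sketch below shows one illustrative end-to-end realisation of method 400 using the helper functions sketched earlier in this description; all helper names, the sampling positions and the averaging of the per-point distances are assumptions made for this example and are not part of the claimed method.

```python
import numpy as np

def driving_assistance_method(detections, lane_width=2.5, buffer_each_side=0.5,
                              sample_xs=(0.0, 10.0, 20.0, 30.0)):
    """Illustrative pipeline for steps 410-440 of method 400."""
    # Step 410: receive data of standing objects on both sides of the road
    left, right = split_by_side(detections)
    # Step 420: calculate a road-boundary function for each side of the road
    left_boundary = fit_boundary_function(left.points, degree=2)
    right_boundary = fit_boundary_function(right.points, degree=2)
    # Step 430: calculate the distance between the road boundaries
    distances = [boundary_distance_at(x, right_boundary, left_boundary) for x in sample_xs]
    boundary_distance = float(np.mean(distances))
    # Step 440: calculate the virtual lane markers from the distance and the lane width
    return compute_virtual_lane_markers(boundary_distance, lane_width, buffer_each_side)
```

The returned lane count and marker offsets would then be handed to the display device in step 450.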
[0042] Optionally, in step 450, the calculated virtual lane markers are displayed to the driver of the ego-vehicle by a display device (e.g., the display device 240 of Fig 2). In an example, a view of the road with one or more virtual lane markers drawn thereon may be displayed on the instrument cluster screen of the ego-vehicle. In another example, Augmented Reality (AR) technology is applied in the vehicle driving assistance system 200, such that in the driver's view, the virtual lane markers appear like physical lane markers on the road. For example, the virtual lane markers may be projected on the windshield of the ego-vehicle in such a way that the virtual lane markers seem to be drawn on the road. Alternatively, the virtual lane markers may be displayed on smart glasses worn by the driver, such that in the driver's view, the virtual lane markers appear like physical lane markers on the road.
[0043] Figure 5 shows a schematic structure diagram of a data processing device 500 by which the data processing device used in embodiments of the disclosure (e.g., the data processing device 220 of Fig 2) may be implemented. As shown in Figure 5, the device 500 can include one or more of a processor 520, a memory 530, a power component 540, an input/output (I/O) interface 560 and a communication interface 580, which are
communicatively connected, for example, by a bus 510.
[0044] The processor 520 may generally control the operations of the device 500, such as the operations associated with data communication and computation process. The processor 520 may include one or more processing cores and can execute instructions to perform some or all of the steps in the method described in the present disclosure. The processor 520 may include various devices capable of processing, including but not limited to, general purpose processor, specific purpose processor, micro-processor, micro-controller, graphics processing unit (GPU), digital signal processor (DSP), ASIC, programmable logic device (PLD), FPGA, etc. The processor 520 may include a cache memory 525 or may communicate with the cache memory 525, in order to increase access speed of data.
[0045] The memory 530 is configured to store various instructions and/or data to support the operations of the device 500. Examples of such data include instructions and data of any application programs or methods operating on the device 500. The memory 530 can be implemented by any volatile or non-volatile storage or a combination thereof. The memory 530 can include a semiconductor memory such as RAM, static RAM (SRAM), dynamic RAM (DRAM), ROM, programmable ROM (PROM), EPROM, EEPROM, flash memory, etc. The memory 530 may also include any storage using paper, magnetic, and/or optical media, such as tape, hard disk, cassette, floppy disk, magneto-optic (MO) disc, CD, DVD, Blu-ray, etc.
[0046] The power component 540 provides power to various components of the device 500. The power component 540 may include an internal battery and/or an external power interface, and may include a power management system and other components associated with generating, managing and distributing power for the device 500.
[0047] The I/O interface 560 provides an interface by which a user can interact with the device 500. The I/O interface 560 may include an interface based on, for example, PS/2, RS-232, USB, FireWire, Lightning, VGA, HDMI, DisplayPort, etc., to enable the user to interact with the device 500 via peripheral devices such as a keyboard, mouse, touch pad, touch screen, joystick, button, microphone, loudspeaker, display, camera, projection port, etc.
[0048] The communication interface 580 is configured to enable the device 500 to communicate with other devices over wired or wireless connections. Through the communication interface 580, the device 500 may access a network based on one or more communication standards, such as Wi-Fi or 2G, 3G, or 4G communication networks. In an example embodiment, the communication interface 580 may also receive broadcast signals or broadcast-related information from an external broadcast management system through a broadcast channel. Examples of the communication interface 580 include interfaces based on Near Field Communication (NFC), Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wide Band (UWB), Bluetooth (BT), etc.
[0049] Function blocks shown in the above block diagrams can be implemented as hardware, software, firmware or a combination thereof. When a function block is implemented in hardware, it may be, for example, an electronic circuit, an ASIC, suitable firmware, a plug-in, a functional card, etc. When a function block is implemented in software, the elements of the present application are programs or code segments for performing the required tasks. The programs or code segments may be stored in a machine-readable medium, or may be transmitted over a transmission medium or communication link by signals in carrier waves. The term "machine-readable medium" may include any medium that can store or transmit information. Examples of machine-readable media include electronic circuits, semiconductor memory, ROM, flash memory, erasable ROM (EROM), floppy disks, CD-ROMs, optical disks, hard disks, optical fiber, radio frequency links, etc. The code segments can be downloaded via a computer network such as the Internet, an intranet and the like.
[0050] Although the above description is given by way of embodiments of the present disclosure, those skilled in the art will appreciate that various equivalent changes and modifications may be made to the present invention without departing from its spirit. Such equivalent changes and modifications are also included within the scope of the present disclosure.
Claims
1. A vehicle driving assistance method, comprising:
receiving data of standing objects on both sides of a road;
calculating a road boundary of each side of the road, according to the data of standing objects on the corresponding side of the road;
calculating a distance between the road boundaries of both sides of the road; and
calculating one or more virtual lane markers, based on the calculated distance and a predefined lane width.
2. The vehicle driving assistance method according to claim 1, wherein calculating one or more virtual lane markers comprises:
calculating a number of lanes, by rounding down a result of dividing the calculated distance by the predefined lane width; and
determining the virtual lane markers, based on the number of lanes.
3. The vehicle driving assistance method according to claim 2, wherein calculating one or more virtual lane markers further comprises: subtracting a predefined safety buffer distance from the calculated distance, before dividing the calculated distance by the predefined lane width.
4. The vehicle driving assistance method according to claim 1, wherein calculating the road boundary of each side of the road, according to the data of standing objects on the corresponding side of the road comprises:
calculating a function of the road boundary of each side of the road, by analyzing the data of standing objects on the corresponding side of the road.
5. The vehicle driving assistance method according to claim 4, wherein calculating the distance between the road boundaries of both sides of the road comprises:
calibrating the functions of the road boundaries of both sides of the road into the same coordinate system; and
calculating a function of the distance between the road boundaries of both sides of the road, by calculating an absolute difference between the calibrated functions of the road boundaries of both sides of the road.
6. The vehicle driving assistance method according to claim 4, wherein calculating the distance between the road boundaries of both sides of the road comprises:
calibrating the functions of the road boundaries of both sides of the road into the same coordinate system;
selecting a plurality of data points on the function of the road boundary of either side of the road;
for each of the plurality of data points:
calculating a tangent of the function by using the data point as a point of tangency;
calculating a normal of the function at the data point;
taking an intersection point of the normal and the function of the road boundary of the other side of the road as a corresponding data point of the data point; and
calculating a distance between the data point and the corresponding data point; and
fitting a function of the distance between the road boundaries of both sides of the road, according to the distance between each of the plurality of data points and its corresponding data point.
7. The vehicle driving assistance method according to any of claims 1-6, further comprising displaying the virtual lane markers to a driver of the ego-vehicle.
8. A vehicle driving assistance system, comprising:
a detection device, configured to detect data of standing objects on both sides of a road; and
a data processing device, configured to perform operations comprising:
receiving the data of standing objects on both sides of the road detected by the detection device;
calculating a road boundary of each side of the road, according to the data of standing objects on the corresponding side of the road;
calculating a distance between the road boundaries of both sides of the road; and
calculating one or more virtual lane markers, based on the calculated distance and a predefined lane width.
9. The vehicle driving assistance system according to claim 8, wherein the data processing device is configured to:
calculate a number of lanes, by rounding down a result of dividing the calculated distance by the predefined lane width; and
determine the virtual lane markers, based on the number of lanes.
10. The vehicle driving assistance system according to claim 9, wherein the data processing device is configured to subtract a predefined safety buffer distance from the calculated distance, before dividing the calculated distance by the predefined lane width.
11. The vehicle driving assistance system according to claim 8, wherein the data processing device is configured to calculate a function of the road boundary of each side of the road, by analyzing the data of standing objects on the corresponding side of the road.
12. The vehicle driving assistance system according to claim 11, wherein the data processing device is configured to:
calibrate the functions of the road boundaries of both sides of the road into the same coordinate system; and
calculate a function of the distance between the road boundaries of both sides of the road, by calculating an absolute difference between the calibrated functions of the road boundaries of both sides of the road.
13. The vehicle driving assistance system according to claim 11, wherein the data processing device is configured to:
calibrate the functions of the road boundaries of both sides of the road into the same coordinate system;
select a plurality of data points on the function of the road boundary of either side of the road;
for each of the plurality of data points:
calculate a tangent of the function by using the data point as a point of tangency;
calculate a normal of the function at the data point;
take an intersection point of the normal and the function of the road boundary of the other side of the road as a corresponding data point of the data point; and
calculate a distance between the data point and the corresponding data point; and
fit a function of the distance between the road boundaries of both sides of the road, according to the distance between each of the plurality of data points and its corresponding data point.
14. The vehicle driving assistance system according to any of claims 8-13, further comprising a display device, configured to display the virtual lane markers to a driver of the ego-vehicle.
15. The vehicle driving assistance system according to any of claims 8-13, wherein the detection device includes one or more of a camera, a laser scanner, and a radar sensor.
16. The vehicle driving assistance system according to claim 14, wherein the display device includes one or more of a display panel, a projector, a mobile device, and a wearable device.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201610738623.8A CN107784864A (en) | 2016-08-26 | 2016-08-26 | Vehicle assistant drive method and system |
| CN201610738623.8 | 2016-08-26 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018036892A1 (en) | 2018-03-01 |
Family
ID=59677233
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2017/070810 (WO2018036892A1, Ceased) | Driving assistance method and system for vehicle | 2016-08-26 | 2017-08-17 |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN107784864A (en) |
| WO (1) | WO2018036892A1 (en) |
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| FR3089927A1 (en) * | 2018-12-14 | 2020-06-19 | Renault S.A.S. | Method and device for keeping a vehicle in its lane. |
| CN113077528A (en) * | 2020-01-06 | 2021-07-06 | 阿里巴巴集团控股有限公司 | Method and device for adding lane line and storage medium |
| DE102020214327A1 (en) | 2020-11-13 | 2022-05-19 | Continental Automotive Gmbh | Method and system for determining a position of a traffic lane |
| CN114987545A (en) * | 2022-06-07 | 2022-09-02 | 高德软件有限公司 | Safety detection method and device for driving reference line |
| US12005832B2 (en) | 2018-09-05 | 2024-06-11 | Koito Manufacturing Co., Ltd. | Vehicle display system, vehicle system, and vehicle |
Families Citing this family (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN109427199B (en) * | 2017-08-24 | 2022-11-18 | 北京三星通信技术研究有限公司 | Augmented reality method and device for driving assistance |
| CN109543636B (en) * | 2018-11-29 | 2020-12-15 | 连尚(新昌)网络科技有限公司 | A method and device for detecting sharp bends in roads |
| CN111460865B (en) * | 2019-01-22 | 2024-03-05 | 斑马智行网络(香港)有限公司 | Driving support method, driving support system, computing device, and storage medium |
| CN111060123A (en) * | 2019-06-13 | 2020-04-24 | 广东星舆科技有限公司 | Method and system for realizing lane-level navigation based on common map |
| CN110782443B (en) * | 2019-10-23 | 2023-04-07 | 四川大学 | Railway track defect detection method and system |
| CN111368409A (en) * | 2020-02-27 | 2020-07-03 | 杭州飞步科技有限公司 | Vehicle flow simulation processing method, device, equipment and storage medium |
| CN111829549B (en) * | 2020-07-30 | 2022-05-24 | 吉林大学 | Snow pavement virtual lane line projection method based on high-precision map |
| CN111965665A (en) * | 2020-08-26 | 2020-11-20 | 北京小马慧行科技有限公司 | Method and apparatus for determining driving lane of autonomous vehicle |
| CN111986513B (en) * | 2020-08-26 | 2021-08-10 | 马鞍山贺辉信息科技有限公司 | Automobile driving auxiliary early warning system based on cloud computing and Internet of things |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20110022317A1 (en) * | 2008-07-11 | 2011-01-27 | Toyota Jidosha Kabushiki Kaisha | Travel supporting control system |
| WO2012042339A1 (en) * | 2010-10-01 | 2012-04-05 | Toyota Jidosha Kabushiki Kaisha | Driving support apparatus and driving support method |
| WO2013133752A1 (en) * | 2012-03-07 | 2013-09-12 | Scania Cv Ab | Method and system for determining vehicle position |
| DE102012224498A1 (en) * | 2012-10-26 | 2014-04-30 | Hyundai Motor Company | Lane Detection Method and System |
| US20140200801A1 (en) * | 2013-01-16 | 2014-07-17 | Denso Corporation | Vehicle travel path generating apparatus |
| WO2015009218A1 (en) * | 2013-07-18 | 2015-01-22 | Scania Cv Ab | Determination of lane position |
Family Cites Families (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE102012001950A1 (en) * | 2012-02-02 | 2013-08-08 | Daimler Ag | Method for operating a camera arrangement for a vehicle and camera arrangement |
| US20150293355A1 (en) * | 2012-10-10 | 2015-10-15 | Renault S.A.S. | Head-up display device and method |
| CN102910130B (en) * | 2012-10-24 | 2015-08-05 | 浙江工业大学 | Forewarn system is assisted in a kind of driving of real enhancement mode |
| CN103105174B (en) * | 2013-01-29 | 2016-06-15 | 四川长虹佳华信息产品有限责任公司 | A kind of vehicle-mounted outdoor scene safety navigation method based on AR augmented reality |
| KR20150140449A (en) * | 2014-06-05 | 2015-12-16 | 팅크웨어(주) | Electronic apparatus, control method of electronic apparatus and computer readable recording medium |
| KR102255432B1 (en) * | 2014-06-17 | 2021-05-24 | 팅크웨어(주) | Electronic apparatus and control method thereof |
| CN105333883B (en) * | 2014-08-07 | 2018-08-14 | 深圳点石创新科技有限公司 | A kind of guidance path track display method and device for head up display |
| KR102214604B1 (en) * | 2014-09-05 | 2021-02-10 | 현대모비스 주식회사 | Driving support image display method |
| CN204202618U (en) * | 2014-09-26 | 2015-03-11 | 广东好帮手电子科技股份有限公司 | A kind of vehicle-mounted real scene navigation system |
| US20160109701A1 (en) * | 2014-10-15 | 2016-04-21 | GM Global Technology Operations LLC | Systems and methods for adjusting features within a head-up display |
| KR102383425B1 (en) * | 2014-12-01 | 2022-04-07 | 현대자동차주식회사 | Electronic apparatus, control method of electronic apparatus, computer program and computer readable recording medium |
| CN104598124A (en) * | 2014-12-24 | 2015-05-06 | 深圳市矽韦氏科技有限公司 | Method and system for regulating field angle of head-up display device |
| CN104627078B (en) * | 2015-02-04 | 2017-03-08 | 上海咔酷咔新能源科技有限公司 | Car steering virtual system based on flexible and transparent OLED and its control method |
| CN105730237A (en) * | 2016-02-04 | 2016-07-06 | 京东方科技集团股份有限公司 | Traveling auxiliary device and method |
- 2016-08-26: CN application CN201610738623.8A (published as CN107784864A), status: active, Pending
- 2017-08-17: WO application PCT/EP2017/070810 (published as WO2018036892A1), status: not active, Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| CN107784864A (en) | 2018-03-09 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2018036892A1 (en) | Driving assistance method and system for vehicle | |
| US11248925B2 (en) | Augmented road line detection and display system | |
| US10281919B2 (en) | Attributed roadway trajectories for self-driving vehicles | |
| CN113928340B (en) | Obstacle avoidance method and device applied to vehicle, electronic equipment and storage medium | |
| US11377119B2 (en) | Drifting correction between planning stage and controlling stage of operating autonomous driving vehicles | |
| JP6161942B2 (en) | Curve shape modeling device, vehicle information processing system, curve shape modeling method, and curve shape modeling program | |
| US20190317508A1 (en) | Cost design for path selection in autonomous driving technology | |
| CN107449440A (en) | The display methods and display device for prompt message of driving a vehicle | |
| CN106643781A (en) | Vehicle navigation display system and method thereof | |
| CN111094095A (en) | Automatically receive driving signals | |
| CN108016445B (en) | System and method for vehicular application of traffic flow | |
| US20140324329A1 (en) | Safe distance determination | |
| US12125222B1 (en) | Systems for determining and reporting vehicle following distance | |
| US11390292B2 (en) | Method for proposing a driving speed | |
| US20240378900A1 (en) | Image generation system, image generation method, and recording medium | |
| CN114720148B (en) | Method, device, equipment and storage medium for determining vehicle perception capability | |
| AU2013101373A4 (en) | Method and system for producing accurate digital map for vehicle systems | |
| CN115187937A (en) | Method and device for determining a road boundary for a vehicle | |
| CN114643984A (en) | Driving hedging methods, devices, equipment, media and products | |
| CN114550130A (en) | Method, device, equipment and storage medium for determining stopping point | |
| CN119550978B (en) | Adaptive cruise control method, device, equipment and storage medium | |
| CN119749553B (en) | Vehicle control methods, devices, vehicles and storage media | |
| CN111637898B (en) | Processing method and device for high-precision navigation electronic map | |
| CN119749592B (en) | A method, device, equipment and medium for making decisions on traffic flow at an intersection | |
| Liu et al. | Readiness Evaluation on Intersections for Automated Vehicle Based on Left-Turn Safety Sight Zones: A Field Test Study |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17754704; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 17754704; Country of ref document: EP; Kind code of ref document: A1 |