WO2013018961A1 - Apparatus and method for detecting lane - Google Patents
Apparatus and method for detecting lane
- Publication number
- WO2013018961A1 (PCT/KR2011/009801)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- lane
- feature points
- basis
- information
- fitting
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- B60W40/072—Curvature of the road
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/22—Display screens
- B60W40/06—Road conditions
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- G06T7/12—Edge-based segmentation
- G06T7/174—Segmentation; Edge detection involving the use of two or more images
- G06T7/181—Segmentation; Edge detection involving edge growing; involving edge linking
- G06T7/60—Analysis of geometric attributes
- B60W2050/0049—Signal offset
- B60W2050/0057—Frequency analysis, spectral techniques or transforms
- B60W2050/146—Display means
- B60W2420/403—Image sensing, e.g. optical camera
- B60W2520/10—Longitudinal speed
- B60W2540/18—Steering angle
- G06T2207/10012—Stereo images
- G06T2207/10016—Video; Image sequence
- G06T2207/30256—Lane; Road marking
Definitions
- the present invention relates to an apparatus and method for detecting a lane.
- the driving system first needs to detect the lane lines by using an image recognition apparatus in order to detect that it is leaving its lane and send a warning to the outside.
- a recognition apparatus using images captures images of a target with a camera, extracts features of the target by using digital image processing techniques, and identifies the target from the extracted features.
- a lane line, which is the target here, needs to be extracted all the more accurately because it is captured from a moving vehicle.
- a lane refers to a reference line for driving and becomes a basis on which all driving actions such as going forward, going backward, changing lanes, changing directions, forward parking, reverse parking, and perpendicular parking are performed.
- a method of detecting a lane is mainly applied to advanced safety vehicles (ASV), which are becoming increasingly high-functioning and intelligent.
- This method of detecting a lane is applied to a lane departure warning system to prevent dozing off behind the wheel, a rear parking guide and perpendicular parking guide system to help novice drivers park their cars, and a lane keeping system to apply torque to a steering wheel in a dangerous situation to keep a lane.
- the application of the lane detection method has gradually been expanded.
- Methods of detecting a lane include mapping lane points corresponding to coordinates of the captured images to a two-dimensional coordinate system, and detecting and displaying the positions of the lanes.
- an object of the present invention is to provide an apparatus and method for accurately detecting a lane even under severe road changes, for example when a sharply curved road is marked with a broken line so that the lane marking ends in front of the vehicle and begins again.
- an apparatus for detecting a lane including: a camera module capturing an image; a control unit extracting a plurality of feature points from the image, carrying out lane fitting to connect the plurality of feature points with a single line, and tracking the lane fitted; and a display unit displaying the lane tracked, wherein the lane fitting includes obtaining information about feature points extracted at an arbitrary past time, estimating positions of the feature points, extracted at the past time, at a current time on the basis of driving information from the arbitrary past time to the current time, determining an offset representing lateral inclination of the lane on the basis of the positions at the current time, and carrying out curve fitting on the basis of the offset.
- the curve fitting may include determining coefficients of a curve equation in arbitrary dimension on the basis of the curve equation.
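The curve-fitting step above — determining the coefficients of a curve equation of arbitrary degree from lane feature points — can be sketched as an ordinary least-squares polynomial fit. This is a minimal sketch, not the patent's exact procedure; the degree, the sample points, and the convention of fitting lateral position as a function of distance ahead are illustrative assumptions.

```python
import numpy as np

def fit_lane_curve(points, degree=2):
    """Fit x = f(y) of the given polynomial degree to lane feature points.

    points: iterable of (x, y) world coordinates, x lateral and y longitudinal.
    Returns polynomial coefficients, highest degree first; the constant term
    plays the role of the lateral offset of the lane at y = 0.
    """
    pts = np.asarray(points, dtype=float)
    # Least-squares fit of lateral position as a function of distance ahead.
    return np.polyfit(pts[:, 1], pts[:, 0], degree)

# Hypothetical feature points on a gently curving lane boundary.
points = [(1.8, 5.0), (1.9, 10.0), (2.1, 15.0), (2.4, 20.0)]
coeffs = fit_lane_curve(points, degree=2)
offset = np.polyval(coeffs, 0.0)  # lateral offset at the vehicle's position
```

A higher degree can be substituted for sharper curvature, at the cost of noise sensitivity when few feature points are available.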
- the driving information may be at least one of speed information, acceleration information, and steering information.
- the camera module may include at least one pair of cameras separated by a horizontal interval in the same central axis in the same plane, or a single camera.
- the plurality of feature points may be extracted only from a region of interest defined only on a lower part of a road on the basis of the horizon within the image.
- the plurality of feature points may be extracted on the basis of gradient information or color information of the image.
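Extraction of feature points from gradient information, as mentioned above, can be sketched by thresholding horizontal intensity gradients, since lane markings are typically brighter than the surrounding asphalt. The threshold value and the simple differencing operator below are illustrative assumptions, not values from the patent.

```python
import numpy as np

def lane_feature_points(gray, threshold=40):
    """Extract candidate lane feature points from horizontal gradients.

    gray: 2-D array of grayscale intensities. A large horizontal gradient
    magnitude marks the edge of a bright lane marking on dark asphalt.
    Returns (x, y) pixel coordinates of candidate points.
    """
    # Horizontal difference; cast first so uint8 subtraction cannot wrap.
    grad = np.abs(np.diff(gray.astype(np.int16), axis=1))
    rows, cols = np.nonzero(grad > threshold)
    return list(zip(cols, rows))
```

In practice the same thresholding could equally be driven by color information (e.g. white/yellow masks), which the text names as an alternative basis.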
- the control unit may transform the plurality of feature points to a world coordinate system and fit the lane on the basis of the feature points transformed to the world coordinate system.
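The transform of image feature points to a world (road-plane) coordinate system is commonly implemented with a planar homography obtained from camera calibration. A minimal sketch follows; the homography matrix `H` shown is a made-up placeholder, since a real system would derive it from the camera's intrinsic and extrinsic parameters.

```python
import numpy as np

def to_world(points_px, H):
    """Map image pixel coordinates to road-plane coordinates via homography H.

    points_px: list of (u, v) pixel coordinates; H: 3x3 homography matrix.
    """
    pts = np.asarray(points_px, dtype=float)
    ones = np.ones((pts.shape[0], 1))
    homog = np.hstack([pts, ones]) @ H.T   # apply the 3x3 homography
    return homog[:, :2] / homog[:, 2:3]    # perspective divide

# Placeholder homography purely for illustration.
H = np.array([[0.01, 0.0, -3.2],
              [0.0, 0.02, -2.4],
              [0.0, 0.0, 1.0]])
world_pts = to_world([(320, 400), (330, 380)], H)
```

Fitting in world coordinates rather than pixel coordinates keeps the curve model metric, so the resulting offset and curvature have physical meaning.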
- the tracking of the lane may be performed on every fitted lane.
- the apparatus may further include a storage unit storing information about the feature points extracted, wherein the control unit fits the lane on the basis of the information about the feature points stored in the storage unit.
- an apparatus for detecting a lane including: a camera module capturing an image; a control unit extracting a plurality of feature points from the image, carrying out lane fitting to connect the plurality of feature points with a single line, and tracking the lane fitted; and a display unit displaying the lane tracked, wherein the lane fitting includes obtaining an offset representing lateral inclination of the lane of a lane fitting result at an arbitrary past time, estimating an offset at a current time with respect to the offset on the basis of driving information from the arbitrary past time to the current time, and carrying out curve fitting on the basis of the offset at the current time.
- a method of detecting a lane including: capturing an image; extracting a plurality of feature points from the image; carrying out lane fitting to connect the plurality of feature points with a single line; tracking the lane fitted; and displaying the lane tracked, wherein the carrying out of the lane fitting includes obtaining information about feature points extracted at an arbitrary past time; estimating positions of the feature points, extracted at the arbitrary past time, at a current time on the basis of driving information from the arbitrary past time to the current time; determining an offset representing lateral inclination of the lane on the basis of the positions at the current time; and carrying out curve fitting on the basis of the offset.
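The position-estimation step in the method above — propagating feature points stored at a past time to the current time using driving information — amounts to applying the vehicle's ego-motion to the stored points. The sketch below assumes a planar motion model with constant speed and yaw rate over the interval, which is a simplifying assumption rather than the patent's exact formulation.

```python
import math

def propagate_points(points, speed, yaw_rate, dt):
    """Estimate current-frame positions of feature points seen dt seconds ago.

    points: (x, y) in the past vehicle frame, x lateral and y forward (metres).
    speed: vehicle speed in m/s; yaw_rate: turn rate in rad/s (from steering).
    The vehicle drove forward speed*dt and turned yaw_rate*dt, so the stored
    points move by the inverse of that motion in the current vehicle frame.
    """
    dyaw = yaw_rate * dt
    dist = speed * dt
    cos_t, sin_t = math.cos(-dyaw), math.sin(-dyaw)
    moved = []
    for x, y in points:
        # Translate by the distance driven, then rotate by the heading change.
        xt, yt = x, y - dist
        moved.append((xt * cos_t - yt * sin_t, xt * sin_t + yt * cos_t))
    return moved
```

Once the old points are expressed in the current frame, they can be pooled with freshly extracted points before the offset determination and curve fitting, which is what allows broken-line segments seen in the past to support the current fit.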
- the carrying out of the curve fitting may include determining coefficients of a curve equation in arbitrary dimension on the basis of the curve equation.
- the driving information may be at least one of speed information, acceleration information, and steering information.
- the camera module may include at least one pair of cameras separated by a horizontal interval in the same central axis in the same plane, or a single camera.
- the extracting of the plurality of feature points may include: defining a region of interest only on a lower part of a road on the basis of the horizon within the image; and extracting the plurality of feature points only from the region of interest.
- the plurality of feature points may be extracted on the basis of gradient information or color information about the image.
- the fitting of the lane may include: transforming the plurality of feature points to a world coordinate system; and fitting the lane on the basis of the feature points transformed to the world coordinate system.
- the tracking of the lane may be performed on every fitted lane.
- the extracting of the feature points may further include storing information about the feature points, wherein the fitting of the lane includes fitting the lane on the basis of the information about the feature points.
- According to an apparatus for detecting a lane according to the present disclosure, all lanes appearing in a driving route are tracked and lanes are detected from the tracking result, so that lanes can be accurately detected even in changing road conditions such as interchanges, and new driving lanes can be detected quickly even when lanes are changed.
- In addition, when a curved road with high curvature is marked with a broken line, curve fitting is performed on the current lanes on the basis of feature points extracted in the past or a past lane fitting result, so that a result close to the actual lane can be obtained and, at the same time, an accurate offset can be determined.
- FIG. 1 is a block diagram illustrating the configuration of an apparatus for detecting a lane according to an exemplary embodiment of the present disclosure.
- FIG. 2 is a flowchart illustrating a lane detection process according to an exemplary embodiment of the present disclosure.
- FIG. 3 is a view illustrating an image according to an exemplary embodiment of the present disclosure.
- FIG. 4 is a view illustrating a result of extracting feature points according to an exemplary embodiment of the present disclosure.
- FIG. 5 is a view illustrating a result of transforming extracted feature points to a world coordinate system according to an exemplary embodiment of the present disclosure.
- FIG. 6 is a view illustrating a lane fitting result according to an exemplary embodiment of the present disclosure.
- FIG. 7 is a view displaying a lane tracking result according to an exemplary embodiment of the present disclosure.
- FIG. 8 is a flowchart illustrating a curve fitting process according to an exemplary embodiment of the present disclosure.
- FIG. 9 is a view illustrating one example of a road on which an apparatus for detecting a lane drives according to an exemplary embodiment of the present disclosure.
- FIG. 10 is a view illustrating the comparison between an actual lane and a curve fitting result lane.
- FIGS. 11a and 11b are views illustrating a result of obtaining information about feature points extracted at a past time according to an exemplary embodiment of the present disclosure.
- FIG. 12 is a view illustrating a result of estimating changed positions of feature points on the basis of driving information according to an exemplary embodiment of the present disclosure.
- FIG. 13 is a view illustrating a curve fitting result according to an exemplary embodiment of the present disclosure.
- Exemplary embodiments according to the present disclosure may stand alone and may also be applied to various types of terminals including mobile terminals, telematics terminals, smartphones, portable terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), notebook computers, tablet PCs, Wibro terminals, Internet protocol television (IPTV) terminals, televisions, 3D televisions, video equipment, navigation terminals, and audio video navigation (AVN) terminals.
- Embodiments described herein may be implemented in the form of program commands executable through various computer means and recorded in a computer-readable recording medium.
- Examples of the computer-readable recording medium may include a program command, a data file, and a data structure separately or in a combination thereof.
- the program command recorded in the recording medium may be a command designed or configured specially for an exemplary embodiment of the present invention, or one known and usable to a person having ordinary skill in the art.
- Examples of the computer-readable recording medium include hardware devices specially configured to store and execute a program command: magnetic media such as a hard disc, a floppy disc, and a magnetic tape; optical media such as Compact Disc-Read Only Memory (CD-ROM) or Digital Versatile Disc (DVD); magneto-optical media such as a floptical disk; ROM; RAM; and flash memory.
- Examples of the program command include a machine language code such as one created by a compiler, as well as a high-level language code executable by a computer using an interpreter.
- the hardware device can be configured as at least one software module to execute an operation of an exemplary embodiment of the present invention, and vice versa.
- FIG. 1 is a block diagram illustrating an apparatus for detecting a lane according to an exemplary embodiment of the present disclosure.
- the apparatus for detecting a lane may include a camera module 110, a control unit 120, a storage unit 130, an output unit 140, and a sensor unit 150.
- the camera module 110 is a camera system that captures front, rear, and/or lateral images at the same time by using a rotary reflector, a condenser lens, and an imaging device.
- the camera module 110 may also be applied to security facilities, surveillance cameras, and robot vision.
- the rotary reflector may have various shapes such as a conicoid or spherical, conical, or combined shape.
- the camera module 110 may include at least one pair of cameras (for example, stereo cameras or stereoscopic cameras) that are separated by a horizontal interval in the same central axis in the same arbitrary plane of the lane detection apparatus 100, or a single camera. At this time, the horizontal interval may be determined in consideration of the distance between the eyes of an average person, and may be set when the lane detection apparatus 100 is configured.
- the camera module 110 may be any camera module that can capture an image, for example a Charge Coupled Device (CCD) camera or a Complementary Metal Oxide Semiconductor (CMOS) camera.
- the camera module 110 may include at least one of a stereo camera and a moving stereo camera in order to capture an image of the front.
- the stereo camera is an image apparatus that is composed of a plurality of cameras. With an image that is captured by the camera module 110, two-dimensional information about surrounding areas of the camera module 110 can be provided. By using a plurality of images that are captured by a plurality of cameras in different directions, three-dimensional information about the surrounding areas of the camera module 110 can be obtained.
- the moving stereo camera refers to a camera that keeps vergence fixed on observed obstacles by actively changing the position of the stereo camera according to the distance of the obstacles.
- the stereo camera generally includes two cameras that are arranged next to each other to capture images, and the distance of the obstacles can be calculated according to stereo disparity of the captured images.
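The distance calculation from stereo disparity mentioned above follows the standard pinhole relation Z = f·B/d, where f is the focal length in pixels, B the baseline between the two cameras, and d the disparity. The focal length and baseline values below are illustrative assumptions, not parameters from the patent.

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Distance to a point from its stereo disparity.

    disparity_px: horizontal pixel shift of the point between the left and
    right images; focal_px: focal length in pixels; baseline_m: horizontal
    camera separation in metres.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# e.g. 700 px focal length, 0.12 m baseline, 8 px disparity -> 10.5 m
distance = depth_from_disparity(8, 700, 0.12)
```

Because depth is inversely proportional to disparity, range accuracy degrades quadratically with distance, which is one reason vergence control is useful for distant obstacles.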
- the stereo camera is a passive camera in which the optical axes are always arranged parallel and fixed.
- the moving stereo camera can fix vergence by actively changing the geometric position of the optical axis.
- the control of vergence of the stereo camera according to the distance of the obstacles is called vergence control.
- the vergence-controlled stereo camera may keep the stereo disparity of a moving object constant, thereby providing a 3D image observer with more natural 3D images and also providing useful information for measuring the distance of obstacles and processing stereo images.
- the control unit 120 may control the general operation of the lane detection apparatus 100.
- the control unit 120 may perform the control of various types of power driving units in order for the lane detection apparatus 100 to drive.
- the control unit 120 processes an image received from the camera module 110, carries out lane detection, and processes other operations.
- the control unit 120 may use driving information collected by the sensor unit 150 for the above-described lane detection.
- a lane detection process of the control unit 120 will be described in detail with reference to FIGS. 2 to 13.
- control unit 120 may perform functions related to lane keeping (including a lane departure warning message function and an automatic lane keeping function) on the basis of the position of the lane detection apparatus 100 (or a vehicle having the lane detection apparatus 100) that is detected by an arbitrary GPS module and the detected lane.
- the storage unit 130 may store data and programs for the operation of the control unit 120 and temporarily store data being input/output.
- the storage unit 130 may temporarily store an image received by the camera module 110, processing information related to the image, and lane detection information.
- the storage unit 130 may store operation expressions (for example, curve equations) used to process the image.
- the storage unit 130 may store feature points extracted at an arbitrary time during the lane detection process, or a lane fitting result.
- the stored feature points extracted at the arbitrary time in the past or the stored lane fitting result may be used to perform the current lane fitting.
- the storage unit 130 may store software components including an operating system, a module performing a wireless communication unit, a module operating together with a user input unit, a module operating together with an A/V input unit, and a module operating together with the output unit 140.
- the operating system may be, for example, LINUX, UNIX, OS X, WINDOWS, Chrome, Symbian, iOS, Android, VxWorks, or another embedded operating system.
- the storage unit 130 may include at least one storage medium including a flash memory type, a hard disk type, a multimedia card micro type, a memory card type (e.g., SD or DX memory), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-only Memory (EEPROM), Programmable Read-only Memory (PROM), magnetic memory, magnetic disk, and optical disk.
- the lane detection apparatus 100 may operate in relation with a web storage that performs the storage function of the storage unit 130 on the Internet.
- the output unit 140 generates outputs related to sight, hearing, or touch and may include a display unit 141 and a sound output module 142.
- the display unit 141 may output information that is processed by the lane detection apparatus 100. For example, when the lane detection apparatus 100 is driving, the display unit 141 may display a UI (User Interface) or a GUI (Graphic User Interface) related to driving.
- the display unit 141 may display the images captured by the camera module 110 of the lane detection apparatus 100 and/or information about lanes detected by the control unit 120.
- the display unit 141 displays the images and the information about the detected lanes at the same time.
- the images and the information may be displayed separately at top and bottom or left and right, or the information about the detected lanes may be overlapped with the images.
- the display unit 141 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED), a flexible display, and a 3D display.
- the displays may be transparent type or transmissive type displays, which may be called transparent displays.
- Typical examples of the transparent displays may include a Transparent OLED (TOLED).
- a rear structure of the display unit 141 may also be formed of a transmissive structure.
- two or more display units 141 may exist.
- a plurality of display units may be arranged separately or integrally on one surface of the lane detection apparatus 100, or may be arranged separately on different surfaces thereof.
- the display unit 141 may have a layered structure with a touch sensor that senses a touch.
- the display unit 141 may serve as an input device as well as an output device.
- the touch sensor may be formed as a touch film, a touch sheet, a touch pad, or a touch panel.
- the touch sensor may be configured to convert variations in pressure applied to a particular portion of the display unit 141 or capacitance generated at the particular portion of the display unit 141 into electrical input signals.
- the touch sensor may be configured to sense pressure applied when being touched as well as touch position and touch area.
- when there is a touch input to the touch sensor, a signal corresponding thereto is sent to a touch controller.
- the touch controller processes the signal and transmits corresponding data to the control unit 120. In this manner, the control unit 120 is informed which area of the display unit 141 is touched.
- the sound output module 142 may output audio data stored in the storage unit 130 in recording mode and voice recognition mode.
- the sound output module 142 may output sound signals related to a lane detection result (for example, an alarm regarding a kind of a detected lane) and functions regarding lane detection (for example, lane departure warning and automatic lane keeping alarm).
- the sound output module 142 may include a receiver, a speaker, and a buzzer.
- the sensor unit 150 collects driving information of the lane detection apparatus 100 and may include a speed sensor 151 and a steering sensor 152.
- the speed sensor 151 senses a speed of the lane detection apparatus 100. Since the gear ratio of the differential gear and the size of the tires are fixed in the lane detection apparatus 100, the speed sensor 151 may calculate the speed on the basis of the rotation rate of a transmission output shaft or of the wheels. In addition, the speed sensor 151 may be of a reed switch type, a magnetic type, or a Hall type.
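The relation the speed sensor relies on — vehicle speed from wheel rotation rate and tire size — is simply the tire circumference times the rotation rate. A minimal sketch, where the tire diameter is an assumed value rather than one given in the text:

```python
import math

def vehicle_speed(wheel_rpm, tire_diameter_m):
    """Vehicle speed in m/s from wheel rotation rate and tire size.

    wheel_rpm: wheel revolutions per minute; tire_diameter_m: rolling
    diameter of the tire in metres.
    """
    circumference = math.pi * tire_diameter_m   # distance per revolution
    return circumference * wheel_rpm / 60.0     # metres per second

# Assumed 0.65 m tire at 500 rpm.
v = vehicle_speed(500, 0.65)
```

The same speed value, together with the steering sensor output, supplies the driving information used when propagating past feature points to the current time.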
- the speed sensor 151 may sense acceleration of the lane detection apparatus 100 on the basis of variations of the speed of lane detection apparatus 100.
- the lane detection apparatus 100 may include a separate acceleration sensor to sense the acceleration of the lane detection apparatus 100.
- the steering sensor 152 senses a steering motion of the lane detection apparatus 100, that is, a wheel angle. Therefore, the steering sensor 152 is interlocked with a steering wheel of the lane detection apparatus 100 and may include a rotating rotor, a gear rotating integrally with the rotor, a sensing unit sensing a phase change caused by the rotation of a magnetic substance that generates magnetic forces, an operation unit that operates and outputs an input of the sensing unit, and a PCB and a housing to mount the operation unit.
- the lane detection apparatus 100 may include a communication unit that performs communications with an arbitrary terminal or server under the control of the control unit 120.
- the communication unit may include a wired/wireless communication module.
- the wireless internet technique may include a wireless local area network (WLAN), Wi-Fi, wireless broadband (WiBro), world interoperability for microwave access (WiMAX), high speed downlink packet access (HSDPA), IEEE 802.16, long term evolution (LTE), and wireless mobile broadband service (WMBS).
- Examples of the short range communication technology may include Bluetooth, Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra-WideBand (UWB), and ZigBee.
- the wired communication technology may include Universal Serial Bus (USB) communication.
- the communication module may include CAN communication, Ethernet for a vehicle, flexray, and a Local Interconnect Network (LIN) in order to perform communications with an arbitrary vehicle having the lane detection apparatus 100.
- the communication unit may receive an image captured by an arbitrary camera module from the arbitrary terminal or server. Moreover, the communication unit may transmit lane detection information about the arbitrary image to the arbitrary terminal or server under the control of the control unit 120.
- the lane detection apparatus 100 may be implemented with a larger number of components than those shown in FIG. 1, or the lane detection apparatus 100 may be implemented with a smaller number of components.
- FIG. 2 is a flowchart illustrating a lane detection process according to the present disclosure.
- the lane detection apparatus 100 obtains an image S21.
- the camera module 110 may obtain a first image and a second image that are captured by at least one pair of cameras (for example, stereo cameras or stereoscopic cameras) that are separated by a horizontal interval in the same central axis in the same plane of the lane detection apparatus 100, or an image that is captured by a single camera.
- the first image may be a left image captured by a left camera included in the one pair of cameras
- the second image may be a right image captured by a right camera included in the one pair of cameras.
- the camera module 110 may receive any one of the first image and the second image that are captured by the one pair of cameras.
- the camera module 110 may obtain an image 310 as shown in FIG. 3.
- the lane detection apparatus 100 extracts feature points of a lane S22.
- the control unit 120 extracts a plurality of feature points (edge points) 410 that are present in the image.
- the control unit 120 may set only the part of the road below the horizon as a Region of Interest (ROI), and extract the feature points 410 only within the region of interest.
- the feature points 410 may be extracted by using various algorithms.
- the control unit 120 may extract the feature points 410 on the basis of gradient information of the obtained image 310. That is, when brightness or gradations of color between adjacent pixels of the obtained image 310 gradually change, the control unit 120 may not regard this as the feature points 410. On the other hand, when brightness or gradations of color between adjacent pixels of the obtained image 310 drastically change, the control unit 120 may regard this as the feature points 410 and extract corresponding pixel information.
- the feature points 410 may be formed of discontinuous points on the boundary of two regions that are distinct in terms of brightness or gradations of color between the pixels.
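A minimal sketch of the gradient-based extraction described above: a pixel is kept as a feature point when brightness changes drastically between adjacent pixels and ignored when it changes gradually. The threshold value and the horizontal-only comparison are simplifying assumptions, not details from the disclosure.

```python
def extract_feature_points(image, threshold=50):
    """Return (row, col) positions where adjacent-pixel brightness
    changes drastically; gradual changes are not regarded as features.

    `image` is a 2D list of brightness values (0-255); the threshold
    is an illustrative assumption.
    """
    points = []
    for r, row in enumerate(image):
        for c in range(1, len(row)):
            if abs(row[c] - row[c - 1]) >= threshold:  # drastic change
                points.append((r, c))
    return points

# A dark road (20) crossed by a bright lane marking (200) yields the
# two boundary pixels of the marking:
img = [[20, 20, 200, 200, 20]]
print(extract_feature_points(img))  # [(0, 2), (0, 4)]
```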
- the control unit 120 may extract feature points 410 on the basis of color information about the obtained image 310.
- In general, on roads, general lanes appear white, centerlines appear yellow, and the parts other than lanes appear black. Therefore, the control unit 120 may extract the feature points 410 on the basis of color information about lanes. That is, the control unit 120 may create a region having colors that can be classified as lanes from the image 310 as one object, classify only the region corresponding to the road as a region of interest in order to exclude other objects driving on the road, and extract the feature points 410 from the object created on the basis of the color information within the region of interest.
- the present invention is not limited thereto, and the feature points 410 may be extracted by various types of algorithms or filters for extracting feature points, such as an Edge Tracing algorithm or a Boundary Flowing algorithm.
- control unit 120 may store information about the extracted feature points in the storage unit 130 and then use the information in order to fit a lane from an image to be captured later. That is, the control unit 120 may determine an offset of a lane on the basis of information about feature points extracted in the past and perform curve fitting on the current lane on the basis of the determined offset value.
- the control unit 120 may transform the extracted feature points 410 to a world coordinate system.
- the control unit 120 may use transformation matrices or coordinate transformation equations.
- the transformation matrices may be homographic matrices that are stored in advance in the storage unit 130.
- the control unit 120 checks that the vertical and horizontal intervals of the plurality of feature points 510 transformed to the world coordinate system remain consistent, thereby detecting errors that occur during coordinate transformation.
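The transformation to the world coordinate system with a stored homographic matrix can be sketched as below; the 3x3 matrix shown is a placeholder for one obtained by calibration and stored in advance in the storage unit 130.

```python
def apply_homography(H, point):
    """Map an image point (u, v) to world coordinates with a 3x3
    homography H, dividing by the homogeneous coordinate w."""
    u, v = point
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return (x / w, y / w)

# With the identity matrix as a placeholder homography, points are
# left unchanged:
H_identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(apply_homography(H_identity, (120, 80)))  # (120.0, 80.0)
```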
- the lane detection apparatus 100 fits the lane S23.
- control unit 120 carries out lane fitting in order to express the extracted feature points 510 as a single line 610.
- the control unit 120 may use any one of a least squares method, Random Sample Consensus (RANSAC), a generalized Hough transform method, and a spline interpolation method in order to extract a straight line or a curved line from the image.
- the control unit 120 may carry out lane fitting on the basis of a curve equation. Specifically, the control unit 120 substitutes the extracted feature points 510 into the curve equation to obtain coefficients and carries out curve fitting on the basis of a result of the curve equation whose coefficients are obtained.
- the curve equation is stored in advance in the storage unit 130 and may be an arbitrary multidimensional equation.
- the curve equation may be, for example, a cubic equation y = ax^3 + bx^2 + cx + d, in which a is the curvature derivative, b is the curvature, c is the heading angle, and d is the offset.
- the control unit 120 determines that the feature points 510 form a straight line if a is 0 and that the feature points 510 form a curve if a is not 0. If both a and b are 0, a straight line is detected, in which case c is the heading angle and d is the offset.
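A sketch of obtaining the coefficients by substituting the feature points into the curve equation, here read as the cubic y = ax^3 + bx^2 + cx + d and solved by least squares; the sample points and the use of NumPy are illustrative assumptions.

```python
import numpy as np

def fit_lane(xs, ys):
    """Least-squares fit of y = a*x^3 + b*x^2 + c*x + d to feature points."""
    a, b, c, d = np.polyfit(xs, ys, 3)
    return a, b, c, d

# Feature points lying on a straight lane y = 0.5*x + 2 give a and b
# close to 0, with c the heading-related slope and d the offset:
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.5 * x + 2.0 for x in xs]
a, b, c, d = fit_lane(xs, ys)
print(abs(a) < 1e-9, abs(b) < 1e-9, round(c, 3), round(d, 3))
```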
- control unit 120 may store the fitting result in the storage unit 130 and use the stored fitting result for the next lane fitting.
- control unit 120 may carry out curve fitting on the current lane on the basis of the feature points extracted in the past or the past lane fitting result.
- control unit 120 may determine an offset of a lane on the basis of the feature points extracted in the past or the past lane fitting result first, and then carry out curve fitting on the current lane on the basis of the determined offset value.
- the lane detection apparatus 100 tracks the lane S24.
- control unit 120 carries out tracking on the basis of the plurality of feature points 510 corresponding to the fitted lane.
- calibration refers to calculating the transformation relationship between the camera module 110 and the world coordinate system.
- control unit 120 may carry out tracking on all of the plurality of fitted lanes. That is, the control unit 120 may carry out tracking on adjacent lanes present in the driving road in addition to the lane in which the lane detection apparatus 100 is driving.
- control unit 120 can quickly detect a lane from the existing tracking information without newly detecting a lane even when the lane detection apparatus 100 changes lanes.
- control unit 120 may estimate and detect lanes on the basis of tracking information of the plurality of lanes (for example, positions of lanes, a lane width, and curve equations of curved lanes).
- control unit 120 may store the tracking result in the storage unit 130.
- control unit 120 can correct errors on the basis of the stored tracking result even when some errors occur during lane fitting.
- control unit 120 may store the tracking result in the storage unit 130 and use the tracking result in order to fit a lane from an image to be obtained later. That is, the control unit 120 may determine an offset of a lane on the basis of the past lane tracking result first, and then carry out curve fitting on the current lane on the basis of the determined offset value.
- the lane detection apparatus 100 may display a lane tracking result S25.
- the output unit 140 may display the tracking result through the lane detection process as shown in FIG. 7.
- the output unit 140 may display the image 310 captured by the camera module 110 of the lane detection apparatus 100 and/or a lane tracking result 710 detected by the control unit 120.
- the output unit 140 displays the image 310 and the detected lane tracking result 710 at the same time.
- the image 310 and the detected lane tracking result 710 may be displayed separately at top and bottom or left and right, or the information about the detected lane may be overlapped with the image 310.
- the output unit 140 may output sound signals related to a lane detection result (for example, an alarm regarding a kind of a detected lane) and functions regarding lane detection (for example, lane departure warning and automatic lane keeping alarm).
- FIG. 8 is a flowchart illustrating a curve fitting process according to an exemplary embodiment of the present disclosure.
- the lane detection apparatus 100 obtains information about feature points extracted in the past S231.
- the control unit 120 may carry out lane fitting by using a curve equation.
- when road changes are serious, for example, when a curved road with high curvature is formed of a broken line and the curved road ends and begins again in front of a vehicle, the accuracy of the offset of the detected lane may significantly decrease.
- the offset represents whether the lane is located at the left or right of the lane detection apparatus 100.
- the offset refers to the value of the constant term (the zeroth-order coefficient) of the curve equation. That is, when the constant term has a positive value, the lane is located at the right side of the lane detection apparatus 100. On the other hand, when the constant term has a negative value, the lane is located at the left side of the lane detection apparatus 100.
- the lane detection apparatus 100 may drive in an actual lane 910 that has high curvature and is formed of a broken line. Since the actual lane 910 is formed of a broken line, the actual lane 910 may be divided into blank segments 901 and line segments 902. Since the lane detection apparatus 100 is driving, the lane detection apparatus 100 is located at 80 on the x-axis at an arbitrary past time t-1 and at 20 on the x-axis at a current time t.
- the control unit 120 may acquire information about the feature points 520 extracted at the past time t-1.
- the feature points 520 were extracted at an arbitrary time in the past, that is, the time t-1, and may be obtained from the storage unit 130.
- the arbitrary time in the past may be determined beforehand or may vary flexibly. In order to accurately detect the lanes of the road on which the lane detection apparatus 100 drives, the arbitrary time should not be set to a time in the distant past.
- when the control unit 120 obtains the information about the feature points 520 at the current time t, as shown in FIG. 11b, the information still reflects the positions at the past time t-1 and cannot be used as it is to fit a lane at the current time t. Therefore, the control unit 120 needs to correct the information about the feature points 520 by estimating correct values with respect to the current time t.
- the lane detection apparatus 100 estimates the changed positions of the feature points on the basis of the driving information S232.
- the information about the feature points 520 needs to be estimated to a correct value with respect to the current time t.
- the scene captured by the lane detection apparatus 100 at the past time t-1 appears, at the current time t, to have moved to the opposite side and to the rear, because the lane detection apparatus 100 has moved to one side and forward (a result of analyzing the curvilinear motion).
- control unit 120 may estimate the changed positions of the feature points 520 in order to correct the positions of the feature points 520 at the past time t-1 into the positions at the current time t.
- control unit 120 may estimate the changed positions on the basis of the driving information of the lane detection apparatus 100 that is collected by the sensor unit 150.
- the driving information may include speed information, acceleration information, or steering motion information of the lane detection apparatus 100 that is collected by the sensor unit 150.
- the control unit 120 may determine whether the positions of the feature points 520 are changed from top to bottom or from left to right, on the basis of the collected driving information. In addition, as shown in FIG. 12, the control unit 120 may change the feature points 520 to the estimated positions (520') on the basis of a result of the determination.
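Under a simplified planar motion model (an assumption; the disclosure does not specify the model), step S232 can be sketched as shifting and rotating the stored points opposite to the vehicle's own motion derived from the speed and steering information:

```python
import math

def estimate_positions(points, speed_mps, yaw_rate_rps, dt):
    """Estimate where points stored at time t-1 lie at time t, in
    vehicle coordinates (x forward, y left). The planar motion model
    and the sensor-derived yaw rate are simplifying assumptions."""
    dist = speed_mps * dt            # distance the vehicle travelled
    dyaw = yaw_rate_rps * dt         # heading change of the vehicle
    cos_y, sin_y = math.cos(-dyaw), math.sin(-dyaw)
    moved = []
    for x, y in points:
        x_shift = x - dist           # vehicle moved forward -> points move rearward
        moved.append((x_shift * cos_y - y * sin_y,
                      x_shift * sin_y + y * cos_y))
    return moved

# Driving straight at 10 m/s for 0.1 s shifts a point 1 m rearward:
print(estimate_positions([(5.0, 1.5)], 10.0, 0.0, 0.1))  # [(4.0, 1.5)]
```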
- the lane detection apparatus 100 determines an offset on the basis of the feature points whose positions are estimated S233.
- the control unit 120 determines a constant term that represents an offset of a lane among coefficients of a curve equation on the basis of the feature points 520 whose positions are estimated.
- control unit 120 may determine a value c that represents an offset of a lane as coordinate values according to the positions of the feature points 520 whose positions are estimated.
- control unit 120 can accurately determine the offset of the lane on the basis of the information at the past time and the driving information of the lane detection apparatus 100.
- the control unit 120 may determine the offset on the basis of the lane fitting result at the past time instead of the feature points 520 whose positions are estimated. For example, the control unit 120 obtains the value c representing the offset of the lane from an equation that represents a lane detected from the lane fitting result at the past time, estimates a change in the value c on the basis of the driving information, and determines the changed estimated value as the offset at the current time.
- the lane detection apparatus 100 carries out curve fitting at the current time S234.
- the control unit 120 may carry out curve fitting on the lane at the current time on the basis of the determined offset to thereby obtain the other coefficients of the curve equation.
- the value c is determined in advance as the offset on the basis of the feature points 520 whose positions are estimated, independently of the curve fitting, and curve fitting is then performed on the basis of the extracted feature points 510 to determine the other constants a and b.
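A sketch of this constrained fitting: the constant term (the offset, called c in this passage) is fixed at the value determined from the estimated past feature points, and least squares determines only the remaining coefficients a and b. The quadratic form and the sample points are illustrative assumptions.

```python
import numpy as np

def fit_with_fixed_offset(xs, ys, offset):
    """Fit y = a*x^2 + b*x + offset with the constant term fixed;
    return (a, b). The quadratic form is an illustrative assumption."""
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    A = np.column_stack([xs ** 2, xs])   # design matrix for a and b only
    residual = ys - offset               # subtract the fixed constant term
    (a, b), *_ = np.linalg.lstsq(A, residual, rcond=None)
    return a, b

# Points on y = 0.01*x^2 + 0.2*x - 1.5 with the offset fixed at -1.5:
xs = [0.0, 10.0, 20.0, 30.0]
ys = [0.01 * x ** 2 + 0.2 * x - 1.5 for x in xs]
a, b = fit_with_fixed_offset(xs, ys, offset=-1.5)
print(round(a, 4), round(b, 4))  # 0.01 0.2
```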
- the control unit 120 can determine an accurate offset of the actual lane 910 by using the information about the feature points 520 extracted at the past time (see FIG. 9). In addition, by carrying out curve fitting on the basis of the correctly determined offset, the lane can be accurately detected, as shown by the lane fitting result 930 at the current time.
Abstract
Disclosed is an apparatus for detecting a lane, the apparatus including: a camera module capturing an image; a control unit extracting a plurality of feature points from the image, carrying out lane fitting to connect the plurality of feature points with a single line, and tracking the lane fitted; and a display unit displaying the lane tracked, wherein the lane fitting includes obtaining information about feature points extracted at an arbitrary past time, estimating positions of the feature points, extracted at the past time, at a current time on the basis of driving information from the arbitrary past time to the current time, determining an offset representing lateral inclination of the lane on the basis of the positions at the current time, and carrying out curve fitting on the basis of the offset. In addition, disclosed is a method of detecting a lane, the method including: capturing an image; extracting a plurality of feature points from the image; carrying out lane fitting to connect the plurality of feature points with a single line; tracking the lane fitted; and displaying the lane tracked, wherein the carrying out of the lane fitting includes obtaining information about feature points extracted at an arbitrary past time; estimating positions of the feature points, extracted at the arbitrary past time, at a current time on the basis of driving information from the arbitrary past time to the current time; determining an offset representing lateral inclination of the lane on the basis of the positions at the current time; and carrying out curve fitting on the basis of the offset.
Description
The present invention relates to an apparatus and method for detecting a lane.
In general, when a driving system is moving by tracking lines, the driving system needs to detect the lines first by using an image recognition apparatus in order to detect that the driving system is leaving its line and send a warning to the outside.
That is, a recognition apparatus using images captures images of a target to be recognized by a driving system by using a camera, extracts features of the target by using digital image processing techniques, and checks the target by using the extracted features. In order for such an image recognition apparatus to perform its original functions smoothly, a line, which is a target, needs to be more accurately extracted from a moving object.
These image recognition apparatuses are widely applicable to cars, robot technology (RT), automated guided vehicles (AGV), and the like. In particular, in the automotive field, since accurate lane detection needs to be ensured even during high-speed driving, the technical difficulty is higher than in any other field. Here, a lane refers to a reference line for driving and becomes a basis on which all driving actions such as going forward, going backward, changing lanes, changing directions, forward parking, reverse parking, and perpendicular parking are performed.
A method of detecting a lane is mainly applied to advanced safety vehicles (ASV) that are currently being high-functioning and intelligent. This method of detecting a lane is applied to a lane departure warning system to prevent dozing off behind the wheel, a rear parking guide and perpendicular parking guide system to help novice drivers park their cars, and a lane keeping system to apply torque to a steering wheel in a dangerous situation to keep a lane. The application of the lane detection method has gradually been expanded.
Methods of detecting a lane according to the above-described techniques of capturing images of driving roads and detecting lanes from the captured images include mapping lane points corresponding to coordinates of the captured images to the two-dimensional coordinate system, and detecting and displaying positions of the lanes.
Here, when a lane is curved, a curve is detected by using a curve equation from the obtained lane points or a lane width. However, according to this method, when a curved road with high curvature is formed of a broken line, and the road ends and begins again in front of a vehicle, the accuracy of the detected lanes may sharply decrease.
Therefore, an object of the present invention is to provide an apparatus and method for accurately detecting a lane even when road changes are serious such as when a curved road with high curvature is formed of a broken line and the road ends in front of a vehicle and begins again.
To achieve these and other advantages and in accordance with the purpose of the present invention, as embodied and broadly described herein, there is provided an apparatus for detecting a lane, the apparatus including: a camera module capturing an image; a control unit extracting a plurality of feature points from the image, carrying out lane fitting to connect the plurality of feature points with a single line, and tracking the lane fitted; and a display unit displaying the lane tracked, wherein the lane fitting includes obtaining information about feature points extracted at an arbitrary past time, estimating positions of the feature points, extracted at the past time, at a current time on the basis of driving information from the arbitrary past time to the current time, determining an offset representing lateral inclination of the lane on the basis of the positions at the current time, and carrying out curve fitting on the basis of the offset.
The curve fitting may include determining coefficients of a curve equation in arbitrary dimension on the basis of the curve equation.
The driving information may be at least one of speed information, acceleration information, and steering information.
The camera module may include at least one pair of cameras separated by a horizontal interval in the same central axis in the same plane, or a single camera.
The plurality of feature points may be extracted only from a region of interest defined only on a lower part of a road on the basis of the horizon within the image.
The plurality of feature points may be extracted on the basis of gradient information or color information of the image.
The control unit may transform the plurality of feature points to a world coordinate system and fit the lane on the basis of the feature points transformed to the world coordinate system.
The tracking of the lane may be performed on every fitted lane.
The apparatus may further include a storage unit storing information about the feature points extracted, wherein the control unit fits the lane on the basis of the information about the feature points stored in the storage unit.
According to another aspect of the present invention, there is provided an apparatus for detecting a lane, the apparatus including: a camera module capturing an image; a control unit extracting a plurality of feature points from the image, carrying out lane fitting to connect the plurality of feature points with a single line, and tracking the lane fitted; and a display unit displaying the lane tracked, wherein the lane fitting includes obtaining an offset representing lateral inclination of the lane of a lane fitting result at an arbitrary past time, estimating an offset at a current time with respect to the offset on the basis of driving information from the arbitrary past time to the current time, and carrying out curve fitting on the basis of the offset at the current time.
According to another aspect of the present invention, there is provided a method of detecting a lane, the method including: capturing an image; extracting a plurality of feature points from the image; carrying out lane fitting to connect the plurality of feature points with a single line; tracking the lane fitted; and displaying the lane tracked, wherein the carrying out of the lane fitting includes obtaining information about feature points extracted at an arbitrary past time; estimating positions of the feature points, extracted at the arbitrary past time, at a current time on the basis of driving information from the arbitrary past time to the current time; determining an offset representing lateral inclination of the lane on the basis of the positions at the current time; and carrying out curve fitting on the basis of the offset.
The carrying out of the curve fitting may include determining coefficients of a curve equation in arbitrary dimension on the basis of the curve equation.
The driving information may be at least one of speed information, acceleration information, and steering information.
The camera module may include at least one pair of cameras separated by a horizontal interval in the same central axis in the same plane, or a single camera.
The extracting of the plurality of feature points may include: defining a region of interest only on a lower part of a road on the basis of the horizon within the image; and extracting the plurality of feature points only from the region of interest.
The plurality of feature points may be extracted on the basis of gradient information or color information about the image.
The fitting of the lane may include: transforming the plurality of feature points to a world coordinate system; and fitting the lane on the basis of the feature points transformed to the world coordinate system.
The tracking of the lane may be performed on every fitted lane.
The extracting of the feature points may further include storing information about the feature points, wherein the fitting of the lane includes fitting the lane on the basis of the information about the feature points.
According to an apparatus for detecting a lane according to the present disclosure, all lanes appearing in a driving route are tracked, and lanes are detected from a tracking result, so that lanes can be accurately detected even in changing road conditions such as interchanges and new driving lanes can be detected quickly even when lanes are changed.
In addition, according to an apparatus for detecting a lane according to the present disclosure, when a curved road with high curvature is formed of a broken line, curve fitting is performed on the current lanes on the basis of feature points extracted in the past or a lane fitting result in the past, so that a result close to the actual lane can be obtained and at the same time, an accurate offset can be determined.
FIG. 1 is a block diagram illustrating the configuration of an apparatus for detecting a lane according to an exemplary embodiment of the present disclosure;
FIG. 2 is a flowchart illustrating a lane detection process according to an exemplary embodiment of the present disclosure;
FIG. 3 is a view illustrating an image according to an exemplary embodiment of the present disclosure;
FIG. 4 is a view illustrating a result of extracting feature points according to an exemplary embodiment of the present disclosure;
FIG. 5 is a view illustrating a result of transforming extracted feature points to a world coordinate system according to an exemplary embodiment of the present disclosure;
FIG. 6 is a view illustrating a lane fitting result according to an exemplary embodiment of the present disclosure;
FIG. 7 is a view displaying a lane tracking result according to an exemplary embodiment of the present disclosure;
FIG. 8 is a flowchart illustrating a curve fitting process according to an exemplary embodiment of the present disclosure;
FIG. 9 is a view illustrating one example of a road on which an apparatus for detecting a lane drives according to an exemplary embodiment of the present disclosure;
FIG. 10 is a view illustrating the comparison between the actual lane and a curve fitting result lane;
FIGS. 11a and 11b are views illustrating a result of obtaining information about feature points extracted at a past time according to an exemplary embodiment of the present disclosure;
FIG. 12 is a view illustrating a result of estimating changed positions of feature points on the basis of driving information according to an exemplary embodiment of the present disclosure; and
FIG. 13 is a view illustrating a curve fitting result according to an exemplary embodiment of the present disclosure.
Exemplary embodiments according to the present disclosure may stand alone and may also be applied to various types of terminals including mobile terminals, telematics terminals, smartphones, portable terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), notebook computers, tablet PCs, Wibro terminals, Internet protocol television (IPTV) terminals, televisions, 3D televisions, video equipment, navigation terminals, and audio video navigation (AVN) terminals.
Embodiments described herein may be implemented in the form of program commands, executable through various computer means and recorded in a computer-readable recording medium. Examples of the computer-readable recording medium may include a program command, a data file, and a data structure separately or in combination. The program command recorded in a recording medium may be a command designed or configured specially for an exemplary embodiment of the present invention, or one known and usable by a person having ordinary skill in the art. Examples of the computer-readable recording medium may include hardware devices specially configured to store and execute a program command, such as magnetic media including a hard disc, a floppy disc, and a magnetic tape; optical media such as Compact Disc-Read Only Memory (CD-ROM) or Digital Versatile Disc (DVD); magneto-optical media such as a floptical disk; ROM; RAM; and flash memory. Examples of the program command may include not only a machine language code such as one created by a compiler but also a high-level language code executable by a computer using an interpreter. The hardware device can be configured as at least one software module to execute an operation of an exemplary embodiment of the present invention, and vice versa.
Technical terms used in the present disclosure are merely used to illustrate specific embodiments and should not be understood as limiting the present invention. Unless defined differently, all terms used herein, including technical or scientific terms, have the same meaning as those generally understood by a person of ordinary skill in the art to which the present disclosure belongs, and should not be construed in an excessively comprehensive or an excessively restricted meaning. In addition, if a technical term used in the present disclosure is an erroneous term that fails to clearly express the idea of the present disclosure, it should be replaced by a technical term that can be properly understood by the person skilled in the art.
A singular representation may include a plural representation unless it conveys a definitely different meaning in context. The terms "comprise" or "include" used herein should be understood to indicate the existence of several components or several steps disclosed in the specification; part of the components or steps may not be included, or additional components or steps may further be included.
The suffixes attached to components in the following description, such as "module" and "unit", are given or used interchangeably to facilitate the detailed description of the present disclosure. Therefore, the suffixes do not have different meanings from each other.
Moreover, detailed descriptions related to well-known functions or configurations will be omitted in order not to unnecessarily obscure the subject matter of the present invention. Also, the accompanying drawings are given for easy understanding of the preferred embodiments of the present invention, and should not be construed as limiting the spirit of the present invention disclosed herein.
Hereinafter, the exemplary embodiments disclosed in the present disclosure will be described in detail with the accompanying drawings.
FIG. 1 is a block diagram illustrating an apparatus for detecting a lane according to an exemplary embodiment of the present disclosure.
With reference to FIG. 1, the apparatus for detecting a lane (hereinafter, also referred to as a "lane detection apparatus") 100 may include a camera module 110, a control unit 120, a storage unit 130, an output unit 140, and a sensor unit 150.
The camera module 110 is a camera system that captures front, rear, and/or lateral images at the same time by using a rotary reflector, a condenser lens, and an imaging device. Such a camera module may also be applied to security facilities, surveillance cameras, and robot vision. The rotary reflector may have various shapes, such as a conicoid, spherical, conical, or combined shape. The camera module 110 may include at least one pair of cameras (for example, stereo cameras or stereoscopic cameras) that are separated by a horizontal interval on the same central axis in the same arbitrary plane of the lane detection apparatus 100, or a single camera. The horizontal interval may be determined in consideration of the distance between the eyes of an average person, and may be set when the lane detection apparatus 100 is configured. In addition, the camera module 110 may be any camera module that can capture an image.
In addition, as the imaging device of the camera module 110, a Charge Coupled Device (CCD) or a Complementary Metal Oxide Semiconductor (CMOS) may be used. Since the image (that is, a front image) projected onto an imaging surface of the imaging device is reflected from the rotary reflector and may be distorted, it may not be fit for humans to observe as it is. Therefore, for accurate observation, the camera module 110 may transform the coordinates of the output of the imaging device by using a microprocessor to thereby create a new panorama image.
The camera module 110 may include at least one of a stereo camera and a moving stereo camera in order to capture an image of the front.
The stereo camera is an image apparatus that is composed of a plurality of cameras. With an image that is captured by the camera module 110, two-dimensional information about surrounding areas of the camera module 110 can be provided. By using a plurality of images that are captured by a plurality of cameras in different directions, three-dimensional information about the surrounding areas of the camera module 110 can be obtained.
The moving stereo camera refers to a camera that controls vergence toward observed obstacles as its position actively changes according to the distance to the obstacles. The stereo camera generally includes two cameras arranged next to each other to capture images, and the distance to the obstacles can be calculated from the stereo disparity of the captured images.
The stereo camera is a passive camera in which the optical axes are always arranged parallel and fixed. On the other hand, the moving stereo camera can control vergence by actively changing the geometric position of its optical axes.
The control of vergence of the stereo camera according to the distance to the obstacles is called vergence control. A vergence-control stereo camera may keep the stereo disparity related to a moving object constant, thereby providing an observer with more natural 3D images, and also provides useful information for measuring the distance to the obstacles and processing stereo images.
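As an illustrative sketch only (not part of the disclosed embodiments), the distance calculation from stereo disparity mentioned above can be expressed under a simple pinhole camera model; the function name and the focal length, baseline, and disparity values below are hypothetical:

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point under the pinhole stereo model: Z = f * B / d.

    focal_px: camera focal length in pixels (hypothetical value in the example).
    baseline_m: horizontal interval between the two cameras, in meters.
    disparity_px: horizontal pixel shift of the same point between the two images.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A point seen 42 px apart by cameras 0.12 m apart with a 700 px focal
# length lies at roughly 700 * 0.12 / 42 = 2.0 m from the cameras.
print(depth_from_disparity(700.0, 0.12, 42.0))
```

Nearer obstacles produce larger disparity, which is why keeping disparity measurable (vergence control) helps distance measurement.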
The control unit 120 may control the general operation of the lane detection apparatus 100. For example, the control unit 120 may perform the control of various types of power driving units in order for the lane detection apparatus 100 to drive.
According to an exemplary embodiment disclosed in the present disclosure, the control unit 120 processes an image received from the camera module 110, carries out lane detection, and processes other operations. In addition, according to the exemplary embodiment disclosed in the present disclosure, the control unit 120 may use driving information that is collected by the sensor unit 150 for the above-described lane detection.
A lane detection process of the control unit 120 will be described in detail with reference to FIGS. 2 to 13.
In addition, the control unit 120 may perform functions related to lane keeping (including a lane departure warning message function and an automatic lane keeping function) on the basis of the position of the lane detection apparatus 100 (or a vehicle having the lane detection apparatus 100) that is detected by an arbitrary GPS module and the detected lane.
The storage unit 130 may store data and programs for the operation of the control unit 120 and temporarily store data being input/output.
According to one exemplary embodiment of the present disclosure, the storage unit 130 may temporarily store an image received by the camera module 110, processing information related to the image, and lane detection information. In addition, the storage unit 130 may store operation expressions (for example, curve equations) used to process the image.
In addition, according to one exemplary embodiment of the present disclosure, the storage unit 130 may store feature points extracted at an arbitrary time during the lane detection process, or a lane fitting result. The stored feature points extracted at the arbitrary time in the past, or the stored lane fitting result, may be used to perform the current lane fitting.
In some exemplary embodiments, the storage unit 130 may store software components including an operating system, a module performing wireless communication, a module operating together with a user input unit, a module operating together with an A/V input unit, and a module operating together with the output unit 140. The operating system (for example, LINUX, UNIX, OS X, WINDOWS, Chrome, Symbian, iOS, Android, VxWorks, or other embedded operating systems) may include various types of software components and/or drivers in order to control system tasks such as memory management and power management.
The storage unit 130 may include at least one storage medium including a flash memory type, a hard disk type, a multimedia card micro type, a memory card type (e.g., SD or DX memory), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-only Memory (EEPROM), Programmable Read-only Memory (PROM), magnetic memory, magnetic disk, and optical disk. The lane detection apparatus 100 may operate in relation with a web storage that performs the storage function of the storage unit 130 on the Internet.
The output unit 140 generates outputs related to sight, hearing, or touch and may include a display unit 141 and a sound output module 142.
The display unit 141 may output information that is processed by the lane detection apparatus 100. For example, when the lane detection apparatus 100 is driving, the display unit 141 may display a UI (User Interface) or a GUI (Graphic User Interface) related to driving.
According to one exemplary embodiment of the present disclosure, the display unit 141 may display the images captured by the camera module 110 of the lane detection apparatus 100 and/or information about lanes detected by the control unit 120, and may display both at the same time. Here, the images and the information may be displayed separately, at top and bottom or left and right, or the information about the detected lanes may be overlaid on the images.
The display unit 141 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED), a flexible display, and a 3D display.
Among them, some of the displays may be transparent type or transmissive type displays, which may be called transparent displays. Typical examples of the transparent displays may include a Transparent OLED (TOLED). A rear structure of the display unit 141 may also be formed of a transmissive structure.
According to how the lane detection apparatus 100 is embodied, two or more display units 141 may exist. For example, a plurality of display units may be arranged, separately or integrally, on one surface of the lane detection apparatus 100, or may be arranged separately on different surfaces thereof.
In addition, the display unit 141 may have a layered structure with a touch sensor that senses a touch. In this case, the display unit 141 may serve as an input device as well as an output device. For example, the touch sensor may be formed as a touch film, a touch sheet, a touch pad, or a touch panel.
The touch sensor may be configured to convert variations in pressure applied to a particular portion of the display unit 141 or capacitance generated at the particular portion of the display unit 141 into electrical input signals. The touch sensor may be configured to sense pressure applied when being touched as well as touch position and touch area.
When a touch input is received by the touch sensor, a signal corresponding thereto is sent to a touch controller. The touch controller processes the signal and transmits corresponding data to the control unit 120. In this manner, the control unit 120 is informed which area of the display unit 141 is touched.
The sound output module 142 may output audio data stored in the storage unit 130 in recording mode and voice recognition mode. The sound output module 142 may output sound signals related to a lane detection result (for example, an alarm regarding a kind of a detected lane) and functions regarding lane detection (for example, lane departure warning and automatic lane keeping alarm). The sound output module 142 may include a receiver, a speaker, and a buzzer.
The sensor unit 150 collects driving information of the lane detection apparatus 100 and may include a speed sensor 151 and a steering sensor 152.
The speed sensor 151 senses the speed of the lane detection apparatus 100. Since the gear ratio of the differential gear and the size of the tires are fixed in the lane detection apparatus 100, the speed sensor 151 may calculate the speed of the lane detection apparatus 100 on the basis of the rotation of a transmission output shaft or the number of rotations of the wheels. In addition, the speed sensor 151 may be formed of any one of a reed switch type, a magnetic type, and a Hall type.
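As an illustrative sketch of the speed calculation from wheel rotation described above (the function name and the sample wheel size are hypothetical, not from the disclosure):

```python
import math

def vehicle_speed_kmh(wheel_diameter_m: float, wheel_rps: float) -> float:
    """Vehicle speed from wheel rotation, assuming a known fixed tire size.

    Distance per revolution is the wheel circumference (pi * diameter);
    multiplying by revolutions per second gives m/s, and 3.6 converts to km/h.
    """
    circumference_m = math.pi * wheel_diameter_m
    return circumference_m * wheel_rps * 3.6  # m/s -> km/h
```

For instance, a wheel with a 2 m circumference turning 10 times per second corresponds to 20 m/s, i.e. 72 km/h.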
In addition, the speed sensor 151 may sense acceleration of the lane detection apparatus 100 on the basis of variations of the speed of lane detection apparatus 100. Alternatively, the lane detection apparatus 100 may include a separate acceleration sensor to sense the acceleration of the lane detection apparatus 100.
The steering sensor 152 senses a steering motion of the lane detection apparatus 100, that is, a wheel angle. Therefore, the steering sensor 152 is interlocked with a steering wheel of the lane detection apparatus 100 and may include a rotating rotor, a gear rotating integrally with the rotor, a sensing unit sensing a phase change caused by the rotation of a magnetic substance that generates magnetic forces, an operation unit that operates and outputs an input of the sensing unit, and a PCB and a housing to mount the operation unit.
In addition, the lane detection apparatus 100 may include a communication unit that performs communications with an arbitrary terminal or server under the control of the control unit 120. At this time, the communication unit may include a wired/wireless communication module. Here, examples of the wireless internet technique may include a wireless local area network (WLAN), Wi-Fi, wireless broadband (WiBro), world interoperability for microwave access (WiMAX), high speed downlink packet access (HSDPA), IEEE 802.16, long term evolution (LTE), and wireless mobile broadband service (WMBS). Examples of the short range communication technology may include Bluetooth, Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra-WideBand (UWB), and ZigBee. In addition, the wired communication technology may include Universal Serial Bus (USB) communication.
The communication module may include CAN communication, Ethernet for a vehicle, flexray, and a Local Interconnect Network (LIN) in order to perform communications with an arbitrary vehicle having the lane detection apparatus 100.
In addition, the communication unit may receive an image captured by an arbitrary camera module from the arbitrary terminal or server. Moreover, the communication unit may transmit lane detection information about the arbitrary image to the arbitrary terminal or server under the control of the control unit 120.
Not all of the components of the lane detection apparatus 100 shown in FIG. 1 are essential. The lane detection apparatus 100 may be implemented with a larger or smaller number of components than shown in FIG. 1.
FIG. 2 is a flowchart illustrating a lane detection process according to the present disclosure.
With reference to FIG. 2, the lane detection apparatus 100 obtains an image S21.
The camera module 110 may obtain a first image and a second image captured by at least one pair of cameras (for example, stereo cameras or stereoscopic cameras) that are separated by a horizontal interval on the same central axis in the same plane of the lane detection apparatus 100, or an image captured by a single camera. Here, the first image may be a left image captured by the left camera of the pair, while the second image may be a right image captured by the right camera of the pair. In addition, the camera module 110 may receive only one of the first image and the second image captured by the pair of cameras.
In one example, the camera module 110 may obtain an image 310 as shown in FIG. 3.
Next, the lane detection apparatus 100 extracts feature points of a lane S22.
As shown in FIG. 4, in order to distinguish the lane in the image 310 captured by the camera module 110, the control unit 120 extracts a plurality of feature points (edge points) 410 present in the image. Here, the control unit 120 may set only the road area below the horizon as a Region of Interest (ROI) and extract the feature points 410 only within the region of interest.
The feature points 410 may be extracted by using various algorithms.
For example, the control unit 120 may extract the feature points 410 on the basis of gradient information of the obtained image 310. That is, when brightness or gradations of color between adjacent pixels of the obtained image 310 gradually change, the control unit 120 may not regard this as the feature points 410. On the other hand, when brightness or gradations of color between adjacent pixels of the obtained image 310 drastically change, the control unit 120 may regard this as the feature points 410 and extract corresponding pixel information. In this case, the feature points 410 may be formed of discontinuous points on the boundary of two regions that are distinct in terms of brightness or gradations of color between the pixels.
Alternatively, the control unit 120 may extract the feature points 410 on the basis of color information about the obtained image 310. In general, on roads, general lanes appear white, centerlines appear yellow, and the parts other than lanes appear black. Therefore, the control unit 120 may extract the feature points 410 on the basis of color information about lanes. That is, the control unit 120 may create a region having colors that can be classified as lanes in the image 310 as one object, classify only the region corresponding to the road as a region of interest in order to exclude other objects driving on the road, and extract the feature points 410 from the object created on the basis of the color information within the region of interest.
One example of the algorithms by which the control unit 120 extracts the feature points 410 has been described. However, the present invention is not limited thereto, and the feature points 410 may be extracted by various types of algorithms or filters for extracting feature points, such as an Edge Tracing algorithm or a Boundary Flowing algorithm.
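As an illustration only (not the disclosed algorithm), a minimal version of the gradient-based extraction described above might mark pixels whose brightness jumps drastically relative to a neighboring pixel; the function name and threshold are hypothetical:

```python
def extract_feature_points(image, threshold):
    """Return (x, y) pixels where horizontal brightness changes drastically.

    image: 2D list of brightness values (rows of pixels).
    A pixel counts as a feature point when the jump to its right neighbour
    meets the threshold -- a crude stand-in for gradient-based edge detection;
    gradual changes between adjacent pixels are not regarded as feature points.
    """
    points = []
    for y, row in enumerate(image):
        for x in range(len(row) - 1):
            if abs(row[x + 1] - row[x]) >= threshold:
                points.append((x, y))
    return points
```

On a row such as `[0, 0, 255, 255]` with a threshold of 100, only the boundary pixel between the dark road and the bright lane marking is extracted; a row of gradually changing values yields no feature points.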
According to one exemplary embodiment according to the present disclosure, the control unit 120 may store information about the extracted feature points in the storage unit 130 and then use the information in order to fit a lane from an image to be captured later. That is, the control unit 120 may determine an offset of a lane on the basis of information about feature points extracted in the past and perform curve fitting on the current lane on the basis of the determined offset value.
As such, the process of carrying out curve fitting on the current lane on the basis of information about feature points extracted in the past will be described below in detail with reference to FIGS. 8 to 12.
In addition, according to one exemplary embodiment of the present disclosure, as shown in FIG. 5, after extracting the feature points 410, the control unit 120 may transform the extracted feature points 410 to a world coordinate system. At this time, the control unit 120 may use transformation matrices or coordinate transformation equations. The transformation matrices may be homographic matrices stored in advance in the storage unit 130. In addition, the control unit 120 may check that the vertical and horizontal intervals of the plurality of feature points 510 transformed to the world coordinate system remain constant, to thereby detect errors that occur during the coordinate transformation.
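As an illustrative sketch of the coordinate transformation described above, a 3x3 homographic matrix can map an image pixel to world coordinates; the function name and matrix values below are hypothetical, not taken from the disclosure:

```python
def to_world(H, u, v):
    """Map an image pixel (u, v) to world coordinates with a 3x3 homography H.

    H is a row-major nested list. The point is treated in homogeneous
    coordinates and dehomogenised by dividing by the last component.
    """
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return (x / w, y / w)

# With a purely scaling homography, pixel (3, 4) maps to world point (6, 8).
H_example = [[2, 0, 0], [0, 2, 0], [0, 0, 1]]
print(to_world(H_example, 3, 4))
```

A real homography for a road plane would be obtained by calibration; applying it to every extracted feature point yields the points 510 in the world coordinate system.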
The lane detection apparatus 100 fits the lane S23.
As shown in FIG. 6, the control unit 120 carries out lane fitting in order to express the extracted feature points 510 as a single line 610. The control unit 120 may use any one of a least squares method, Random Sample Consensus (RANSAC), a generalized Hough transform, and spline interpolation in order to extract a straight line or a curved line from the image.
According to one exemplary embodiment according to the present disclosure, when the extracted feature points 510 correspond to a curve, the control unit 120 may carry out lane fitting on the basis of a curve equation. Specifically, the control unit 120 substitutes the extracted feature points 510 into the curve equation to obtain coefficients and carries out curve fitting on the basis of a result of the curve equation whose coefficients are obtained. At this time, the curve equation is stored in advance in the storage unit 130 and may be an arbitrary multidimensional equation.
For example, when the curve equation is a quadratic equation, the control unit 120 may substitute the plurality of feature points 510 into the quadratic equation, for example, y = ax² + bx + c (where a is a curvature, b is a heading angle, and c is an offset) to carry out curve fitting. As a result of the substitution, the control unit 120 determines that the feature points 510 form a straight line if a is 0, and that the feature points 510 form a curve if a is not 0.
In another example, when the curve equation is a cubic equation, the control unit 120 substitutes the plurality of feature points 510 into the cubic equation, for example, y = ax³ + bx² + cx + d (where a is a curve derivative, b is a curvature, c is a heading angle, and d is an offset) to carry out curve fitting. Here, if a is 0, the equation reduces to the quadratic case, in which b is a curvature, c is a heading angle, and d is an offset. If both a and b are 0, a straight line is detected, in which c is a heading angle and d is an offset.
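For illustration only, the quadratic curve fitting described above can be sketched as an ordinary least-squares fit; the implementation details (normal equations solved by Gaussian elimination) are an assumption, not the disclosed method:

```python
def fit_quadratic(points):
    """Least-squares fit of y = a*x^2 + b*x + c to (x, y) points.

    Solves the 3x3 normal equations with Gaussian elimination. Per the
    text, a == 0 indicates a straight line and a != 0 a curve.
    """
    sx = [sum(x ** k for x, _ in points) for k in range(5)]   # sums of x^0..x^4
    sy = [sum(y * x ** k for x, y in points) for k in range(3)]
    # Augmented normal-equation matrix for the unknowns (a, b, c)
    m = [[sx[4], sx[3], sx[2], sy[2]],
         [sx[3], sx[2], sx[1], sy[1]],
         [sx[2], sx[1], sx[0], sy[0]]]
    for i in range(3):                       # forward elimination with pivoting
        p = max(range(i, 3), key=lambda r: abs(m[r][i]))
        m[i], m[p] = m[p], m[i]
        for r in range(i + 1, 3):
            f = m[r][i] / m[i][i]
            for col in range(i, 4):
                m[r][col] -= f * m[i][col]
    coef = [0.0, 0.0, 0.0]                   # back substitution
    for i in range(2, -1, -1):
        coef[i] = (m[i][3] - sum(m[i][j] * coef[j] for j in range(i + 1, 3))) / m[i][i]
    return tuple(coef)  # (a, b, c)
```

Feeding points sampled from y = 2x² + 3x + 1 recovers (a, b, c) ≈ (2, 3, 1); the returned c is the offset used in the later steps of the process.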
In addition, the control unit 120 may store the fitting result in the storage unit 130 and use the stored fitting result for the next lane fitting.
According to one exemplary embodiment according to the present disclosure, the control unit 120 may carry out curve fitting on the current lane on the basis of the feature points extracted in the past or the past lane fitting result.
When the road changes severely, for example, when a curved road with high curvature is marked by a broken line and the lane ends and begins again in front of the vehicle, the accuracy of the offset may decrease during curve fitting. Therefore, the control unit 120 may first determine an offset of the lane on the basis of the feature points extracted in the past or the past lane fitting result, and then carry out curve fitting on the current lane on the basis of the determined offset value.
As such, curve fitting of the current lane on the basis of the past lane fitting result will be described in detail with reference to FIGS. 8 to 13.
Finally, the lane detection apparatus 100 tracks the lane S24.
In order to reduce calibration time and noise, the control unit 120 carries out tracking on the basis of the plurality of feature points 510 corresponding to the fitted lane. At this time, calibration refers to calculating the transformation relationship between the camera module 110 and the world coordinate system.
According to one exemplary embodiment of the present disclosure, the control unit 120 may carry out tracking on all of the plurality of fitted lanes. That is, the control unit 120 may carry out tracking on adjacent lanes present on the driving road in addition to the lane in which the lane detection apparatus 100 is driving.
In this manner, the control unit 120 can quickly detect a lane from the existing tracking information without newly detecting a lane even when the lane detection apparatus 100 changes lanes.
In addition, when some lane is missing or lost, or an image is missing due to a temporary operation failure of the camera module 110, the control unit 120 may estimate and detect lanes on the basis of tracking information of the plurality of lanes (for example, positions of lanes, a lane width, and curve equations of curved lanes).
In this case, the control unit 120 may store the tracking result in the storage unit 130. Thus, the control unit 120 can correct errors on the basis of the stored tracking result even when some errors occur during lane fitting.
According to one exemplary embodiment according to the present disclosure, the control unit 120 may store the tracking result in the storage unit 130 and use the tracking result in order to fit a lane from an image to be obtained later. That is, the control unit 120 may determine an offset of a lane on the basis of the past lane tracking result first, and then carry out curve fitting on the current lane on the basis of the determined offset value.
Additionally, the lane detection apparatus 100 may display a lane tracking result S25.
The output unit 140 may display the tracking result through the lane detection process as shown in FIG. 7.
The output unit 140 may display the image 310 captured by the camera module 110 of the lane detection apparatus 100 and/or a lane tracking result 710 detected by the control unit 120, and may display both at the same time. Here, the image 310 and the detected lane tracking result 710 may be displayed separately, at top and bottom or left and right, or the information about the detected lane may be overlaid on the image 310.
Alternatively, the output unit 140 may output sound signals related to a lane detection result (for example, an alarm regarding a kind of a detected lane) and functions regarding lane detection (for example, lane departure warning and automatic lane keeping alarm).
Hereinafter, a lane fitting process during the lane detection process, particularly, a curve fitting process will be described in detail.
FIG. 8 is a flowchart illustrating a curve fitting process according to an exemplary embodiment according to the present disclosure.
With reference to FIG. 8, the lane detection apparatus 100 obtains information about feature points extracted in the past S231.
According to the exemplary embodiment of the present disclosure, when the extracted feature points 510 correspond to a curve, the control unit 120 may carry out lane fitting by using a curve equation. At this time, when the road changes severely, for example, when a curved road with high curvature is marked by a broken line and the lane ends and begins again in front of the vehicle, the accuracy of the offset of the detected lane may significantly decrease.
Here, the offset represents whether the lane is located to the left or right of the lane detection apparatus 100. When the lane is expressed as an arbitrary equation, the offset refers to the value of the constant term (the zero-degree coefficient). That is, when the constant term has a positive value, the lane is located to the right of the lane detection apparatus 100. On the other hand, when the constant term has a negative value, the lane is located to the left of the lane detection apparatus 100.
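The sign convention described above can be illustrated with a trivial sketch (the function name is hypothetical, and the handling of a zero offset is our assumption):

```python
def lane_side(offset_c: float) -> str:
    """Side of the vehicle a lane lies on, from the constant term of its equation.

    Follows the text's convention: positive constant term -> lane on the right,
    negative -> lane on the left. A zero offset (lane through the vehicle's
    origin) is labeled "center" here as an assumption.
    """
    if offset_c > 0:
        return "right"
    if offset_c < 0:
        return "left"
    return "center"
```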
For example, as shown in FIG. 9, the lane detection apparatus 100 may drive in an actual lane 910 that has high curvature and is formed of a broken line. Since the actual lane 910 is formed of a broken line, the actual lane 910 may be divided into blank segments 901 and line segments 902. Since the lane detection apparatus 100 is driving, the lane detection apparatus 100 is located at 80 on the x-axis at an arbitrary past time t-1 and at 20 on the x-axis at a current time t.
As shown in FIG. 10, when lane fitting is performed on the actual lane 910 at the current time t, only the plurality of feature points 510 extracted from the line segments 902 exist, so the control unit 120 carries out lane fitting on the basis of these feature points 510. The control unit 120 detects the plurality of feature points 510, extracted from the road with the high curvature, as a curve and carries out lane fitting on the basis of the curve equation; as a result, the detection result is a curve fitting result 920 that has a curvature. In this case, lane detection with respect to the line segments 902 can be accurate, but lane detection with respect to the blank segments 901 with high curvature may be inaccurate. In addition, although the actual lane 910 is present at the right side of the lane detection apparatus 100 (the starting point of FIG. 10), the offset of the curve fitting result 920 places the lane at the left side of the lane detection apparatus 100. Therefore, an inaccurate offset occurs, and safe driving becomes difficult.
Therefore, as shown in FIG. 11a, in order to determine the accurate offset, the control unit 120 may acquire information about the feature points 520 extracted at the past time t-1. When the current time is referred to as t, the feature points 520 extracted at an arbitrary time in the past, that is, the time t-1, may be obtained from the storage unit 130. The arbitrary time in the past may be determined beforehand or vary flexibly. In order to accurately detect the lanes of the road on which the lane detection apparatus 100 drives, the arbitrary time should not be a time in the distant past.
When the control unit 120 obtains the information about the feature points 520 at the current time t, as shown in FIG. 11b, the information still reflects the positions at the past time t-1 and cannot be used directly to fit a lane at the current time t. Therefore, the control unit 120 needs to correct the information about the feature points 520 by estimating the values that are correct with respect to the current time t.
Therefore, the lane detection apparatus 100 estimates the changed positions of the feature points on the basis of the driving information S232.
Since the lane detection apparatus 100 is driving on the road, the information about the feature points 520 needs to be corrected to values valid for the current time t.
For example, when the lane detection apparatus 100 is driving on the curved road shown in FIG. 9, the scene acquired by the lane detection apparatus 100 at the past time t-1 has moved toward the opposite side and the rear at the current time t, because the lane detection apparatus 100 itself has moved to one side and forward (a result of analyzing the curvilinear motion).
Therefore, the control unit 120 may estimate the changed positions of the feature points 520 in order to correct the positions of the feature points 520 at the past time t-1 into the positions at the current time t.
At this time, the control unit 120 may estimate the changed positions on the basis of the driving information of the lane detection apparatus 100 that is collected by the sensor unit 150. The driving information may include speed information, acceleration information, or steering motion information of the lane detection apparatus 100 collected by the sensor unit 150.
The control unit 120 may determine whether the positions of the feature points 520 are changed from top to bottom or from left to right, on the basis of the collected driving information. In addition, as shown in FIG. 12, the control unit 120 may change the feature points 520 to the estimated positions (520') on the basis of a result of the determination.
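For illustration only, the position estimation described above might be sketched as a planar rigid-motion correction, assuming a vehicle frame in which y points forward and the driving information reduces to a speed and a yaw rate; these conventions and names are our assumptions, not the disclosed method:

```python
import math

def predict_points(points, speed_mps, yaw_rate_rps, dt):
    """Estimate where feature points from time t-1 appear in the vehicle
    frame at time t, assuming planar motion with constant speed and yaw rate.

    Convention (an assumption of this sketch): x is lateral, y points forward.
    The vehicle drives d = v*dt forward and turns by theta = yaw_rate*dt, so
    old points are translated backward and rotated by -theta.
    """
    d = speed_mps * dt
    theta = yaw_rate_rps * dt
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    predicted = []
    for x, y in points:
        xs, ys = x, y - d                          # undo the forward translation
        predicted.append((xs * cos_t + ys * sin_t,  # rotate by -theta
                          -xs * sin_t + ys * cos_t))
    return predicted
```

On a straight road (zero yaw rate), a point 10 m ahead simply moves 5 m closer after one second at 5 m/s; with a nonzero yaw rate, the points also shift laterally, matching the curvilinear-motion analysis in the text.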
Then, the lane detection apparatus 100 determines an offset on the basis of the feature points whose positions are estimated S233.
The control unit 120 determines a constant term that represents an offset of a lane among coefficients of a curve equation on the basis of the feature points 520 whose positions are estimated.
For example, when curve fitting is performed on a curved lane by using a quadratic equation, that is, y = ax² + bx + c, the control unit 120 may determine the value c that represents the offset of the lane from the coordinate values of the feature points 520 whose positions are estimated.
In this manner, even when the actual lane 910 with high curvature is formed of a broken line and ends in front of the lane detection apparatus 100, and thus it is difficult to estimate an accurate offset, the control unit 120 can accurately determine the offset of the lane on the basis of the information at the past time and the driving information of the lane detection apparatus 100.
According to one exemplary embodiment according to the present disclosure, the control unit 120 may determine the offset on the basis of the lane fitting result at the past time instead of the feature points 520 whose positions are estimated. For example, the control unit 120 obtains the value c representing the offset of the lane from an equation that represents a lane detected from the lane fitting result at the past time, estimates a change in the value c on the basis of the driving information, and determines the changed estimated value as the offset at the current time.
Finally, the lane detection apparatus 100 carries out curve fitting at the current time (S234).
The control unit 120 may carry out curve fitting on the lane at the current time on the basis of the determined offset, thereby obtaining the other coefficients of the curve equation.
For example, when curve fitting is performed on the curved road at the current time by using the quadratic equation y = ax² + bx + c, the value c is fixed in advance as the offset determined from the feature points 520 whose positions are estimated, independently of the curve fitting, and curve fitting is then performed on the basis of the extracted feature points 510 to determine the remaining coefficients a and b.
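With c held fixed, fitting a and b reduces to ordinary least squares on the residual y − c. A minimal sketch with the 2×2 normal equations solved in closed form (the closed-form solver is one possible implementation, not the one prescribed by the patent):

```python
def fit_quadratic_fixed_offset(points, c):
    """Fit y = a*x^2 + b*x + c to feature points (x, y) with the
    constant term c held fixed at the predetermined offset.

    Minimising sum((a*x^2 + b*x - (y - c))^2) gives the normal
    equations  a*s4 + b*s3 = t2  and  a*s3 + b*s2 = t1.
    """
    s4 = sum(x ** 4 for x, _ in points)
    s3 = sum(x ** 3 for x, _ in points)
    s2 = sum(x ** 2 for x, _ in points)
    t2 = sum(x ** 2 * (y - c) for x, y in points)
    t1 = sum(x * (y - c) for x, y in points)
    det = s4 * s2 - s3 * s3
    a = (t2 * s2 - s3 * t1) / det
    b = (s4 * t1 - s3 * t2) / det
    return a, b
```

Points sampled from y = 0.5x² + 2x + 1 recover a = 0.5 and b = 2 exactly when c is fixed at 1.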
Therefore, as shown in FIG. 13, even when, as a curve fitting result, part of a lane is disconnected or lost and the lane, actually located at the right side of the lane detection apparatus 100, is falsely detected as being located at the left side thereof, the control unit 120 can determine an accurate offset of the actual lane 910 by using the information about the feature points 520 extracted at the past time (see FIG. 9). In addition, by carrying out curve fitting on the basis of the correctly determined offset, the lane can be accurately detected, as shown by the lane fitting result 930 at the current time.
The present invention may, however, be embodied in different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the present invention to those skilled in the art. Further, the present invention is only defined by scopes of claims.
Claims (19)
- An apparatus for detecting a lane, the apparatus comprising: a camera module capturing an image; a control unit extracting a plurality of feature points from the image, carrying out lane fitting to connect the plurality of feature points with a single line, and tracking the lane fitted; and a display unit displaying the lane tracked, wherein the lane fitting comprises obtaining information about feature points extracted at an arbitrary past time, estimating positions of the feature points, extracted at the past time, at a current time on the basis of driving information from the arbitrary past time to the current time, determining an offset representing lateral inclination of the lane on the basis of the positions at the current time, and carrying out curve fitting on the basis of the offset.
- The apparatus of claim 1, wherein the curve fitting comprises determining coefficients of a curve equation in arbitrary dimension on the basis of the curve equation.
- The apparatus of claim 1, wherein the driving information is at least one of speed information, acceleration information, and steering information.
- The apparatus of claim 1, wherein the camera module comprises at least one pair of cameras separated by a horizontal interval in the same central axis in the same plane, or a single camera.
- The apparatus of claim 1, wherein the plurality of feature points are extracted only from a region of interest defined only on a lower part of a road on the basis of the horizon within the image.
- The apparatus of claim 1, wherein the plurality of feature points are extracted on the basis of gradient information or color information of the image.
- The apparatus of claim 1, wherein the control unit transforms the plurality of feature points to a world coordinate system and fits the lane on the basis of the feature points transformed to the world coordinate system.
- The apparatus of claim 1, wherein the tracking of the lane is performed on every fitted lane.
- The apparatus of claim 1, further comprising a storage unit storing information about the feature points extracted, wherein the control unit fits the lane on the basis of the information about the feature points stored in the storage unit.
- An apparatus for detecting a lane, the apparatus comprising: a camera module capturing an image; a control unit extracting a plurality of feature points from the image, carrying out lane fitting to connect the plurality of feature points with a single line, and tracking the lane fitted; and a display unit displaying the lane tracked, wherein the lane fitting comprises obtaining an offset representing lateral inclination of the lane of a lane fitting result at an arbitrary past time, estimating an offset at a current time with respect to the offset on the basis of driving information from the arbitrary past time to the current time, and carrying out curve fitting on the basis of the offset at the current time.
- A method of detecting a lane, the method comprising: capturing an image; extracting a plurality of feature points from the image; carrying out lane fitting to connect the plurality of feature points with a single line; tracking the lane fitted; and displaying the lane tracked, wherein the carrying out of the lane fitting comprises: obtaining information about feature points extracted at an arbitrary past time; estimating positions of the feature points, extracted at the arbitrary past time, at a current time on the basis of driving information from the arbitrary past time to the current time; determining an offset representing lateral inclination of the lane on the basis of the positions at the current time; and carrying out curve fitting on the basis of the offset.
- The method of claim 11, wherein the carrying out of the curve fitting comprises determining coefficients of a curve equation in arbitrary dimension on the basis of the curve equation.
- The method of claim 11, wherein the driving information is at least one of speed information, acceleration information, and steering information.
- The method of claim 11, wherein the camera module comprises at least one pair of cameras separated by a horizontal interval in the same central axis in the same plane, or a single camera.
- The method of claim 11, wherein the extracting of the plurality of feature points comprises: defining a region of interest only on a lower part of a road on the basis of the horizon within the image; and extracting the plurality of feature points only from the region of interest.
- The method of claim 11, wherein the plurality of feature points are extracted on the basis of gradient information or color information about the image.
- The method of claim 11, wherein the fitting of the lane comprises: transforming the plurality of feature points to a world coordinate system; and fitting the lane on the basis of the feature points transformed to the world coordinate system.
- The method of claim 11, wherein the tracking of the lane is performed on every fitted lane.
- The method of claim 11, wherein the extracting of the feature points further comprises storing information about the feature points, wherein the fitting of the lane comprises fitting the lane on the basis of the information about the feature points.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR10-2011-0077912 | 2011-08-04 | ||
| KR1020110077912A KR101578434B1 (en) | 2011-08-04 | 2011-08-04 | Apparatus for detecting lane and method thereof |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2013018961A1 (en) | 2013-02-07 |
Family
ID=47629458
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2011/009801 (WO2013018961A1, ceased) | Apparatus and method for detecting lane | | 2011-12-19 |
Country Status (2)
| Country | Link |
|---|---|
| KR (1) | KR101578434B1 (en) |
| WO (1) | WO2013018961A1 (en) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN116409326B (en) * | 2023-03-17 | 2025-11-18 | 宁波路特斯机器人有限公司 | A lane line tracking method and lane line tracking system |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH085388A (en) * | 1994-06-21 | 1996-01-12 | Nissan Motor Co Ltd | Road detection device |
| JP2006331389A (en) * | 2005-04-26 | 2006-12-07 | Fuji Heavy Ind Ltd | Lane recognition device |
| JP2007264714A (en) * | 2006-03-27 | 2007-10-11 | Fuji Heavy Ind Ltd | Lane recognition device |
| KR20110001427A (en) * | 2009-06-30 | 2011-01-06 | 태성전장주식회사 | Lane Fast Detection Method by Extracting Region of Interest |
2011
- 2011-08-04 KR KR1020110077912A patent/KR101578434B1/en active Active
- 2011-12-19 WO PCT/KR2011/009801 patent/WO2013018961A1/en not_active Ceased
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105270179A (en) * | 2014-05-30 | 2016-01-27 | Lg电子株式会社 | Driver assistance apparatus and vehicle |
| EP2949534A3 (en) * | 2014-05-30 | 2016-02-24 | LG Electronics Inc. | Driver assistance apparatus capable of diagnosing vehicle parts and vehicle including the same |
| US9352689B2 (en) | 2014-05-30 | 2016-05-31 | Lg Electronics Inc. | Driver assistance apparatus capable of diagnosing vehicle parts and vehicle including the same |
| CN107472134A (en) * | 2016-07-07 | 2017-12-15 | 宝沃汽车(中国)有限公司 | Auxiliary image system, control method and the vehicle of vehicle |
| CN119006474A (en) * | 2024-10-24 | 2024-11-22 | 中公高科养护科技股份有限公司 | Road measurement method based on image processing and related equipment |
Also Published As
| Publication number | Publication date |
|---|---|
| KR20130015738A (en) | 2013-02-14 |
| KR101578434B1 (en) | 2015-12-17 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 11870309; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 11870309; Country of ref document: EP; Kind code of ref document: A1 |