
SE540698C2 - Method and control unit for estimating bias of yaw rate sensor - Google Patents

Method and control unit for estimating bias of yaw rate sensor

Info

Publication number
SE540698C2
SE540698C2
Authority
SE
Sweden
Prior art keywords
vehicle
yaw rate
rate sensor
route segment
bias
Prior art date
Application number
SE1750104A
Other languages
Swedish (sv)
Other versions
SE1750104A1 (en)
Inventor
Andersson Jonny
Original Assignee
Scania Cv Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Scania Cv Ab filed Critical Scania Cv Ab
Priority to SE1750104A priority Critical patent/SE540698C2/en
Priority to DE102018000599.3A priority patent/DE102018000599A1/en
Publication of SE1750104A1 publication Critical patent/SE1750104A1/en
Publication of SE540698C2 publication Critical patent/SE540698C2/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C19/00 Gyroscopes; Turn-sensitive devices using vibrating masses; Turn-sensitive devices without moving masses; Measuring angular rate using gyroscopic effects
    • G01C19/56 Turn-sensitive devices using vibrating masses, e.g. vibratory angular rate sensors based on Coriolis forces
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/10 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion
    • B60W40/114 Yaw movement
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60G VEHICLE SUSPENSION ARRANGEMENTS
    • B60G2400/00 Indexing codes relating to detected, measured or calculated conditions or factors
    • B60G2400/05 Attitude
    • B60G2400/052 Angular rate
    • B60G2400/0523 Yaw rate
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60G VEHICLE SUSPENSION ARRANGEMENTS
    • B60G2600/00 Indexing codes relating to particular elements, systems or processes used on suspension systems or suspension control systems
    • B60G2600/08 Failure or malfunction detecting means
    • B60G2600/082 Sensor drift
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60G VEHICLE SUSPENSION ARRANGEMENTS
    • B60G2800/00 Indexing codes relating to the type of movement or to the condition of the vehicle and to the end result to be achieved by the control action
    • B60G2800/90 System Controller type
    • B60G2800/94 Electronic Stability Program (ESP, i.e. ABS+ASC+EMS)

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Manufacturing & Machinery (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)
  • Gyroscopes (AREA)
  • Steering Control In Accordance With Driving Conditions (AREA)

Abstract

Method (500) and control unit (310) in a vehicle (100) for estimating bias of a yaw rate sensor (110) on-board the vehicle (100). The method (500) comprises: measuring (501) a yaw rate with the yaw rate sensor (110) during a time period (T) while passing a route segment (200); calculating (502) a yaw angle by integrating the yaw rate measured (501) by the yaw rate sensor (110) over the time period (T); determining (503) vehicle directional change velocity while the vehicle (100) is passing the route segment (200), with a vehicle directional determining device (120); calculating (505) a vehicle directional change of the vehicle (100); and estimating (506) the yaw rate sensor bias by comparing the calculated (502) yaw angle with the calculated (505) vehicle directional change.

Description

METHOD AND CONTROL UNIT FOR ESTIMATING BIAS OF YAW RATE SENSOR

TECHNICAL FIELD

This document relates to a method and a control unit in a vehicle. More particularly, a method and a control unit are described for estimating bias of a yaw rate sensor on-board the vehicle.
BACKGROUND

The inertial sensors of a vehicle may be used for numerous applications, such as safety systems and / or Advanced Driver Assistance Systems (ADAS).
For some functions, the accuracy of the inertial sensor is not very critical, e.g. in an Electronic Stability Program (ESP). For other functions, however, for example when calculating the absolute speed of an object, the own vehicle's longitudinal and lateral speed as well as its rotational velocity, i.e. yaw rate, must be known with high accuracy in order to make correct calculations and predictions.
Also for positioning of the own vehicle, the vehicle's movement must be known, or estimated with high accuracy, between points with satellite reception (dead reckoning). Dead reckoning is also used in other applications, for example memorising the trajectory of a leading vehicle, where the own vehicle's movement must likewise be taken into account.
The general problem with yaw rate sensors is that they often have a bias: the sensor reports a small non-zero value when the vehicle is driving straight and the yaw rate sensor consequently would be expected to report zero. The bias may be caused by e.g. varying temperature.
A known method for estimating and eliminating the yaw rate sensor bias is low-pass filtering of the value reported by the yaw rate sensor when the vehicle is confirmed to be stationary, e.g. when starting the vehicle. However, the vehicle may be driven a long distance between stationary periods, e.g. professional long-haulage vehicles with two drivers.
Another known method is low-pass filtering of the value during driving, which requires very slow filtering. This method fails when the road bends in the same direction for a long time, which is not unusual on high-velocity roads, where sharp curves are typically avoided.
Document US20140163808 discloses a method of acquiring bias of a yaw rate sensor for a vehicle. The method comprises matching Global Positioning System (GPS) based map information and vehicle speed information, determining the curvature of the road, and calculating a map-based yaw rate from steering angle information. The estimated curvature is then compared with the yaw rate determined by the yaw rate sensor. The difference between the values is considered to constitute the yaw rate bias of the yaw rate sensor.
The problem with this method is that it is based on map data, which is often incorrect; speed sensor information, which is also often incorrect; and steering angle information as determined by a steering angle sensor, which is likewise prone to errors. The determined difference between the estimated curvature and the measured yaw rate may thus be a result of speed sensor bias, steering angle sensor bias and / or map data errors.
Thus, the known methods for calibration and bias detection while driving have serious flaws. It would be desirable to improve calibration of the on-board yaw rate sensor of a vehicle, in order to further improve the various calculations and predictions that are based on the yaw rate sensor measurements.
SUMMARY

It is therefore an object of this invention to solve at least some of the above problems and improve yaw rate sensor measurements.
According to a first aspect of the invention, this objective is achieved by a method in a vehicle for estimating bias of a yaw rate sensor on-board the vehicle. The method comprises measuring a yaw rate with the yaw rate sensor during a time period while passing a route segment. The method further comprises calculating a yaw angle by integrating the yaw rate measured by the yaw rate sensor over the time period. The method also comprises determining vehicle directional change velocity while the vehicle is passing the route segment, with a camera, based on the road curvature of the route segment, obtained by detecting lane markings of the route segment, and on the vehicle velocity while passing the route segment as determined by a speedometer of the vehicle. Further, the method comprises calculating a vehicle directional change of the vehicle by integrating the determined vehicle directional change velocity over the time period. Finally, the method comprises estimating the yaw rate sensor bias by comparing the calculated yaw angle with the calculated vehicle directional change.
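The first-aspect steps can be expressed in a few lines of code. This is a minimal sketch, not the patented implementation; the function name, the sampling scheme and the rectangular-rule integration are all assumptions:

```python
def estimate_yaw_rate_bias(yaw_rates, directional_change_velocities, dt):
    """Estimate yaw rate sensor bias over one route segment.

    yaw_rates: yaw rate sensor samples [rad/s]
    directional_change_velocities: directional change velocity samples
        [rad/s], e.g. camera-derived road curvature times vehicle speed
    dt: sample interval [s]
    """
    T = len(yaw_rates) * dt  # time period spent on the route segment
    # Integrate the measured yaw rate into a yaw angle (rectangular rule)
    psi_meas = sum(w * dt for w in yaw_rates)
    # Integrate the directional change velocity into a directional change
    psi_road = sum(c * dt for c in directional_change_velocities)
    # The bias is the accumulated angle difference spread over the period
    return (psi_meas - psi_road) / T
```

With a constant 0.01 rad/s sensor offset on a straight road, the function recovers exactly that offset.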
According to a second aspect of the invention, this objective is achieved by a control unit in a vehicle. The control unit aims at detecting bias of a yaw rate sensor on-board the vehicle. The control unit is configured to measure a yaw rate with the yaw rate sensor during a time period while passing a route segment. The control unit is further configured to calculate a yaw angle by integrating the yaw rate measured by the yaw rate sensor over the time period.
The control unit is also configured to determine vehicle directional change velocity while the vehicle is passing the route segment, with a camera, based on the road curvature of the route segment, obtained by detecting lane markings of the route segment, and on the measured vehicle velocity while passing the route segment as determined by a speedometer of the vehicle. The control unit is additionally configured to calculate a vehicle directional change of the vehicle by integrating the determined vehicle directional change velocity over the time period. The control unit is furthermore configured to detect the yaw rate sensor bias by comparing the calculated yaw angle with the calculated vehicle directional change.
Thanks to the described aspects, by measuring the yaw rate with the yaw rate sensor and comparing it with the curvature of a lane, determined either by a camera or by an on-board global positioning system/ digital compass, the yaw rate sensor may be adjusted while driving the vehicle. Thereby a more precise output of the yaw rate sensor is achieved, leading to better results in predictions and calculations wherein the yaw rate of the vehicle is required.
Other advantages and additional novel features will become apparent from the subsequent detailed description.
FIGURES

Embodiments of the invention will now be described in further detail with reference to the accompanying figures, in which:

Figure 1A illustrates a vehicle according to an embodiment of the invention;
Figure 1B illustrates an example of the vehicle according to Figure 1A, as regarded from above, according to an embodiment;
Figure 2 illustrates an example of a traffic scenario and an embodiment of the invention;
Figure 3 illustrates an example of a vehicle interior according to an embodiment;
Figure 4 depicts a vehicle as regarded from behind, passing a road bank, according to an embodiment;
Figure 5 is a flow chart illustrating an embodiment of the method;
Figure 6 is an illustration depicting a system according to an embodiment.
DETAILED DESCRIPTION

Embodiments of the invention described herein are defined as a method and a control unit, which may be put into practice in the embodiments described below. These embodiments may, however, be exemplified and realised in many different forms and are not to be limited to the examples set forth herein; rather, these illustrative examples of embodiments are provided so that this disclosure will be thorough and complete.
Still other objects and features may become apparent from the following detailed description, considered in conjunction with the accompanying drawings. It is to be understood, however, that the drawings are designed solely for purposes of illustration and not as a definition of the limits of the herein disclosed embodiments, for which reference is to be made to the appended claims. Further, the drawings are not necessarily drawn to scale and, unless otherwise indicated, they are merely intended to conceptually illustrate the structures and procedures described herein.
Figure 1A illustrates a scenario with a vehicle 100. The vehicle 100 is driving on a road in a driving direction 105.
The vehicle 100 may comprise e.g. a truck, a car, a motorcycle, a trailer, a bus, a bike, a train, a tram, or other similar manned or unmanned means of conveyance running e.g. on wheels.
The vehicle 100 may be driver controlled or driverless (i.e. autonomously controlled) in different embodiments. However, for enhanced clarity, the vehicle 100 is subsequently described as having a driver.
The vehicle 100 comprises a yaw rate sensor 110. The yaw rate sensor 110 is a gyroscopic device that measures the vehicle’s angular velocity around its vertical axis. The angle between the vehicle's heading and vehicle actual movement direction is called slip angle, which is related to the yaw rate. The yaw-rate sensor 110 may be of piezoelectric type or micromechanical type in different embodiments.
In the piezoelectric type of yaw rate sensor 110, the sensor is a "tuning fork"-shaped structure with four piezo elements (two on top and two below). During straight-ahead driving, the upper elements produce no voltage, as no Coriolis force acts on them. In cornering, however, the rotational movement causes the upper part of the tuning fork to leave the oscillatory plane, creating an alternating voltage which is proportional to the yaw rate and the oscillatory speed. The sign of the output signal depends on the direction of the turn, i.e. left or right.
In the micromechanical type of yaw rate sensor 110, the Coriolis acceleration is measured by a micro-mechanical capacitive acceleration sensor placed on an oscillating element. This acceleration is proportional to the product of yaw rate and the oscillatory velocity, which is maintained electronically at a constant value.
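The proportionality described above can be stated numerically with the textbook Coriolis relation a_c = 2·Ω·v, where Ω is the yaw rate and v the oscillatory velocity; the function name below is an illustrative assumption, not taken from the patent:

```python
def coriolis_acceleration(yaw_rate, oscillatory_velocity):
    """Coriolis acceleration of the oscillating proof mass [m/s^2].

    a_c = 2 * Omega * v: proportional to the yaw rate when the
    oscillatory velocity is held constant electronically, as the
    text describes for the micromechanical sensor type.
    """
    return 2.0 * yaw_rate * oscillatory_velocity
```

Because v is regulated to a constant value, the measured acceleration maps linearly onto yaw rate.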
Often, Micro Electro Mechanical System (MEMS) sensors may be used as yaw rate sensor 110 in the vehicle 100. However, such MEMS gyro rate sensors may have a bias due to thermal noise, and possibly also other factors, which may vary over time. It would thus be an advantage to be able to (re-)determine the yaw rate bias of the sensor 110 repeatedly, on a regular basis, or continuously.
The yaw rate sensor 110 is used in the vehicle 100 for various applications in the vehicle 100, as previously discussed in the background section.
The vehicle 100 may comprise a camera 120. In some embodiments, the camera 120 may comprise a global navigation satellite system device or digital compass, as will be further discussed in conjunction with the presentation of Figure 3. In the embodiment illustrated in Figure 1A, the camera 120 comprises a forwardly directed sensor, such as e.g. a camera. A common concept among the various embodiments disclosed herein is that a bias of the yaw rate sensor 110 is estimated by: measuring the yaw rate with the yaw rate sensor 110, calculating a vehicle directional change of the vehicle 100, and computing a difference between the respective values. In case the difference exceeds a threshold limit, the yaw rate sensor 110 may be adjusted by the computed difference.
The camera 120 in Figure 1A, i.e. forwardly directed sensor 120, may in some embodiments be configured for capturing images of the vehicle surroundings ahead of the vehicle 100 and thereby determining the road curvature. The camera 120/ sensor may for example detect and capture images of road markings.
In the illustrated embodiment, which is merely an arbitrary example, the forwardly directed camera 120/ sensor may be situated e.g. at the front of the vehicle 100, behind the windscreen of the vehicle 100.
Mounting the forwardly directed camera 120/ sensor behind the windshield has some advantages compared to externally mounted sensor systems. These include the possibility of using the windshield wipers for cleaning and of using the light from the headlights to illuminate objects in the sensor's field of view. The sensor is also protected from dirt, snow and rain, and to some extent from damage, vandalism and / or theft. Such a vehicle camera 120/ sensor may also be used for a variety of other tasks, such as e.g. lane departure warning.
The camera 120/ sensor may be directed towards the front of the vehicle 100, in the driving direction 105. The camera 120/ sensor may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, a radar, a lidar, an ultrasound device, a time-of-flight camera, or similar device, in different embodiments.
By comparing the yaw rate of the yaw rate sensor 110 with a curvature determined in a trustworthy way, i.e. one less prone to errors than the yaw rate sensor 110, the yaw rate sensor 110 may be calibrated while driving the vehicle 100, e.g. on a regular basis at predetermined time intervals. Thereby better predictions may be made based on the vehicle yaw rate.
Figure 1B illustrates the vehicle 100 previously illustrated in Figure 1A from an above view, according to an embodiment.
Figure 2 illustrates an example of a traffic scenario and an embodiment of the invention. The vehicle 100 is passing a route segment 200 comprising a curvature. In the illustration of Figure 2, the vehicle 100 is illustrated in a first position A at t=0, and a second position B at t=T.
The camera 120/ sensor may be turned and / or re-directed in different directions in some embodiments. The camera 120/ sensor may estimate a polynomial or clothoid from the detected lane markings. A clothoid directly contains curvature information, and the curvature can easily be extracted from a polynomial lane model (the formula is shown as an image in the original document). By accumulating the difference between the measured integrated yaw angle ψ_meas and the integrated heading change of the road ψ_road, the bias of the yaw rate sensor 110 may be calculated, provided that the accumulation is done in the same lane for a longer period (~60 s).
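The patent shows its curvature formula only as an image, so as an illustration the standard planar curvature of a cubic lane-marking polynomial y(x) = c0 + c1·x + c2·x² + c3·x³ can be computed as below; near x = 0 with a small heading term c1 this reduces to the common approximation κ ≈ 2·c2. The function is a sketch under these assumptions, not the patented formula:

```python
def curvature_from_lane_polynomial(c0, c1, c2, c3, x=0.0):
    """Curvature of y(x) = c0 + c1*x + c2*x**2 + c3*x**3 at position x.

    Uses the planar curvature formula kappa = y'' / (1 + y'**2)**1.5.
    c0 is the lateral offset and does not influence the curvature.
    """
    dy = c1 + 2.0 * c2 * x + 3.0 * c3 * x**2   # first derivative y'
    ddy = 2.0 * c2 + 6.0 * c3 * x              # second derivative y''
    return ddy / (1.0 + dy * dy) ** 1.5
```

Multiplying this curvature by the vehicle speed gives the directional change velocity that is integrated into ψ_road.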
The lane markings on the road may be detected by the camera 120/ sensor. Further, the lane markings may be interpreted by image recognition. The camera 120/ sensor thus may comprise or be connected to a control unit configured for image recognition/ computer vision and object recognition.
Computer vision is a technical field comprising methods for acquiring, processing, analysing and understanding images and, in general, high-dimensional data from the real world in order to produce numerical or symbolic information. A theme in the development of this field has been to duplicate the abilities of human vision by electronically perceiving and understanding an image. Understanding in this context means the transformation of visual images (the input of the retina) into descriptions of the world that can interface with other thought processes and elicit appropriate action. This image understanding can be seen as the disentangling of symbolic information from image data using models constructed with the aid of geometry, physics, statistics and learning theory. Computer vision may also be described as the enterprise of automating and integrating a wide range of processes and representations for vision perception.
The image data of the camera 120/ sensor may take many forms, such as e.g. images, video sequences, views from multiple cameras, or multi-dimensional data from a scanner.
The following steps may be taken, in some embodiments, during the calculation of the bias from comparing the yaw rate with the sensor data. When the vehicle is relatively centred in the lane and the speed is relatively high, the initial heading towards the road ψ(0) may be stored. Further, the measured yaw angle may be integrated using the yaw rate sensor:

ψ_meas(T) = ∫_0^T ω_meas(t) dt

Road curvature may be integrated over the same distance:

ψ_road(T) = ∫_0^T c_road(t) · v(t) dt

where ψ is the host vehicle's heading towards the road. When the measurement ends at position B, the final heading towards the road, ψ(T), is saved.
If T > Tmin and the final position in the lane at position B is relatively centred, the bias angle is calculated as:

ψ_bias = ψ_meas(T) − ψ_road(T)

ω_bias = ψ_bias / T

Integration may also be ended if T > Tmax, and a new integration may begin if the basic conditions are still fulfilled, according to some embodiments.
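The gating above (integrate while the vehicle stays lane-centred, emit an estimate once T exceeds Tmin, restart once T exceeds Tmax) can be sketched as a small accumulator. The class name, the concrete Tmin/Tmax values and the per-sample update are assumptions for illustration:

```python
class SegmentBiasEstimator:
    T_MIN = 60.0    # [s] minimum integration time (assumed value)
    T_MAX = 300.0   # [s] maximum integration time (assumed value)

    def __init__(self):
        self.reset()

    def reset(self):
        self.t = 0.0
        self.psi_meas = 0.0  # integrated measured yaw angle [rad]
        self.psi_road = 0.0  # integrated road heading change [rad]

    def step(self, yaw_rate, curvature, speed, centered, dt):
        """Feed one sample; returns a bias estimate [rad/s] or None."""
        if not centered:         # vehicle not lane-centred: abort segment
            self.reset()
            return None
        self.t += dt
        self.psi_meas += yaw_rate * dt
        self.psi_road += curvature * speed * dt
        bias = None
        if self.t > self.T_MIN:  # enough data for a valid estimate
            bias = (self.psi_meas - self.psi_road) / self.t
        if self.t > self.T_MAX:  # segment too long: start a fresh one
            self.reset()
        return bias
```

Feeding just over a minute of straight driving with a constant sensor offset yields that offset back as the bias estimate.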
Figure 3 illustrates an example of a vehicle interior of the vehicle 100, as it may be perceived by the driver of the vehicle 100.
The vehicle 100 comprises a control unit 310, wherein various calculations when estimating bias of the yaw rate sensor 110 may be performed.
The geographical position of the vehicle 100 may be determined by the positioning unit in the vehicle 100, which may be based on a satellite navigation system such as the Navigation Signal Timing and Ranging (Navstar) Global Positioning System (GPS), Differential GPS (DGPS), Galileo, GLONASS, or the like.
The determination of the geographical position of the positioning unit (and thereby also of the vehicle 100) may be made continuously, with a certain predetermined or configurable time interval, according to various embodiments.
Positioning by satellite navigation is based on distance measurement using triangulation from a number of satellites 320-1, 320-2, 320-3, 320-4. In this example four satellites 320-1, 320-2, 320-3, 320-4 are depicted, but this is merely an example. More than four satellites 320-1, 320-2, 320-3, 320-4 may be used for enhancing the precision, or for creating redundancy. The satellites 320-1, 320-2, 320-3, 320-4 continuously transmit information about time and date (for example, in coded form), identity (which satellite 320-1, 320-2, 320-3, 320-4 is broadcasting), status, and where the satellite 320-1, 320-2, 320-3, 320-4 is situated at any given time. The GPS satellites 320-1, 320-2, 320-3, 320-4 send information encoded with different codes, for example, but not necessarily, based on Code Division Multiple Access (CDMA). This allows information from an individual satellite 320-1, 320-2, 320-3, 320-4 to be distinguished from the others' information, based on a unique code for each respective satellite 320-1, 320-2, 320-3, 320-4. This information can then be transmitted to be received by an appropriately adapted positioning device comprised in the vehicle 100.
Distance measurement can according to some embodiments comprise measuring the difference in the time it takes for each respective satellite signal transmitted by the respective satellites 320-1, 320-2, 320-3, 320-4 to reach the positioning unit. As the radio signals travel at the speed of light, the distance to the respective satellite 320-1, 320-2, 320-3, 320-4 may be computed by measuring the signal propagation time.
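Since the signals travel at the speed of light, the distance computation described above is a one-liner; in practice the receiver clock error must also be solved for, which is one reason a fourth satellite is used. A trivial sketch (function name assumed):

```python
SPEED_OF_LIGHT = 299_792_458.0  # [m/s]

def satellite_distance(propagation_time_s):
    """Distance to a satellite [m] from the signal propagation time [s]."""
    return SPEED_OF_LIGHT * propagation_time_s
```

A typical GPS signal travel time of about 70 ms corresponds to roughly 21 000 km, matching the GPS orbital altitude.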
The positions of the satellites 320-1, 320-2, 320-3, 320-4 are known, as they are continuously monitored by approximately 15-30 ground stations located mainly along and near the earth's equator. Thereby the geographical position, i.e. latitude and longitude, of the vehicle 100 may be calculated by determining the distance to at least three satellites 320-1, 320-2, 320-3, 320-4 through triangulation. For determination of altitude, signals from four satellites 320-1, 320-2, 320-3, 320-4 may be used according to some embodiments.
Having determined the geographical position of the positioning unit in this way (or in another way), the direction of travel of the vehicle 100 may be determined during driving.
Alternatively, or in addition (for possible redundancy), the heading change of the road may be measured with an on-board digital compass, such as a magnetometer, a fibre-optic gyrocompass or similar.
The heading change over the integration period may then be derived from the difference between the heading when the measurement started at t=0 and the heading at the end of the measurement at t=T.
Then, the following steps may be taken. When the vehicle speed is relatively high, the initial heading towards a global landmark (e.g. North), ψ(0), may be stored. The measured yaw angle relative to the global coordinates is integrated using the yaw rate sensor:

ψ_meas(T) = ∫_0^T ω_meas(t) dt

When the measurement ends, the final heading towards the global landmark at t=T, ψ(T), may be stored. If T > Tmin, the bias may be calculated as:

ω_bias = (ψ_meas(T) − (ψ(T) − ψ(0))) / T

The control unit 310 may communicate with the camera 120, the positioning unit and the yaw rate sensor 110, e.g. via a wired or wireless communication bus. The communication bus may comprise e.g. a Controller Area Network (CAN) bus, a Media Oriented Systems Transport (MOST) bus, or similar. However, the communication may alternatively be made over a wireless connection comprising, or at least inspired by, any wireless communication technology such as Wi-Fi, Wireless Local Area Network (WLAN), Ultra Mobile Broadband (UMB), Bluetooth (BT), Near Field Communication (NFC), Radio-Frequency Identification (RFID), Z-wave, ZigBee, IPv6 over Low power Wireless Personal Area Networks (6LoWPAN), Wireless Highway Addressable Remote Transducer (HART) Protocol, Wireless Universal Serial Bus (USB), or optical communication such as Infrared Data Association (IrDA) or infrared transmission, to name but a few possible examples of wireless communication in some embodiments.
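The compass/GNSS variant described above amounts to comparing the integrated yaw angle with an externally observed heading change over the same period. A hedged sketch, where the function name and the sampling scheme are assumptions:

```python
def bias_from_global_heading(yaw_rates, heading_start, heading_end, dt):
    """Yaw rate bias [rad/s] from an external heading reference.

    yaw_rates: yaw rate sensor samples [rad/s] over the period
    heading_start, heading_end: psi(0) and psi(T) from a GNSS unit
        or digital compass [rad]
    dt: sample interval [s]
    """
    T = len(yaw_rates) * dt
    psi_meas = sum(w * dt for w in yaw_rates)    # integrated yaw angle
    delta_heading = heading_end - heading_start  # psi(T) - psi(0)
    return (psi_meas - delta_heading) / T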
Figure 4 illustrates a vehicle 100 passing a curvature with a road banking 400. The road banking 400 has an angle α.
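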
In some embodiments, an approximation may be made that the road banking 400 is zero, or negligible, while comparing the rotation angle with a global reference, meaning that the effects of a banked road 400 are ignored. Alternatively, when a significant road banking 400 is detected, the calculation may be aborted. If the road banking 400 can be measured with sufficient accuracy, its effects may be handled with additional transformations and calculations, according to some embodiments.
Road banking 400 may be determined by measuring sideways acceleration with a sensor, in some embodiments. In other embodiments, the size of the road banking may be stored in a database, associated with a geographical position. By determining the geographical position of the vehicle 100, the size of the road banking 400 may be extracted, in some embodiments.
When the curvature of the road is determined by the sensor 120, it may be noted that the curvature is determined in the plane of the road banking 400, independently of the size of the road banking 400.
According to some embodiments, a measurement may be taken if/ when T > Tmin for the current measurement method, and a weighted accumulation or low-pass filtering may be done to calculate a final value from several measurement occasions; or a measurement may have been taken by another method, such as standstill bias estimation.
An example of the final yaw rate bias can be described with the following accumulation:

ω_bias = Σ_n W(n) · ω_bias(n) / Σ_n W(n)

where W(n) is a weight determined by the measurement method; typically a standstill measurement would qualify for a higher weight than an on-road calibration, because it requires a shorter time for the calibration.
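The weighted accumulation can be sketched as a normalised weighted average over the collected estimates; the exact accumulation in the patent is shown only as an image, so the normalisation below is an assumption:

```python
def combine_bias_estimates(estimates, weights):
    """Combine bias estimates [rad/s] from several measurement occasions.

    weights: W(n) in [0, 1]; e.g. a standstill measurement would get a
    higher weight than an on-road calibration.
    """
    if len(estimates) != len(weights) or not estimates:
        raise ValueError("need equally many estimates and weights")
    total_w = sum(weights)
    # Normalised weighted average: trusted methods dominate the result
    return sum(w * b for w, b in zip(weights, estimates)) / total_w
```

A standstill estimate with weight 3 and an on-road estimate with weight 1 thus pull the combined value toward the standstill result.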
Outliers may be filtered out and rejected when a number of measurements have been taken, if necessary.
Figure 5 illustrates an example of a method 500 according to an embodiment. The flow chart in Figure 5 shows the method 500 for use in a vehicle 100. The method 500 aims at estimating bias of a yaw rate sensor 110 on-board the vehicle 100.
The vehicle 100 may be e.g. a truck, a bus, a car, or similar means of conveyance.
The yaw rate sensor bias may be determined by measuring the yaw rate with the yaw rate sensor 110, estimating the yaw rate by another measurement methodology, e.g. a camera 120, and comparing the two.
In order to correctly be able to detect and estimate the yaw rate sensor bias, the method 500 may comprise a number of steps 501-508. However, some of these steps 501-508 may be performed in various alternative manners. Some method steps may only be performed in some optional embodiments, such as e.g. steps 504 and 507-508. Further, the described steps 501-508 may be performed in a somewhat different chronological order than the numbering suggests. The method 500 may comprise the subsequent steps:

Step 501 comprises measuring a yaw rate with the yaw rate sensor 110 during a time period T while passing a route segment 200.
The yaw rate may be measured for a time period exceeding a minimum time limit, while not exceeding a maximum time limit, in some embodiments.
Step 502 comprises calculating a yaw angle by integrating the yaw rate measured 501 by the yaw rate sensor 110 over the time period T.
The yaw angle may be calculated by integration in some embodiments:

ψ_meas = ∫_0^T ω_meas(t) dt

where ψ_meas is the total measured change in yaw angle, ω_meas(t) is the measured yaw rate, 0 is the point in time when the vehicle 100 enters the route segment 200 and T is the point in time when the vehicle 100 exits the route segment 200.
The yaw angle may be calculated by integration, in some embodiments wherein the camera 120 may comprise a global navigation satellite system device or digital compass:

ψ_meas = ∫_0^T ω_meas(t) dt

where ψ_meas is the yaw angle, ω_meas is the measured yaw rate, 0 is the point in time when the vehicle 100 enters the route segment 200 and T is the point in time when the vehicle 100 exits the route segment 200.
Step 503 comprises determining vehicle directional change velocity while the vehicle 100 is passing the route segment 200, with a camera 120.
The vehicle directional change velocity may be determined based on road curvature of the route segment 200, by detecting lane markings of the route segment 200, and vehicle velocity of the vehicle 100 while passing the route segment 200 as determined by a speedometer of the vehicle 100 in some embodiments wherein the camera 120 may comprise a camera.
In some embodiments, wherein the camera 120 may comprise a global navigation satellite system device or digital compass, the vehicle directional change velocity while the vehicle 100 is passing the route segment 200 may be determined by determining a change in heading, ψ(T) − ψ(0), over the route segment 200.
The vehicle directional change velocity may in some embodiments be determined for a time period exceeding a minimum time limit, while not exceeding a maximum time limit.
Step 504, which may be comprised only in some alternative embodiments, comprises estimating road banking 400 of the route segment 200.
In some alternative embodiments, the vehicle directional change velocity is determined 503 taking the estimated 504 road banking 400 into regard.
The vehicle directional change may in some embodiments be calculated by integration over the time period T:

Ψ_road = ∫₀ᵀ c_road(t) · (v_x(t) cos θ(t) + v_y(t) sin θ(t)) dt

where θ(t) is the host vehicle's heading towards the route segment, v_x(t) is the vehicle longitudinal velocity of the vehicle 100, v_y(t) is the vehicle lateral velocity of the vehicle 100 and c_road(t) is the curvature of the road as measured by a sensor 120 measuring road markings.

Step 505 comprises calculating a vehicle directional change of the vehicle 100.
In some embodiments wherein the camera 120 comprises a camera, the vehicle directional change may be calculated by integrating the determined 503 vehicle directional change velocity over the time period T.
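A minimal sketch of step 505, under the same simplifying assumption that the per-sample heading-change rate is curvature times longitudinal speed:

```python
def directional_change(curvatures, speeds, dt):
    """Total directional change (rad) over the segment: the per-sample
    heading-change rate (curvature * longitudinal speed) integrated
    over the time period T with a simple Riemann sum."""
    return sum(c * v * dt for c, v in zip(curvatures, speeds))

# Ten seconds (1000 samples at 100 Hz) on a 500 m radius curve at a
# constant 25 m/s gives a 0.5 rad directional change.
psi_road = directional_change([1 / 500.0] * 1000, [25.0] * 1000, dt=0.01)
```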
Step 506 comprises estimating the yaw rate sensor bias by comparing the calculated 502 yaw angle with the calculated 505 vehicle directional change.
The yaw rate sensor bias, ψ̇_bias, may be estimated by:

ψ̇_bias = (Ψ_meas − Ψ_road) / T

Alternatively, in some embodiments, the yaw rate sensor bias, ψ̇_bias, may be estimated by:

ψ̇_bias = (Ψ_meas − (θ(T) − θ(0))) / T

where T is the point in time when the vehicle 100 exits the route segment 200.
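In discrete form, the comparison of step 506 reduces to a one-liner; this sketch assumes both angle estimates cover the same time period T:

```python
def estimate_bias(psi_meas, psi_road, T):
    """Yaw-rate sensor bias (rad/s): the gyro-integrated yaw angle
    minus the camera-derived directional change, divided by the
    duration T of the route segment."""
    return (psi_meas - psi_road) / T

# The gyro integrates to 0.25 rad while the camera-derived change is
# 0.20 rad over a 10 s segment, implying a 0.005 rad/s bias.
bias = estimate_bias(0.25, 0.20, T=10.0)
```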
Step 507, which may be comprised only in some alternative embodiments, comprises combining a plurality of yaw rate sensor bias estimates, based on vehicle directional change velocity determined 503 at distinct moments in time and / or by using distinct methods, and by standstill bias estimation:

ψ̇_bias = Σₙ W(n) · ψ̇_bias(n) / Σₙ W(n)

where W(n) is a weight, 0 ≤ W(n) ≤ 1, determined by the used measurement method.
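The weighted fusion of step 507 might look as follows; normalising by the weight sum is an assumption on our part, since the patent only states that 0 ≤ W(n) ≤ 1:

```python
def combine_bias_estimates(biases, weights):
    """Fuse bias estimates obtained at distinct moments and/or by
    distinct methods (including standstill estimation) as a weighted
    average, each weight reflecting trust in its measurement method."""
    total = sum(weights)
    return sum(b * w for b, w in zip(biases, weights)) / total

# A standstill estimate trusted fully, a driving estimate half as much:
fused = combine_bias_estimates([0.004, 0.006], [1.0, 0.5])
```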
Step 508, which may be comprised only in some alternative embodiments, comprises adjusting the yaw rate sensor 110 with the estimated 506 yaw rate sensor bias. The yaw rate sensor 110 may thereby be calibrated.
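Step 508 then amounts to subtracting the estimated bias from subsequent readings; a sketch with illustrative names:

```python
def corrected_yaw_rate(raw, bias):
    """Calibrated yaw-rate reading: the raw sensor value minus the
    estimated sensor bias (both in rad/s)."""
    return raw - bias

# A raw reading of 0.025 rad/s with a 0.005 rad/s estimated bias:
corrected = corrected_yaw_rate(0.025, 0.005)
```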
Figure 6 illustrates an embodiment of a system 600 in a vehicle 100 for detecting bias of a yaw rate sensor 110 on-board the vehicle 100. The system 600 may perform at least some of the previously described steps 501-508 according to the method 500 described above and illustrated in Figure 5.
The system 600 comprises at least one control unit 310 in the vehicle 100, for detecting bias of the yaw rate sensor 110 on-board the vehicle 100. The control unit 310 is configured to measure a yaw rate with the yaw rate sensor 110 during a time period T while passing a route segment 200. Further, the control unit 310 is configured to calculate a yaw angle by integrating the yaw rate measured by the yaw rate sensor 110 over the time period T. Also, the control unit 310 is configured to determine vehicle directional change velocity while the vehicle 100 is passing the route segment 200, with a camera 120. The control unit 310 is furthermore configured to calculate a vehicle directional change of the vehicle 100. In addition, the control unit 310 is configured to detect the yaw rate sensor bias by comparing the calculated yaw angle with the calculated vehicle directional change.
In some embodiments, wherein the camera 120 comprises a camera, the control unit 310 may be further configured to determine the vehicle directional change velocity based on road curvature of the route segment 200, by detecting lane markings of the route segment 200, and measured vehicle velocity of the vehicle 100 while passing the route segment 200 as determined by a speedometer of the vehicle 100. Also, the control unit 310 may be configured to calculate the vehicle directional change of the vehicle 100 by integrating the determined vehicle directional change velocity over the time period T.
In yet some embodiments, wherein the camera 120 comprises a global navigation satellite system device or digital compass, the control unit 310 may be configured to determine the vehicle directional change velocity while the vehicle 100 is passing the route segment 200 by determining an initial heading θ(0) when entering the route segment 200, and determining a final heading θ(T) when exiting the route segment 200. Further, the control unit 310 may be configured to calculate the yaw angle by integration:

Ψ_meas = ∫₀ᵀ ψ̇_meas(t) dt

where Ψ_meas is the yaw angle, ψ̇_meas(t) is the measured yaw rate, 0 is the point in time when the vehicle 100 enters the route segment 200 and T is the point in time when the vehicle 100 exits the route segment 200. The control unit 310 may also be further configured to detect the yaw rate sensor bias, ψ̇_bias, by:

ψ̇_bias = (Ψ_meas − (θ(T) − θ(0))) / T

where T is the point in time when the vehicle 100 exits the route segment 200.
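For the satellite-navigation/compass embodiment, this bias detection can be sketched as below; heading values are assumed to be in radians and already unwrapped (no ±π discontinuities):

```python
def estimate_bias_from_headings(psi_meas, heading_entry, heading_exit, T):
    """Yaw-rate sensor bias (rad/s): the gyro-integrated yaw angle
    minus the heading change theta(T) - theta(0) over the route
    segment, divided by the duration T."""
    return (psi_meas - (heading_exit - heading_entry)) / T

# The gyro integrates to 0.26 rad while the GNSS heading changed by
# 0.20 rad over a 12 s segment.
bias = estimate_bias_from_headings(0.26, 1.00, 1.20, T=12.0)
```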
The control unit 310 comprises a receiving circuit 610 configured for receiving a signal from the yaw rate sensor 110 and from the camera 120.
Further, the control unit 310 comprises a processor 620 configured for performing at least some steps of the previously described method 500, according to some embodiments.
Such processor 620 may comprise one or more instances of a processing circuit, i.e. a Central Processing Unit (CPU), a processing unit, a processing circuit, an Application Specific Integrated Circuit (ASIC), a microprocessor, or other processing logic that may interpret and execute instructions. The herein utilised expression “processor” may thus represent processing circuitry comprising a plurality of processing circuits, such as, e.g., any, some or all of the ones enumerated above.
Furthermore, the control unit 310 may comprise a memory 625 in some embodiments. The optional memory 625 may comprise a physical device utilised to store data or programs, i.e., sequences of instructions, on a temporary or permanent basis. According to some embodiments, the memory 625 may comprise integrated circuits comprising silicon-based transistors. The memory 625 may comprise e.g. a memory card, a flash memory, a USB memory, a hard disc, or another similar volatile or non-volatile storage unit for storing data such as e.g. ROM (Read-Only Memory), PROM (Programmable Read-Only Memory), EPROM (Erasable PROM), EEPROM (Electrically Erasable PROM), etc. in different embodiments.
Further, the control unit 310 may comprise a signal transmitter 630 in some embodiments. The signal transmitter 630 may be configured for transmitting a signal to the yaw rate sensor 110, e.g. a command to adjust the sensor 110 with the detected bias; and / or to a presentational device such as a display, loudspeaker etc., for informing the driver of the detected yaw rate sensor bias.
In addition, the system 600 also comprises at least one yaw rate sensor 110, for determining yaw rate of the vehicle 100.
The system 600 furthermore comprises a camera 120. The camera 120 may comprise e.g. a camera, a stereo camera, an infrared camera, a video camera, a radar, a lidar, an ultrasonic sensor, a time-of-flight camera or a thermal camera, or similar. The camera 120 may alternatively, or in addition, comprise a global navigation satellite system device or digital compass.
The camera 120 utilised for performing at least a part of the method 500 may in some embodiments have another main purpose than performing the method 500, i.e. it may already exist in the vehicle 100.
The above described method steps 501-508 to be performed in the vehicle 100 may be implemented through the one or more processors 620 within the control unit 310, together with a computer program product for performing at least some of the functions of the steps 501-508. Thus, a computer program product comprising instructions for performing the steps 501-508 in the control unit 310 may perform the method 500, comprising at least some of the steps 501-508 for detecting bias of the yaw rate sensor 110 on-board the vehicle 100, when the computer program is loaded into the one or more processors 620 of the control unit 310.
Further, some embodiments of the invention may comprise a vehicle 100, comprising the control unit 310, for detecting bias of the yaw rate sensor 110, according to at least some of the method steps 501-508.
The computer program product mentioned above may be provided for instance in the form of a data carrier carrying computer program code for performing at least some of the steps 501-508 according to some embodiments when being loaded into the one or more processors 620 of the control unit 310. The data carrier may be, e.g., a hard disk, a CD ROM disc, a memory stick, an optical storage device, a magnetic storage device or any other appropriate medium such as a disk or tape that may hold machine readable data in a non-transitory manner. The computer program product may furthermore be provided as computer program code on a server and downloaded to the control unit 310 remotely, e.g., over an Internet or an intranet connection.
The terminology used in the description of the embodiments as illustrated in the accompanying drawings is not intended to be limiting of the described method 500; the control unit 310; the computer program; the system 600 and / or the vehicle 100. Various changes, substitutions and / or alterations may be made, without departing from invention embodiments as defined by the appended claims.
As used herein, the term "and/ or" comprises any and all combinations of one or more of the associated listed items. The term “or” as used herein, is to be interpreted as a mathematical OR, i.e., as an inclusive disjunction; not as a mathematical exclusive OR (XOR), unless expressly stated otherwise. In addition, the singular forms "a", "an" and "the" are to be interpreted as “at least one”, thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" and / or "comprising" specify the presence of stated features, actions, integers, steps, operations, elements, and / or components, but do not preclude the presence or addition of one or more other features, actions, integers, steps, operations, elements, components, and / or groups thereof. A single unit such as e.g. a processor may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/ distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms such as via Internet or other wired or wireless communication system.


PATENT CLAIMS
1. A method (500) in a vehicle (100) for estimating bias of a yaw rate sensor (110) onboard the vehicle (100), wherein the method (500) comprises: measuring (501) a yaw rate with the yaw rate sensor (110) during a time period (T) while passing a route segment (200); calculating (502) a yaw angle by integrating the yaw rate measured (501) by the yaw rate sensor (110) over the time period (T); determining (503) vehicle directional change velocity while the vehicle (100) is passing the route segment (200), with a camera (120), based on road curvature of the route segment (200), by detecting lane markings of the route segment (200), and vehicle velocity of the vehicle (100) while passing the route segment (200) as determined by a speedometer of the vehicle (100); calculating (505) a vehicle directional change of the vehicle (100) by integrating the determined (503) vehicle directional change velocity over the time period (T); and estimating (506) the yaw rate sensor bias by comparing the calculated (502) yaw angle with the calculated (505) vehicle directional change.
2. The method (500) according to claim 1, wherein: the yaw angle is calculated (502) by integration:

Ψ_meas = ∫₀ᵀ ψ̇_meas(t) dt

where Ψ_meas is the total measured change in yaw angle, ψ̇_meas(t) is measured yaw rate, 0 is the point in time when the vehicle (100) enters the route segment (200) and T is the point in time when the vehicle (100) exits the route segment (200); the vehicle directional change is calculated (505) by integration over the time period (T):

Ψ_road = ∫₀ᵀ c_road(t) · (v_x(t) cos θ(t) + v_y(t) sin θ(t)) dt

where θ(t) is the host vehicle’s heading towards the route segment, v_x(t) is vehicle longitudinal velocity of the vehicle (100), v_y(t) is vehicle lateral velocity of the vehicle (100) and c_road(t) is the curvature of the road as measured by the camera (120) measuring road markings; and wherein the yaw rate sensor bias, ψ̇_bias, is estimated (506) by:

ψ̇_bias = (Ψ_meas − Ψ_road) / T
3. The method (500) according to claim 2, further comprising: estimating (504) road banking (400) of the route segment (200); and wherein the vehicle directional change velocity is determined (503) taking the estimated (504) road banking (400) into regard.
4. The method (500) according to any of claims 1 -3, wherein the yaw rate is measured (501) and the vehicle directional change velocity is determined (503) for a time period exceeding a minimum time limit, while not exceeding a maximum time limit.
5. The method (500) according to any of claims 1-4, further comprising: combining (507) a plurality of yaw rate sensor bias estimates, based on vehicle directional change velocity determined (503) at distinct moments in time and / or by using distinct methods, according to any of claims 1-4 and by standstill bias estimation:

ψ̇_bias = Σₙ W(n) · ψ̇_bias(n) / Σₙ W(n)

where W(n) is a weight, 0 ≤ W(n) ≤ 1, determined by the used measurement method.
6. The method (500) according to any of claims 1-5, further comprising: adjusting (508) the yaw rate sensor (110) with the estimated (506) yaw rate sensor bias.
7. A control unit (310) in a vehicle (100), for detecting bias of a yaw rate sensor (110) on-board the vehicle (100), wherein the control unit (310) is configured to: measure a yaw rate with the yaw rate sensor (110) during a time period (T) while passing a route segment (200); calculate a yaw angle by integrating the yaw rate measured by the yaw rate sensor (110) over the time period (T); determine vehicle directional change velocity while the vehicle (100) is passing the route segment (200), with a camera (120), based on road curvature of the route segment (200), by detecting lane markings of the route segment (200), and measured vehicle velocity of the vehicle (100) while passing the route segment (200) as determined by a speedometer of the vehicle (100); calculate a vehicle directional change of the vehicle (100) by integrating the determined vehicle directional change velocity over the time period (T); and detect the yaw rate sensor bias by comparing the calculated yaw angle with the calculated vehicle directional change.
8. A computer program comprising program code for performing a method (500) according to any of claims 1-6 when the computer program is executed in a control unit (310), according to claim 7.
9. A system (600) for detecting bias of a yaw rate sensor (110) on-board a vehicle (100), which system (600) comprises: a control unit (310) according to claim 7; a yaw rate sensor (110); and a camera (120).
