SE2350556A1 - A method for preparing an extrinsic calibration procedure - Google Patents
A method for preparing an extrinsic calibration procedure
- Publication number
- SE2350556A1
- Authority
- SE
- Sweden
- Prior art keywords
- calibration
- trajectory
- sensor
- sensors
- land vehicle
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/40—Means for monitoring or calibrating
- G01S7/4004—Means for monitoring or calibrating of parts of a radar system
- G01S7/4026—Antenna boresight
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/40—Means for monitoring or calibrating
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01D—MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
- G01D18/00—Testing or calibrating apparatus or arrangements provided for in groups G01D1/00 - G01D15/00
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01P—MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
- G01P21/00—Testing or calibrating of apparatus or devices covered by the preceding groups
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/865—Combination of radar systems with lidar systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/40—Means for monitoring or calibrating
- G01S7/4052—Means for monitoring or calibrating by simulation of echoes
- G01S7/4082—Means for monitoring or calibrating by simulation of echoes using externally generated reference signals, e.g. via remote reflector or transponder
- G01S7/4086—Means for monitoring or calibrating by simulation of echoes using externally generated reference signals, e.g. via remote reflector or transponder in a calibrating environment, e.g. anechoic chamber
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
- G01S7/4972—Alignment of sensor
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/932—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles using own vehicle data, e.g. ground speed, steering wheel direction
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9327—Sensor installation details
- G01S2013/93271—Sensor installation details in the front of the vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9327—Sensor installation details
- G01S2013/93272—Sensor installation details in the back of the vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9327—Sensor installation details
- G01S2013/93274—Sensor installation details on the side of the vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/40—Means for monitoring or calibrating
- G01S7/4004—Means for monitoring or calibrating of parts of a radar system
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/04—Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
- G06Q10/047—Optimisation of routes or paths, e.g. travelling salesman problem
Abstract
The present disclosure relates to calibration of sensors, and in particular to calibration of sensors on land vehicles. According to a first aspect, the disclosure relates to a method for preparing an extrinsic calibration procedure, the calibration procedure comprising calibrating one or more sensors (11) arranged on a land vehicle (1) by driving a calibration trajectory (40) within a limited area (2). The method comprises generating (S4) a calibration trajectory (40) within boundaries of the limited area (2) such that it includes a set of motion primitives determined to together provide sufficient observability to achieve calibration performance within certain bounds, while optimizing time efficiency of driving the calibration trajectory, and providing (S5) the generated calibration trajectory for the land vehicle to drive during the calibration procedure.
Description
A method for preparing an extrinsic calibration procedure

Technical Field

The present disclosure relates to calibration of sensors, and in particular to calibration of sensors on land vehicles.
Background

Autonomous land vehicles and Advanced Driver-Assistance System, ADAS, technologies rely on accurate perception systems. These perception systems use information from a multitude of sensors of different modalities (such as cameras, RADARs, LIDARs, and others) to provide a rich view of the surroundings. Before information from these sensors can be combined into a single reference frame, a transformation between the sensors must be determined. The exact locations of the sensors are then estimated, which is also referred to as extrinsic calibration.
Extrinsic calibration needs to be performed relatively often, and particularly whenever a land vehicle is serviced or when it leaves the factory. The calibration includes performing a set of manoeuvres (multiple driving manoeuvres and turns in different directions) until the sensors have gathered enough data to perform the calibration. The efficiency of calibration is related to a measurement of the information associated with each manoeuvre performed. To accurately perform calibration, one needs to ensure that enough "informative" manoeuvres have been performed by the land vehicle. The information of a manoeuvre, or of a set of manoeuvres, is not easily measurable by the naked eye.
Today, different features within the common field of view of the sensors are matched for calibration performed offline. To acquire calibration data for offline calibration, an extensive set of manoeuvres is performed, which often errs on the side of caution. This is of course not ideal, as it is time consuming. In the future, calibration is expected to be performed in the land vehicle itself without manual intervention, i.e. online. As of today, there is no process for determining the manoeuvres required for online calibration in a constrained space.
Summary

It is an objective of the present disclosure to alleviate at least some of the drawbacks of the prior art. It is a further objective to provide techniques that make calibration of sensors arranged on land vehicles more efficient. In particular, it is an objective to make it possible to perform only the smallest number of manoeuvres needed to achieve a reasonable level of calibration accuracy. These objectives and others are at least partly achieved by the method, system, and land vehicle according to the independent claims, and by the embodiments according to the dependent claims.
According to a first aspect, the disclosure relates to a method for preparing an extrinsic calibration procedure, the calibration procedure comprising calibrating one or more sensors arranged on a land vehicle by driving a calibration trajectory within a limited area. The method comprises generating a calibration trajectory within boundaries of the limited area such that it includes a set of motion primitives determined to together provide sufficient observability to achieve calibration performance within certain bounds, while optimizing time efficiency of driving the calibration trajectory, and providing the generated calibration trajectory for the land vehicle to drive during the calibration procedure.

The proposed method enables generating a calibration trajectory which is tailored to the specific sensor configuration of the land vehicle. Furthermore, the calibration trajectory is tailored to fit the different workshop and test track configurations available. The generated trajectory also minimizes the amount of time spent in the calibration process, while still ensuring that enough calibration information is gathered. Hence, the proposed method contributes to more efficient calibration of one or more sensors, which may be of different types.

In some embodiments, the generating comprises solving an optimization problem comprising ensuring that the generated calibration trajectory includes the set of motion primitives and stays within the boundaries of the limited area, while minimizing a cost parameter defining the time required to drive the calibration trajectory.
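The optimization problem in this embodiment can be illustrated with a loose, brute-force sketch. All names, durations, displacements, and the rectangular area below are hypothetical illustration values, and a real implementation would use a motion planner and a vehicle model rather than exhaustive enumeration:

```python
from itertools import product

# Hypothetical data: each required primitive type has variants with
# (name, duration in seconds, net displacement in metres).
PRIMITIVES = {
    "turn":  [("tight_turn", 8.0, (5.0, 5.0)),   ("wide_turn", 14.0, (15.0, 15.0))],
    "accel": [("short_accel", 6.0, (20.0, 0.0)), ("long_accel", 10.0, (45.0, 0.0))],
}
AREA = (60.0, 30.0)   # rectangular limited area (width, height), metres
START = (5.0, 5.0)    # start position inside the area

def stays_inside(variants):
    """Constraint: the cumulative displacement must stay within the boundaries."""
    x, y = START
    for _, _, (dx, dy) in variants:
        x, y = x + dx, y + dy
        if not (0.0 <= x <= AREA[0] and 0.0 <= y <= AREA[1]):
            return False
    return True

def plan_calibration_trajectory():
    """Pick one variant of every required motion primitive so that the
    trajectory stays inside the limited area and the total driving time
    (the cost parameter) is minimal."""
    best, best_time = None, float("inf")
    for combo in product(*PRIMITIVES.values()):
        if not stays_inside(combo):
            continue
        total = sum(duration for _, duration, _ in combo)
        if total < best_time:
            best, best_time = combo, total
    return best, best_time
```

With the values above, the planner selects the tight turn and the short acceleration, since the long acceleration combined with the wide turn would leave the area.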
By solving an optimization problem, the best possible calibration trajectory may be generated.

In some embodiments, the generating comprises using a motion planner and a land vehicle model. Hence, the proposed method can be implemented utilizing already existing techniques.

In some embodiments, the method comprises obtaining boundaries of the limited area. By obtaining the boundaries of the limited area, the calibration trajectory can be tailored to a particular area.

In some embodiments, the method comprises obtaining, based on the sensor type of the one or more sensors, a set of motion primitives that together provide sufficient observability to achieve calibration performance within the certain bounds. Hence, the set of motion primitives can be pre-computed and obtained when calibration is to be performed.

In some embodiments, the obtaining comprises retrieving the motion primitives, based on the sensor type, from a database storing sets of motion primitives, that together provide sufficient observability to achieve calibration performance within certain bounds, for different types of sensors. The database can store motion primitives for various sensors and can be used by multiple vehicles.

In some embodiments, the method comprises determining, for one or more sensor types for collecting sensor data for calibration, a set of motion primitives that together provide sufficient observability to achieve calibration performance within the certain bounds. By determining the motion primitives per sensor type, several vehicles using the same sensor type can use the determined set of motion primitives.

In some embodiments, the determining comprises, for each sensor type, collecting, while driving the land vehicle along a sample trajectory, calibration data and onboard sensor data indicative of a vehicle pose, wherein the collected calibration data provide sufficient observability to achieve the calibration performance within the certain bounds.
In these embodiments, the method further comprises, for each sensor type, determining a distribution of calibration data along the sample trajectory and selecting, based on the determined distribution and the collected onboard sensor data, a set of motion primitives corresponding to segments of the sample trajectory that together contain an amount of calibration information that is itself sufficient to achieve the calibration performance within the certain bounds. Hence, motion primitives associated with a specific sensor type can be determined by driving a sample trajectory.

In some embodiments, the onboard sensor data comprises inertial motion data, in particular angular velocity and/or linear acceleration of the sensor and of the reference frame. Thus, the vehicle pose can be determined based on motion data.

In some embodiments, the inertial motion data comprises angular velocity in three degrees of freedom and linear acceleration in three degrees of freedom, and the selecting comprises determining a set of motion primitives that corresponds to manoeuvres causing excitation in the inertial motion data in all, or several, of the degrees of freedom. Thus, the set of motion primitives can be determined by analyzing raw data from motion sensors.

In some embodiments, the inertial motion data is collected by the one or more sensors themselves, or by motion sensors collocated with the one or more sensors. Hence, data collected by other sensors, collocated with but different from the sensors to be calibrated, can also be utilized.

In some embodiments, the method comprises obtaining one or more trajectory segments within the limited area that can provide visual or structural calibration information suitable for calibration of the sensor type, and the generating comprises including the obtained one or more trajectory segments in the calibration trajectory.
Thus, the method can be extended to also consider visual and structural features in the limited area.

In some embodiments, the obtaining comprises obtaining vision data corresponding to visual data collected by the one or more sensors when driving around with the land vehicle in the limited area, and identifying, based on the obtained vision data, segments of the trajectory including visual content corresponding to calibration information suitable for calibration of the sensor type. Hence, the visual and/or structural information may be detected by driving around in the limited area.

In some embodiments, the method comprises obtaining the vision data based on a simulation of the land vehicle driving in the limited area. Therefore, no real driving in the limited area is required.

In some embodiments, the method comprises controlling the land vehicle to drive along the provided calibration trajectory. Hence, the time required for the calibration procedure can be reduced.

In some embodiments, the one or more sensors comprise one or more of cameras, LIDARs and RADARs. Hence, the method is applicable for various sensor types and also for land vehicles with several sensor types.
According to a second aspect, the disclosure relates to use of the method according to the first aspect, for online extrinsic calibration of a plurality of vehicle sensors arranged on the same land vehicle.
According to a third aspect, the disclosure relates to a computer program comprising instructions which, when the computer program is executed by a computer, cause the computer to carry out the method according to the first aspect.
According to a fourth aspect, the disclosure relates to a computer-readable medium comprising instructions which, when executed by a computer, cause the computer to carry out the method according to the first aspect.
According to a fifth aspect, the disclosure relates to a control arrangement configured to perform the method according to the first aspect.
According to a sixth aspect, the disclosure relates to a land vehicle comprising the control arrangement according to the fifth aspect.
Brief description of the drawings

The embodiments disclosed herein are illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings. Like reference numerals refer to corresponding parts throughout the drawings, in which: Fig. 1 illustrates sensors arranged on a land vehicle.
Fig. 2a and 2b illustrate a non-informative manoeuvre and an informative manoeuvre.
Fig. 3 illustrates a sample trajectory based on state estimation, where the segment selection is done based on an observability criterion which is computed using singular value decomposition of the mutual information matrix.
Fig. 4 illustrates a calibration trajectory where selected motion primitives are combined to generate an optimal path in a limited space for capturing all excitations needed for calibration.
Fig. 5a is a flow chart of an example method according to the first aspect.
Fig. 5b illustrates determination of motion primitives in further detail.
Fig. 6 illustrates a control arrangement configured to perform the proposed method.
Fig. 7a illustrates signals of two motion sensors after calibration with data collected along a sample trajectory.
Fig. 7b illustrates signals of two motion sensors after calibration with data collected only along selected trajectory segments of a reference path.
Detailed description

In the following disclosure, the following definitions are used:

Autonomous land vehicle: A land vehicle that can travel without human input.
Autonomous land vehicles typically use vehicle sensors to sense their surroundings.
Autonomous driving: Assisted driving at different levels, from driver assistance such as cruise control, where the land vehicle only controls accelerating/decelerating, up to full driving automation, where the land vehicle performs all driving tasks without driver interaction.

ADAS: Advanced Driver-Assistance System, where the land vehicle can only control steering and accelerating/decelerating without driver interaction.
Calibration information: Calibration information selects the sensor data that is useful for calibration. Typically, calibration information selects the sensor data that excites the different degrees of freedom (DoF) of the vehicle sensors and therefore contributes to system observability.
Camera: A camera, or image sensor, produces images of the surroundings by detecting light emitted from the surroundings on a photosensitive surface (image plane) using a camera lens. A camera is a passive sensor.
Collocated sensors: Sensors placed as close as possible to each other, for example placed side by side, together, or adjacent. The relation would typically be a rigid relation. The relation may, e.g., be retrieved from a manufacturer as known extrinsic parameters. The relation may also be established with known calibration routines, such as "Kalibr", which is commonly used for IMU-to-camera calibration, see for example Paul Furgale, Joern Rehder, Roland Siegwart, "Unified Temporal and Spatial Calibration for Multi-Sensor Systems", in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Tokyo, Japan (2013). For example, a motion sensor may be collocated with a vehicle sensor. In other words, collocated sensors are typically positioned such that the sensor signals produced vary in the same, or a similar, way as the land vehicle moves.
Extrinsic calibration: Extrinsic calibration aims to obtain the extrinsic parameters that define the geometric relationship, that is, the rotation matrix and translation vector, between two coordinate frames.
Extrinsic parameters: The rotation matrix R and translation vector t between two coordinate frames. The extrinsic parameters typically include six DoF.
Frame: A coordinate frame with an origin and three orthogonal axes, typically referred to as the x, y and z-axes. May also be referred to as a coordinate system.

LIDAR: Light detection and ranging. Lidar is a remote sensing technique that produces infrared or laser light pulses that reflect off target objects. The LIDAR detects these reflections, and the time between emission and reception of the light pulse allows for distance estimation. A LIDAR is an active sensor.
Motion primitives: Pre-computed motions or manoeuvres that the vehicle can perform.
Motion sensor: A sensor configured to sense motion of the sensor, typically angular velocity and linear acceleration. Angular velocity is for example measured as rotational movement about one to three perpendicular axes, i.e., as roll, pitch and/or yaw. Translational movement is for example measured in one to three perpendicular axes, i.e., as surge, heave and sway. The motion sensor is an internal state sensor, also known as a proprioceptive sensor, that records the dynamical state of a dynamic system. The motion sensor includes, for example, one or more gyroscopes for measuring angular velocity and one or more accelerometers for measuring force and/or acceleration. In some embodiments, the gyroscope is a 3-axis gyroscope measuring angular velocity in three degrees of freedom, DoF. In some embodiments, the accelerometer is a 3-axis accelerometer measuring force and/or acceleration in three DoF. A motion sensor with a 3-axis gyroscope and a 3-axis accelerometer may be referred to as a 6-axis motion sensor. One example of a motion sensor is an Inertial Measurement Unit (IMU). Hence, the motion sensor may be a 6-axis IMU. The 6-axis IMU has 6 DoF. In a particular example, the motion sensor is collocated and rigidly attached to the vehicle sensor, or rigidly mounted along with the vehicle sensor. In some embodiments, angular velocity components can be estimated based on optical flow vectors from the vehicle sensor. Hence, the motion sensor is then embedded in, or even a part of, the vehicle sensor. For newer generations of sensors, like event cameras, it is possible to estimate the angular velocity seamlessly.
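Selecting trajectory segments by checking which of these six motion-sensor channels a manoeuvre excites, as described in the summary, can be sketched as follows. This is a simplified stand-in: the per-channel range threshold and the greedy coverage rule are hypothetical illustration choices, not the criterion specified by the disclosure:

```python
import numpy as np

def excited_dofs(imu_window, threshold=0.5):
    """Return the set of IMU channels (0-2: angular velocity, 3-5: linear
    acceleration) whose peak-to-peak variation in the window exceeds a
    threshold, i.e. the degrees of freedom this segment excites."""
    spread = imu_window.max(axis=0) - imu_window.min(axis=0)
    return {i for i in range(imu_window.shape[1]) if spread[i] > threshold}

def select_segments(segments, n_dofs=6, threshold=0.5):
    """Greedily pick trajectory segments until every degree of freedom has
    been excited at least once; returns (chosen indices, covered DoF)."""
    covered, chosen = set(), []
    for idx, window in enumerate(segments):
        new = excited_dofs(window, threshold) - covered
        if new:
            chosen.append(idx)
            covered |= new
        if len(covered) == n_dofs:
            break
    return chosen, covered
```

Segments that excite no new degree of freedom are skipped, which mirrors the idea of keeping only the manoeuvres that add calibration information.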
Observability: A measure of how well internal states of a system can be inferred from knowledge of its external outputs. In this context, observability refers to how well the extrinsic calibration information can be recovered by matching sensor poses.
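A minimal numerical sketch of such an observability check, in the spirit of the singular value decomposition mentioned for Fig. 3: if the smallest singular value of the information matrix is near zero, at least one extrinsic parameter is unconstrained. The 6x6 matrix and the threshold below are synthetic illustration values; a real system would build the matrix from measurement Jacobians:

```python
import numpy as np

def is_observable(information_matrix, min_singular_value=1e-3):
    """All six extrinsic DoF are observable only if no singular value of
    the 6x6 information matrix is close to zero."""
    s = np.linalg.svd(information_matrix, compute_uv=False)
    return bool(s.min() > min_singular_value)
```

For example, an identity information matrix passes the check, while a matrix with one zero on its diagonal (one unexcited degree of freedom) fails it.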
Online calibration: Onboard computing in real-time, or at least via data processing in real-time, e.g., in the cloud. The calibration is performed without any human intervention, e.g., automatically. For example, online calibration is performed on the fly directly in the vehicle itself without manual intervention, typically when some driving manoeuvres are performed with the vehicle to create rich sensor data. No external means are needed, like fiducial markers or similar.
Pose: A pose of an object comprises the position and orientation of the object. If coordinate frames are attached to the object and another object, the geometric relationship between these two coordinate frames may be represented by a transformation matrix including a rotation matrix R and a translation vector t. The rotation matrix R describes the orientation of one coordinate frame relative to the other coordinate frame. The translation vector t describes the position of the origin of one coordinate frame with respect to the other coordinate frame. Hence, a pose comprises a rotational component and a translational component. The pose includes, for example, six Degrees of Freedom (DoF).
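As a purely illustrative sketch (not part of the claimed method), a pose can be represented as a 4x4 homogeneous transformation built from the rotation matrix R and translation vector t; the function names below are hypothetical:

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transformation from a 3x3 rotation matrix R
    and a 3-vector translation t (3 rotational + 3 translational DoF)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def transform_point(T, p):
    """Express a point p, given in one coordinate frame, in the other frame."""
    return (T @ np.append(p, 1.0))[:3]
```

For example, with R a 90-degree rotation about the z-axis and t = (1, 0, 0), the point (1, 0, 0) maps to (1, 1, 0).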
RADAR: Radio detection and ranging. A RADAR emits electromagnetic (EM) waves within a region of interest and receives reflections from targets for signal processing and range information. It can be used to determine the relative speed and position of identified obstacles using the Doppler property of the EM waves. A RADAR is an active sensor.
Reference frame: a coordinate frame that is known in relation to a known coordinate frame, such as the world coordinate frame, i.e., a world coordinate system. The reference frame may be a frame of the land vehicle or, more specifically, the land vehicle platform. The reference frame may in such case be referred to as a body frame or land vehicle base frame, B. Hence, the reference frame is for example a base frame of the land vehicle, which is typically located in the center of the land vehicle rear axis. Alternatively, the reference frame may be a frame of another sensor. The reference frame may then be referred to as a sensor frame.
Trajectory: a path that the vehicle follows during the calibration. The trajectory comprises the position and orientation of the vehicle.
Vehicle sensor: a sensor used in a land vehicle for perception purposes, for example a LIDAR, a RADAR, a camera or an ultrasonic sensor. The vehicle sensor is a kind of exteroceptive sensor, or external state sensor, that perceives and gathers information from the land vehicle's environment. A vehicle sensor may also be referred to as an autonomous sensor as it is used by an autonomous land vehicle in autonomous driving.
Ultrasonic sensor: an ultrasonic sensor measures the distance to a target object by emitting ultrasonic sound waves and converting the reflected sound into an electric signal.
The proposed technique is based on the notion of "calibration information", which is similar to what is proposed by Lv, Jiajun, et al. in the paper "Observability-Aware Intrinsic and Extrinsic Calibration of LiDAR-IMU Systems", IEEE Transactions on Robotics (2022). Calibration information is basically sensor data that contributes to the sensor calibration. The referenced paper proposes a selection policy which selects only the most informative segments of a trajectory for calibration, which significantly improves the calibration efficiency by processing only the selected informative segments. However, this method is not suitable when calibration is performed in a limited area (such as in a workshop), as the optimal segments are selected after data collection.
The inventor of the present invention has realized that the notion of "calibration information" can be used to create a motion planner for workshops that plans a set of manoeuvres for a land vehicle to perform to calibrate its sensors. More specifically, a method suitable for use in workshops is herein proposed. The method can generate a set of manoeuvres for a land vehicle to perform to calibrate its sensors within the limited area of the workshop. The proposed technique is based on the idea that a selection policy which considers informative trajectory segments can also be used as input to motion planning. In other words, when calibration is performed in a limited area, a calibration trajectory can be determined beforehand, such that the calibration trajectory stays within a limited area, such as a workshop, and also includes segments that provide enough calibration information to perform calibration with the required accuracy. A further requirement on the trajectory is that it is time efficient, as it is typically desirable to perform the calibration as quickly as possible.
For better understanding of the proposed technique, extrinsic calibration of vehicle sensors will first be described with reference to Fig. 1. Thereafter, the notion of calibration information is illustrated in Fig. 2. The proposed technique for generating a calibration trajectory is then described with reference to Figs. 3-7.
Calibration of vehicle sensors is needed to perform advanced land vehicle operations such as driver assistance and autonomous driving. Data from a multitude of different modalities of sensors, such as cameras, LIDARs, RADARs and others, are needed to create an accurate and reliable view of the surroundings of the land vehicle. Different modalities of sensors often need different types of calibration techniques, making the calibration complicated and time consuming. Some calibration techniques require sensor state estimations, making the calibration computationally demanding and sometimes slow. It is a goal of sensor calibration to determine how different sensors within the land vehicle are geometrically related to each other with respect to a base frame, see Fig. 1 described below. Knowing the exact location of the vehicle sensors 11 of the land vehicle 1 is a prerequisite to obtain good perception estimates in a common reference frame.
Fig. 1 illustrates a top view of an example land vehicle 1. The land vehicle 1 is for example a car, bus, truck, or other kind of land vehicle. Any such land vehicle 1 comprises a body and chassis, engine/motor parts, drive transmission, steering parts, suspension and brake parts, and electrical parts. The body may be integrated with the chassis, or the body and the chassis may be different parts that are fixed to each other, e.g., with bolts. The chassis is the main mounting for all parts of the land vehicle and comprises a steel frame. The chassis further comprises a front axis 12 and a rear axis 13 depicted in Fig. 1. The body is typically made of metal and/or fiber glass. The electrical parts comprise an electronic system including electronic control units (ECUs). It should be understood that the land vehicle 1 in Fig. 1 is an example only, and that a land vehicle in this disclosure may have, e.g., more axes and other means not illustrated herein.
The land vehicle 1 further comprises a plurality of sensors and a control arrangement 10. The sensors are used to monitor different functions and states of the land vehicle, to provide information to the driver or to different systems of the land vehicle 1. Each sensor converts sensed events or changes of a property into a signal or data that is sent to, or collected by, the control arrangement 10. Such signals or data are, for example, sent over a CAN (Controller Area Network), or similar, of land vehicle 1.
The plurality of sensors comprises vehicle sensors 11 for monitoring the surroundings of the land vehicle 1. The output from the vehicle sensors 11 may provide input to an autonomous control system of the control arrangement 10, for use in autonomous driving. These vehicle sensors 11 need to be calibrated to provide an accurate perception of the environment. The plurality of sensors also comprises motion sensors ms1, ms2.
As illustrated in Fig. 1, the land vehicle base frame B is located in the centre of the rear axis 13 of land vehicle 1. Sensor readings from vehicle sensors 11 such as LIDARs, cameras, RADARs, and motion sensors ms1, ms2, are represented in their respective sensor frames as L(k), C(k), R(k) and I(k), respectively, where k denotes the index of the corresponding sensor used. Each frame is illustrated with three orthogonal axes in Fig. 1. The example land vehicle 1 in Fig. 1 comprises two LIDARs and two cameras with collocated motion sensors attached to a front part of the land vehicle 1, and two LIDARs and one RADAR with collocated motion sensors attached to a rear part of the land vehicle 1. A motion sensor is also attached to the centre of the rear axis 13 at the base frame B. The first motion sensor ms1 is collocated with a vehicle sensor 11. Hence, the first motion sensor ms1 is placed side by side, together, or in a particular relation to the vehicle sensor 11. In some embodiments, the first motion sensor ms1 is embedded in the vehicle sensor 11, which means that they can be handled as one unit. In particular, the first motion sensor ms1 may be a part of the vehicle sensor 11, or even the same unit. The first motion sensor ms1 is configured to sense angular velocity w(ms1) and linear acceleration f(ms1). In some embodiments, the first motion sensor ms1 and the vehicle sensor 11 have the same orientation axes and/or have no or very small translation between them.
The second motion sensor ms2 has a known pose in a reference frame of land vehicle 1. Hence, the extrinsic parameters of the second motion sensor ms2 are known. The reference frame is for example the land vehicle base frame B of land vehicle 1, or a sensor frame of another vehicle sensor 11. Hence, the second motion sensor ms2 may be collocated with a vehicle sensor 11. The second motion sensor ms2 is configured to sense angular velocity w(ms2) and linear acceleration f(ms2). The second motion sensor ms2 may in some embodiments be a motion sensor of a GNSS, Global Navigation Satellite System, of land vehicle 1. In some embodiments, the first motion sensor ms1 and the second motion sensor ms2 are Inertial Measurement Units (IMUs).
The control arrangement 10 is configured to calibrate the vehicle sensor 11. Extrinsic calibration of sensors on land vehicles may be performed offline when the land vehicle is standing still, or online while the land vehicle is driving. Offline calibration techniques are often time consuming and require additional hardware such as fiducial markers. Online calibration techniques may be performed onboard by comparing estimated sensor readings (or sensor states) of the sensor to be calibrated to sensor readings (or states) of a sensor in a relevant sensor frame. Online calibration is typically performed in a limited area, such as in a workshop or on a test track. During extrinsic calibration, sensor data is collected. The calibration data must provide sufficient observability to achieve the desired calibration performance. Observability is a measure of how well internal states of a system can be inferred from knowledge of its external outputs, here the sensor readings. In other words, to achieve observability it is required that the sensors are excited during the calibration. Sensor excitation can be achieved in different ways depending on sensor type. For example, an IMU requires certain motion (e.g., certain manoeuvres) to be excited, while LIDARs and image sensors rely on visual and structural content in their environment. However, LIDARs and image sensors also benefit from motion, as sensor data from the movement of the sensors can be used to estimate change in position over time, which is commonly referred to as odometry. When calibrating a land vehicle in a limited area such as a workshop, it may be challenging to achieve the excitation of one or more sensor types in an efficient manner.
Figs. 2a-b illustrate a land vehicle 1 driving in a limited area 2. In Fig. 2a the land vehicle 1 performs a non-informative maneuver (indicated by the arrow), which does not excite the vehicle sensors 11 (hereafter simply called sensors) that require motion to be excited. For example, when driving too straight, one risks not providing the sensors with enough calibration information to allow for a proper calibration.
Fig. 2b illustrates a more informative maneuver (indicated by the arrow) that provides the sensors (e.g., LIDARs) with information that excites them and allows for calibration of the sensors. Hence, if the land vehicle 1 is forced to swerve around, the sensor measurements will be sufficiently excited to allow the calibration to succeed.
The proposed concept of determining a calibration trajectory will now be described with reference to Figs. 3-4. To enable determination of a time-efficient trajectory, informative segments first have to be determined. One way of doing this is to drive a land vehicle along a reference path in a non-limited area (for example on a common road, an open field or similar), while recording sensor data. Fig. 3 illustrates an example sample trajectory 30, or overall trajectory. By analysing the sensor data, trajectory segments that contain a lot of calibration information are identified. In Fig. 3 these trajectory segments are illustrated with a solid line (herein called selected or informative trajectory segments 31), while segments containing an insignificant amount of calibration information are illustrated with a dashed line (non-informative trajectory segments 32). It can be observed that in this example there is typically not much calibration information in the non-informative trajectory segments 32, and they may therefore be ignored for calibration. The method for identifying the informative and non-informative trajectory segments 31, 32 depends on the sensor type. In other words, the amount of calibration information generally depends on the extent to which the sensors are excited. Different sensors can be excited in different ways. For example, for sensors measuring inertial motion, trajectory segments which encapsulate sufficient excitations for different degrees of freedom can be selected.
The proposed technique is based on the idea of utilizing this knowledge when deciding on a calibration trajectory to drive when performing sensor calibration in a limited area. More specifically, as some trajectory segments can be ignored for calibration, it is not required that the land vehicle drives these segments. Instead, it is proposed to generate a calibration trajectory that can fit in the limited area and that at the same time includes the trajectory segments that will provide a lot of calibration information. To do this, certain suitable maneuvers, or motion primitives, have to be identified. A motion primitive is a certain motion that the land vehicle can perform, which is typically defined by a path and a velocity. Motion primitives of the selected trajectory segments can be defined by recording poses of the land vehicle while driving along the reference trajectory 30, either based on state estimates or based on positioning (such as GNSS). In other words, the selected trajectory segments can be used to derive motion primitives that encapsulate a high amount of calibration information for a particular sensor type. More specifically, for each sensor type, a set of motion primitives that together provide sufficient calibration information for calibration with a certain accuracy can be identified. In one example implementation, motion primitives that are rich in calibration information are determined based on calibration data in the form of raw IMU signals, in particular angular velocity. It is assumed that sensor data from an IMU located in the base frame B (Fig. 1) is also available. The goal is then to find motion primitives that capture maximum information for calibration purposes. The primitives can be identified by calculating the maximum information between an IMU signal in the base frame and an IMU signal located close to the sensor to be calibrated. This can be done using a Fisher information matrix, see equation (2) below.
To compute the Fisher information matrix, the residual error between the angular velocities is first minimized (as, in a rigid body, the angular velocities are the same at all points having the same orientation). The residual error between the angular velocities can be defined as:

$$r_k = R^{*}_{IB}\,\omega_{B,k} - \omega_{I,k} \qquad (1)$$

where $R^{*}_{IB}$ is the optimal extrinsic rotation between the two IMUs, $\omega_{B,k}$ is the angular velocity in the base frame and $\omega_{I,k}$ is the angular velocity of the sensor (measured by the IMU of the sensor), for sample $k$. The general principle of solving these equations is to iteratively update the estimate using the Gauss-Newton or Levenberg-Marquardt algorithm and to compute the Fisher information matrix as:

$$M = \sum_k J_k^{T} J_k \qquad (2)$$

where $J_k$ is the Jacobian of the residual $r_k$ with respect to the rotation parameters,

$$J_k = -\left(R^{*}_{IB}\,\omega_{B,k}\right)^{\wedge},$$

with $(\cdot)^{\wedge}$ denoting the skew-symmetric matrix of a vector. Singular value decomposition is performed on the Fisher information matrix, and the batch of motion primitives is chosen if the smallest singular value is greater than a pre-defined observability threshold. Next, an information-matrix based search based on angular velocities is performed. The computations are performed on a batch of data, and the smallest singular value of the Fisher information matrix is compared to a pre-defined threshold. Based on this threshold it is possible to select trajectory segments which encapsulate sufficient excitations for different degrees of freedom. Typically, the calibration data contains most calibration information where there is lateral movement (turns), which excites different degrees of freedom of the sensors (see Fig. 3). Provided that the corresponding poses at the times when the sensors are significantly excited are known, either from state estimates or from GNSS, these poses are recorded and used to generate automatic calibration-aware calibration trajectories using constrained planning.
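The Fisher-information observability check described above can be illustrated with a minimal numerical sketch, assuming unit noise covariance and using the skew-symmetric Jacobian of the angular-velocity residual; all function and variable names are illustrative, not from the patent:

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix such that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def batch_observability(R_IB, w_base, threshold):
    """Decide whether a batch of angular-velocity samples is informative.

    R_IB: (3, 3) current estimate of the extrinsic rotation base -> sensor.
    w_base: (N, 3) angular velocities measured in the base frame.
    Returns (informative, smallest_singular_value).
    """
    M = np.zeros((3, 3))  # Fisher information matrix (unit noise covariance)
    for w in w_base:
        J = -skew(R_IB @ w)  # Jacobian of the residual w.r.t. the rotation
        M += J.T @ J
    s = np.linalg.svd(M, compute_uv=False)
    # Informative only if even the weakest direction is sufficiently excited
    return s[-1] > threshold, s[-1]
```

Rotating about a single axis leaves one direction unexcited (smallest singular value zero), whereas excitation about all three axes makes the batch informative, which is exactly why swerving maneuvers are preferred over straight driving.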
Fig. 4 illustrates what a calibration trajectory 40, created based on selected trajectory segments 31 of a sample trajectory 30, could look like. The calibration trajectory 40 in Fig. 4 encapsulates the selected trajectory segments 31 of the sample trajectory 30 in Fig. 3, which provide a lot of calibration information, but ignores the rest. In this way, calibration can be performed within the limited area 2. The concept of generating the calibration trajectory 40 based on calibration-information-rich motion primitives will now be described in further detail with reference to Fig. 5a.
Fig. 5a illustrates the proposed method for preparing an extrinsic calibration procedure. The extrinsic calibration procedure comprises calibrating one or more sensors 11 arranged on a land vehicle 1 by driving a calibration trajectory 40 within a limited area 2. The calibration trajectory 40 is typically defined before the calibration procedure starts.
The extrinsic calibration procedure is a procedure that aims at obtaining extrinsic parameters that define the geometric relationship, that is, the rotation matrix and translation vector, between two coordinate frames. The method may be applied to various types of sensors, or combinations of sensors. The method is for example suitable for sensors that require certain motion for calibration, such as for calibrating IMUs or LIDARs/cameras (using visual/LIDAR odometry and matching the poses).
Extrinsic sensor calibration can be heavily affected by the environment and the calibration path, as these are factors that affect observability. The proposed technique is based on the idea that knowledge about which maneuvers are suitable for calibrating different types of sensors has been acquired and stored, for example, in a database. More specifically, in an initial step of the method, motion primitives that together provide sufficient observability to achieve calibration performance within certain bounds are determined and stored, as explained above. In other words, in some embodiments, the method comprises determining S0, for one or more sensor types, a set of motion primitives that together provide sufficient observability to achieve calibration performance within the certain bounds. The certain bounds delimit an accuracy interval, or accuracy range, corresponding to an acceptable calibration performance, such as accuracy. The certain bounds (or limits) of accuracy depend on requirements and the sensor type(s) of the one or more sensors. The higher the accuracy, the more motion primitives are typically required to excite the sensors sufficiently.
The determination of motion primitives can for example be performed by land vehicle manufacturers during production. The motion primitives are for example stored in a central database. In other words, the suitable motion primitives are determined prior to the actual calibration being performed. Typically, it just has to be done once and for all. However, if new sensor types are added, then suitable motion primitives for these sensor types have to be added. As motion primitives depend on sensor type, the determination S0 is performed for each sensor type. The motion primitives may be determined in different ways, for example by experiments, simulations and/or based on calculations, e.g., using equations (1)-(2).
For example, motion primitives may be determined by driving a sample trajectory 30. The sample trajectory can be any path, such as a path outside the limited area on a regular road or in another open area, such as the path in Fig. 3. The following steps (S0a-S0c) are then performed for each sensor type, see Fig. 5b.
First, calibration data and onboard sensor data (also referred to as pose data, i.e., any data that can be used to determine a pose) indicative of a vehicle pose are collected S0a while driving the land vehicle 1 along the sample trajectory 30. The calibration data is data used for sensor calibration. The calibration data may be captured by the sensor to be calibrated or by a sensor close to (e.g., collocated with) the sensor to be calibrated. The sensor data indicative of a vehicle pose is any data that can be used to determine the vehicle pose along the sample trajectory 30, such as inertial motion data and/or position data (e.g., GNSS). In some embodiments, the calibration data and the onboard sensor data are the same data. For example, when calibrating an IMU, the IMU data can serve as both calibration data and onboard sensor data. In some embodiments, the onboard sensor data indicative of the vehicle pose is collected by the one or more sensors 11 themselves, or by other sensors arranged on the land vehicle 1. In some embodiments, the onboard sensor data indicative of the vehicle pose is collected by one or more other sensors arranged close to, or collocated with, the one or more sensors 11. For example, the onboard sensor data and/or the calibration data is raw IMU data. In some embodiments, the onboard sensor data and/or the calibration data comprises inertial motion data, in particular angular velocity and/or linear acceleration of the sensor and of the reference frame. If the one or more sensors are IMUs, then observability is typically achieved by causing motion in all (or most) degrees of freedom, DoF. In some embodiments, the inertial motion data comprises angular velocity in three degrees of freedom and linear acceleration in three degrees of freedom, and the selecting S0c comprises determining a set of motion primitives that corresponds to maneuvers causing excitation in the inertial motion data in all, or several, of the DoF.
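One simple way to check which DoF a recorded maneuver excites is to threshold the per-axis variance of the IMU signals. The sketch below rests on that assumption; the function name and threshold value are hypothetical, not taken from the patent:

```python
import numpy as np

def excited_dofs(ang_vel, lin_acc, thresh=0.05):
    """Return boolean masks of the rotational and translational DoF that a
    maneuver excites, based on per-axis signal variance.

    ang_vel: (N, 3) angular velocities, lin_acc: (N, 3) linear accelerations.
    A DoF counts as excited when its variance exceeds `thresh`.
    """
    return np.var(ang_vel, axis=0) > thresh, np.var(lin_acc, axis=0) > thresh
```

A maneuver that only yaws the vehicle, for example, excites only the third rotational DoF, indicating that additional primitives (e.g., pitch- or roll-inducing ones) are still needed.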
It is a prerequisite that the amount of calibration data collected along the sample trajectory 30 is sufficient to perform calibration with a certain accuracy. Stated differently, the sensor data captured provides enough observability to calibrate all of the one or more sensors 11 with a certain accuracy. In other words, the collected calibration data provides sufficient observability to achieve the calibration performance within the certain range.
However, as explained above, not all segments along the sample trajectory 30 contain information that contributes to the calibration (or they contain only very little). In other words, some segments along the sample trajectory 30 are rich in calibration information (solid line in Fig. 3) while others are poor (dashed line in Fig. 3). Hence, in the next step, trajectory segments (solid line) that are useful for calibration, as well as segments that are unusable, are identified. In other words, in the next step a distribution of the calibration data along the sample trajectory 30 is determined S0b. Stated differently, determining a distribution involves analyzing which parts (e.g., segments) of the sample trajectory can provide a significant amount of calibration information, that is, provide a certain amount of observability, see the example in equations (1)-(2) above.
Based on the distribution, the parts of the sample trajectory 30 that contribute most to the calibration can be selected, as explained above in connection with Fig. 4. Hence, the determining S0 involves a process of selecting information-centric calibration poses from the sample trajectory 30. In other words, a last step of the determining S0 comprises selecting S0c, based on the determined distribution and the collected onboard sensor data, a set of motion primitives corresponding to segments of the sample trajectory 30 that together contain an amount of calibration information that is in itself sufficient to achieve the calibration performance within the certain bounds. This selection S0c will depend on the type of sensor. A type of sensor may for example refer either to sensors using different modalities, or to sensors from different manufacturers, as different manufacturers would have different bias and noise characteristics. For example, the determining S0 may be performed based on raw IMU signals as described in connection with Fig. 3 above.
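The selection S0c can be sketched as grouping consecutive windows of the sample trajectory whose observability score (for example, the smallest singular value of a per-window Fisher information matrix) exceeds a threshold. The score source and names are assumptions for illustration:

```python
def select_informative_segments(scores, threshold):
    """Group consecutive window indices whose observability score exceeds
    `threshold` into (start, end) index segments (end exclusive)."""
    segments, start = [], None
    for i, s in enumerate(scores):
        if s > threshold and start is None:
            start = i                      # informative segment begins
        elif s <= threshold and start is not None:
            segments.append((start, i))    # informative segment ends
            start = None
    if start is not None:
        segments.append((start, len(scores)))
    return segments
```

For example, the scores [0, 2, 3, 0, 4] with threshold 1 yield the segments [(1, 3), (4, 5)], i.e., one informative segment over windows 1-2 and one over window 4, analogous to the solid-line segments 31 in Fig. 3.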
To determine the calibration trajectory, the dimensions of the limited area 2 also have to be known. In other words, in some embodiments, the method comprises obtaining S1 boundaries of the limited area 2. The boundaries may be outer boundaries, such as walls, and possibly also internal boundaries of obstacles within the limited area, such as pillars. The boundaries are typically known from a map or similar stored in a data storage, such as on a computer or server, and can therefore be obtained by reading map data from the data storage. Alternatively, the boundaries may be entered via a user interface or by other feasible means.
Depending on the type of the one or more sensors 11 that are to be calibrated, a suitable set of motion primitives is then determined. The set of motion primitives are isolated motions or maneuvers that the land vehicle 1 can make. For example, motion primitives are collected based on the sensor type(s) of the one or more sensors 11.
Each motion primitive is defined by a path and a velocity along the path. In other words, in some embodiments, the method comprises obtaining S2, based on the sensor type of the one or more sensors 11, a set of motion primitives that together provide sufficient observability to achieve calibration performance within the certain bounds. The range defined by the bounds corresponds to a certain level of calibration accuracy, i.e., a predefined calibration performance, which is for example defined by requirements. If the one or more sensors to be calibrated are of different types, then different motion primitives may be required for the respective types. For example, if a land vehicle comprises different types of sensors, then the set of motion primitives will include motion primitives suitable for the different types. In some embodiments, the obtaining S2 comprises retrieving the motion primitives, based on the sensor type, from a database storing, for different types of sensors, sets of motion primitives that together provide sufficient observability to achieve calibration performance within the certain bounds. As mentioned above, the database with motion primitives for each type of sensor has been created beforehand, for example according to step S0. Hence, the obtaining may involve sending a request to the database specifying one or more sensor types. In response to the request, a set of motion primitives that matches the sensor types defined in the request may be received.
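The retrieval in step S2 can be sketched as a keyed lookup into a pre-built store. The data structure, database contents and names below are purely illustrative assumptions, not the patent's implementation:

```python
from dataclasses import dataclass

@dataclass
class MotionPrimitive:
    """A maneuver defined by a path (poses) and a velocity along it."""
    name: str
    path: list        # e.g., list of (x, y, heading) poses
    velocity: float   # m/s along the path

# Hypothetical pre-built database: sensor type -> informative primitives
PRIMITIVE_DB = {
    "imu":   [MotionPrimitive("figure-eight", [], 3.0),
              MotionPrimitive("tight-left-turn", [], 2.0)],
    "lidar": [MotionPrimitive("slow-arc-near-structure", [], 1.5)],
}

def obtain_primitives(sensor_types, db=PRIMITIVE_DB):
    """Step S2 sketch: union of motion primitives for the given sensor types."""
    selected = []
    for sensor_type in sensor_types:
        selected.extend(db.get(sensor_type, []))
    return selected
```

A vehicle carrying both IMUs and LIDARs would thus receive the union of both primitive sets, mirroring the statement that mixed sensor types require motion primitives for each respective type.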
The land vehicle 1 may also comprise sensors that benefit from visual or structural data for accurate calibration. For example, to improve the calibration quality it is beneficial that LIDARs or cameras see the same thing when it is in their common field of view. For example, image sensors may require a feature-rich texture, while RADARs or LIDARs may require a structure- or feature-rich environment. In a limited area there are typically certain areas suitable for visual or structural calibration. These may be areas having naturally occurring texture or structure, such as doors, corners etc. in a workshop. It may also be a wall with a suitable pattern. To further improve calibration accuracy, such segments can be identified and included in the calibration trajectory 40 for vehicles using this type of sensor. In other words, in some embodiments, the method comprises obtaining S3 one or more trajectory segments within the limited area 2 that can provide visual or structural calibration information suitable for calibration of the sensor type.
These trajectory segments can for example be identified by driving around in the limited area 2. In other words, in some embodiments, the obtaining S3 comprises obtaining S3a vision data corresponding to visual data collected by the one or more sensors 11 when driving around with the land vehicle 1 in the limited area 2, and identifying S3b, based on the obtained vision data, segments of the trajectory including visual content corresponding to calibration information suitable for calibration of the sensor type. The identified segments are typically segments that excite the sensors and thereby provide observability.
The trajectory segments that can provide visual or structural information may also be identified by simulation in a 3D environment. Assume the existence of a somewhat accurate computer 3D model of either a workshop and the driving area around it, or alternatively of a test track. Based on such a 3D model, one can figure out what kind of maneuvers would convey the most information for the calibration of different sensors. As an example, LIDARs would benefit from driving around, and close by, structures with rich geometry patterns (with LIDAR, the "calibration information" is richer the richer the geometry of the surfaces it sees). It should be noted that the structures may be structures or patterns that naturally exist in a test environment, such as corners or doors. In other words, in some embodiments, the method comprises obtaining S3b the vision data based on a simulation of the land vehicle 1 driving in the limited area 2. Thus, assuming the existence of such a 3D model, and given the objective of calibrating LIDARs and/or cameras, one can determine, via computer-automated methods, where and how to drive in order to ensure that these sensors get rich "calibration information".
When the set of motion primitives has been identified, a calibration trajectory 40 that includes the determined set of motion primitives can be created. In other words, the calibration trajectory 40 is generated to include, or cover, all primitives of the determined set. This may be done in different ways, basically by stitching the motion primitives of the set to each other in a suitable manner until an optimal solution is achieved. In other words, the method comprises generating S4 a calibration trajectory 40 within boundaries of the limited area such that it includes the set of motion primitives determined to together provide sufficient observability to achieve calibration performance within certain bounds, while optimising the time efficiency of driving the calibration trajectory 40. Hence, the generating involves finding the fastest (e.g., shortest) path that can fit into the limited area and which includes all the motion primitives of the determined set. The generated calibration trajectory is for example an open loop, or a closed loop, that the land vehicle shall drive in the limited area. The calibration trajectory is not necessarily one single loop, but may comprise several laps and may also involve reversing etc. In other words, the calibration trajectory may have any shape feasible to drive within the limited area. The calibration trajectory defines the path and velocity to drive (including reverse driving) during the calibration procedure. If the land vehicle comprises sensors 11 that depend on visual and/or structural content in the environment, and suitable segments have been identified (see step S3), then these trajectory segments shall also be included in the calibration trajectory 40. In other words, in some embodiments, the generating S4 comprises including the obtained one or more trajectory segments in the calibration trajectory 40.
The generating S4 may involve solving an optimization problem, i.e. finding the best (i.e. fastest or shortest) calibration trajectory, from all feasible solutions, which fulfils the requirements. In other words, in some embodiments, the generating S4 comprises solving an optimization problem comprising ensuring that the generated calibration trajectory 40 includes the set of motion primitives and stays within the boundaries of the limited area, while minimizing a cost parameter defining the time required to drive the calibration trajectory 40.
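One way to picture this optimization is a brute-force search over orderings of the required primitives, keeping only orderings that fit the limited area and minimizing total drive time. This is a toy sketch, not the disclosed optimizer: the primitive list, durations, space requirements, and the fixed stitching cost are all invented for illustration.

```python
from itertools import permutations

# Each motion primitive: (name, duration_s, space_required_m)
PRIMITIVES = [("accelerate", 4.0, 30.0), ("figure_eight", 12.0, 15.0),
              ("reverse", 6.0, 10.0)]

def transition_cost(a, b):
    """Hypothetical stitching cost (seconds) between two primitives."""
    return 2.0 if a != b else 0.0

def generate_calibration_trajectory(primitives, area_length_m):
    """Search primitive orderings: discard any ordering whose space
    demand violates the limited-area boundary, return the fastest one."""
    best, best_time = None, float("inf")
    for order in permutations(primitives):
        if max(p[2] for p in order) > area_length_m:
            continue  # boundary constraint violated
        t = sum(p[1] for p in order)
        t += sum(transition_cost(a[0], b[0]) for a, b in zip(order, order[1:]))
        if t < best_time:
            best, best_time = order, t
    return best, best_time
```

A real implementation would replace the exhaustive search with a motion planner, but the structure of the cost (drive time plus stitching) and the hard constraints (all primitives included, stay within the area) are the same.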
To improve efficiency, various algorithms may be utilised for the generating S4. For example, motion planners can be used to determine a path while avoiding obstacles. A motion planner is an algorithm which automatically, based on a vehicle model, plans the route (also called trajectory) that a vehicle shall travel to get from point A to point B.
A classic motion planner formulation may be defined by the following criteria:

| minimize | path length |
|---|---|
| subject to | vehicle dynamics |
| | collision free |
| | arrive at goal pose |

Table 1. Classic motion planner

In this case, the motion planner has the objective of finding a path with minimum length. This path needs to respect the vehicle dynamics, as well as being collision free. Finally, it is a requirement that the path makes the land vehicle arrive at the goal pose. Motion planners are commonly known and are therefore not described in further detail. It is possible to program motion planners to handle various constraints or to solve various problems. Hence, a motion planner can be reconfigured to determine a path for a particular land vehicle within the limited area, ensuring that the generated calibration trajectory 40 includes the set of motion primitives and stays within the boundaries of the limited area, while minimizing a cost parameter defining the time required to drive the path. In other words, in some embodiments, the generating S4 comprises using a motion planner and a land vehicle model.
A possible motion planner formulation that considers the "calibration information" may be defined by the following criteria:

| minimize | path length |
|---|---|
| subject to | use calibration-rich motion primitives |
| | collision free |
| | arrive at goal pose |

Table 2. Proposed motion planner

Hence, the proposed motion planner differs from the previous formulation in the replacement of "vehicle dynamics" with "use calibration-rich motion primitives". Since the planner will now only use motion primitives that are rich in "calibration information", the planned paths will by design be better suited for calibrating sensors.
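The key difference between the two formulations can be illustrated with a deliberately simplified 1D planner. This is not the patented planner: the primitive library, the information scores, the admissibility threshold, and the greedy longest-first selection are all assumptions made for the example; a real planner would search over poses with collision checking.

```python
# Each primitive: (name, displacement_m, calibration_info in [0, 1])
LIBRARY = [("straight", 5.0, 0.1), ("slalom", 4.0, 0.9),
           ("arc_left", 3.0, 0.8), ("arc_right", 3.0, 0.8)]

def plan(goal_distance_m, min_info=0.5):
    """Greedy sketch of the proposed planner (Table 2): only primitives
    rich in calibration information are admissible, while still trying
    to minimise path length (prefer larger displacement per primitive)."""
    admissible = [p for p in LIBRARY if p[2] >= min_info]
    admissible.sort(key=lambda p: -p[1])  # longest displacement first
    path, covered = [], 0.0
    while covered < goal_distance_m:
        # longest admissible primitive that does not overshoot the goal,
        # falling back to the shortest admissible one if all overshoot
        step = next((p for p in admissible
                     if covered + p[1] <= goal_distance_m),
                    admissible[-1])
        path.append(step[0])
        covered += step[1]
    return path
```

With `min_info=0.0` the same routine behaves like the classic planner of Table 1 and would happily pick the uninformative "straight" primitive; raising the threshold reproduces the Table 2 behaviour.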
At any time during the driving using the proposed motion planner (Table 2), the system might realize that the calibration has been successful. At that point the land vehicle will decide e.g. to return to its initial pose, or simply to revert to a normal motion planner, i.e. the default motion planner (Table 1) that is not focused on maximizing calibration information.
When the calibration trajectory has been determined, it is used during a calibration procedure. If the method is performed by a control arrangement of the land vehicle 1, then data defining the calibration trajectory 40 may be used directly. For example, data defining the trajectory is sent to an autonomous land vehicle as autonomous driving data. Alternatively, the path is provided to a driver or operator of a land vehicle. Hence, the providing may also involve presenting the trajectory on a display or on the surface on which the land vehicle drives. In other words, the method further comprises providing S5 the generated calibration trajectory 40 for the land vehicle 1 to drive during the calibration procedure.
Once the calibration trajectory 40 has been provided, the calibration procedure can be started. During the calibration procedure the land vehicle 1 drives along the calibration trajectory 40 while collecting data using the one or more sensors. Extrinsic calibration can thereafter be performed using any available method, which typically involves comparing sensor readings of a sensor to be calibrated with sensor readings of another sensor, such as a sensor in a known reference frame. In other words, in some embodiments, the method comprises controlling S6 the land vehicle to drive along the provided calibration trajectory 40.
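As a much-simplified illustration of comparing readings of two sensors, consider estimating only the yaw (planar rotation) between a sensor and a reference sensor from paired planar velocity vectors. This is a standard least-squares alignment, not the disclosure's calibration method, and real extrinsic calibration solves the full 6-DoF transform; the function name and data layout are assumptions for the sketch.

```python
import math

def estimate_yaw_offset(ref_readings, sensor_readings):
    """Least-squares yaw between two sensors from paired 2D vectors.

    If each sensor vector is the reference vector rotated by theta,
    then theta = atan2(sum of cross products, sum of dot products).
    """
    s_cross = sum(rx * sy - ry * sx
                  for (rx, ry), (sx, sy) in zip(ref_readings, sensor_readings))
    s_dot = sum(rx * sx + ry * sy
                for (rx, ry), (sx, sy) in zip(ref_readings, sensor_readings))
    return math.atan2(s_cross, s_dot)
```

Note that the estimate is only well conditioned if the collected vectors span different directions, which is exactly why the calibration trajectory must contain sufficiently varied motion primitives.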
The proposed technique of calibrating with only rich motion primitives has been verified through field experiments on vehicle sensor data. Fig. 7a illustrates how signals in six DoF of two motion sensors match after calibration based on all motion primitives of a sample trajectory. More specifically, Fig. 7a illustrates calibrated accelerations (Acc-x, Acc-y, Acc-z) and angular velocities (ωx, ωy, ωz) of the front left top (FLT) lidar and front right top (FRT) lidar in a common reference frame. Fig. 7b illustrates the same signals after calibration with data collected only along selected trajectory segments of a sample trajectory. From the figures it can be noted that the calibration results with selected data show similar performance to the results with all data.
Fig. 6 illustrates a control arrangement 100 configured to perform the proposed method of preparing the extrinsic calibration procedure. The control arrangement 100 may be arranged in the land vehicle 1, such as in the control arrangement 10 (Fig. 1). However, in some embodiments the control arrangement 100 is external to the land vehicle 1.
The control arrangement 100 comprises control circuitry to perform the method according to any one of the steps, examples or embodiments as described herein. More in detail, the control arrangement 100 comprises processing circuitry 101 and memory 102. The processing circuitry 101 comprises one or more processors. The control arrangement 100 may include one or more Electronic Control Units (ECUs). The computer-readable memory is for example one or more of the memories in the control arrangement 100. Hence, the proposed method may be implemented as a computer program. The computer program then comprises instructions which, when the computer program is executed by a computer, cause the computer to carry out the method according to any one of the aspects, embodiments or examples as described herein.
More specifically, the control arrangement 100 is configured to generate a calibration trajectory within boundaries of the limited area such that it includes a set of motion primitives determined to together provide sufficient observability to achieve calibration performance within certain bounds, while optimising time efficiency of driving the calibration trajectory. The control arrangement 100 is also configured to provide the generated calibration trajectory for the land vehicle 1 to drive during the calibration procedure. In further embodiments, the control arrangement is configured to perform the method according to any one of the embodiments described in connection with Fig. 5a.
The terminology used in the description of the embodiments as illustrated in the accompanying drawings is not intended to be limiting of the described method, control arrangement or computer program. Various changes, substitutions and/or alterations may be made without departing from embodiments of the disclosure as defined by the appended claims.
The term "or" as used herein is to be interpreted as a mathematical OR, i.e. as an inclusive disjunction, not as a mathematical exclusive OR (XOR), unless expressly stated otherwise. In addition, the singular forms "a", "an" and "the" are to be interpreted as "at least one", thus also possibly comprising a plurality of entities of the same kind, unless expressly stated otherwise. It will be further understood that the terms "includes", "comprises", "including" and/or "comprising" specify the presence of stated features, actions, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, actions, integers, steps, operations, elements, components, and/or groups thereof. A single unit such as e.g. a processor may fulfil the functions of several items recited in the claims.
The present disclosure is not limited to the above-described preferred embodiments. Various alternatives, modifications and equivalents may be used. Therefore, the above embodiments should not be taken as limiting the scope of the disclosure, which is defined by the appended claims.
Claims (15)
1. A method for preparing an extrinsic calibration procedure, the calibration procedure comprising calibrating one or more sensors (11) arranged on a land vehicle (1) by driving a calibration trajectory within a limited area (2), the method comprising: - generating (S4) a calibration trajectory within boundaries of the limited area such that it includes a set of motion primitives determined to together provide sufficient observability to achieve calibration performance within certain bounds, while optimising time efficiency of driving the calibration trajectory, and - providing (S5) the generated calibration trajectory for the land vehicle (1) to drive during the calibration procedure.
2. The method according to claim 1, wherein the generating (S4) comprises solving an optimization problem comprising ensuring that the generated calibration trajectory includes the set of motion primitives and stays within the boundaries of the limited area, while minimizing a cost parameter defining time required to drive the calibration trajectory.
3. The method according to claim 1 or 2, wherein the generating comprises using a motion planner and a vehicle model.
4. The method according to any one of the preceding claims, wherein the method comprises: - determining (S0), for one or more sensor types, a set of motion primitives that together provide sufficient observability to achieve calibration performance within the certain bounds.
5. The method according to claim 4, wherein the determining (S0) comprises, for each sensor type: - collecting (S0a), while driving the land vehicle (1) along a sample trajectory (30), calibration data and onboard sensor data indicative of a vehicle pose, wherein the collected calibration data provide sufficient observability to achieve the calibration performance within the certain bounds, - determining (S0b) a distribution of calibration data along the sample trajectory (30), - selecting (S0c), based on the determined distribution and the collected onboard sensor data, a set of motion primitives corresponding to segments of the sample trajectory (30) that together contain an amount of calibration information that is itself sufficient to achieve the calibration performance within the certain bounds.
6. The method according to any one of the preceding claims, comprising: - obtaining (S3) one or more trajectory segments within the limited area (2) that can provide visual or structural calibration information suitable for calibration of the sensor type, and wherein the generating (S4) comprises including the obtained one or more trajectory segments in the calibration trajectory.
7. The method according to claim 6, wherein the obtaining (S3) comprises: - obtaining (S3a) vision data corresponding to visual data collected by the one or more sensors (11) when driving around with the land vehicle (1) in the limited area (2), and - identifying (S3b), based on the obtained vision data, segments of the calibration trajectory including visual content corresponding to information suitable for calibration of the sensor type.
8. The method according to claim 7, comprising obtaining (S3b) the vision data based on a simulation of the land vehicle (1) driving in the limited area (2).
9. The method according to any one of the preceding claims, wherein the method comprises controlling (S6) the land vehicle to drive along the provided calibration trajectory.
10. The method according to any one of the preceding claims, wherein the one or more sensors (11) comprises one or more of cameras, lidars and radars.
11. A computer program comprising instructions which, when the computer program is executed by a computer, cause the computer to carry out the method according to any one of the preceding claims.
12. A computer-readable medium comprising instructions which, when executed by a computer, cause the computer to carry out the method according to any one of the claims 1 to
13. A control arrangement (100) configured to prepare an extrinsic calibration procedure, the calibration procedure comprising calibrating one or more sensors (11) arranged on a land vehicle (1) by driving a calibration trajectory within a limited area (2), wherein the control arrangement is configured to: - generate a calibration trajectory within boundaries of the limited area such that it includes a set of motion primitives determined to together provide sufficient observability to achieve calibration performance within certain bounds, while optimising time efficiency of driving the calibration trajectory, and - provide the generated calibration trajectory for the land vehicle (1) to drive during the calibration procedure.
14. The control arrangement (100) according to claim 13, wherein the control arrangement is configured to perform the method according to any one of claims 1-
15. A vehicle comprising the control arrangement (100) according to claim 13 or 14.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| SE2350556A SE2350556A1 (en) | 2023-05-08 | 2023-05-08 | A method for preparing an extrinsic calibration procedure |
| PCT/SE2024/050375 WO2024232798A1 (en) | 2023-05-08 | 2024-04-17 | A method for preparing an extrinsic calibration procedure |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| SE2350556A SE2350556A1 (en) | 2023-05-08 | 2023-05-08 | A method for preparing an extrinsic calibration procedure |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| SE2350556A1 true SE2350556A1 (en) | 2024-11-09 |
Family
ID=90826466
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| SE2350556A SE2350556A1 (en) | 2023-05-08 | 2023-05-08 | A method for preparing an extrinsic calibration procedure |
Country Status (2)
| Country | Link |
|---|---|
| SE (1) | SE2350556A1 (en) |
| WO (1) | WO2024232798A1 (en) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20190293756A1 (en) * | 2018-03-21 | 2019-09-26 | Zoox, Inc. | Sensor calibration |
| US20200200566A1 (en) * | 2018-12-20 | 2020-06-25 | Samsung Electronics Co., Ltd. | Vehicle driving control apparatus and calibration method performed by the vehicle driving control apparatus |
| US20210286351A1 (en) * | 2014-11-07 | 2021-09-16 | Clearpath Robotics Inc. | Systems and methods for unmanned vehicles having self-calibrating sensors and actuators |
| GB2599175A (en) * | 2020-09-25 | 2022-03-30 | Motional Ad Llc | AV path planning with calibration information |
- 2023-05-08: SE application SE2350556A, published as SE2350556A1 (status unknown)
- 2024-04-17: WO application PCT/SE2024/050375, published as WO2024232798A1 (active, pending)
Also Published As
| Publication number | Publication date |
|---|---|
| WO2024232798A1 (en) | 2024-11-14 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN111220153B (en) | Localization method based on visual topology nodes and inertial navigation | |
| US11908164B2 (en) | Automatic extrinsic calibration using sensed data as a target | |
| US8467612B2 (en) | System and methods for navigation using corresponding line features | |
| Motlagh et al. | Position Estimation for Drones based on Visual SLAM and IMU in GPS-denied Environment | |
| CN119311025B (en) | Flight control method and system for unmanned aerial vehicle tracking technology | |
| US20100265327A1 (en) | System for recording Surroundings | |
| US20240271941A1 (en) | Drive device, vehicle, and method for automated driving and/or assisted driving | |
| JP6988873B2 (en) | Position estimation device and computer program for position estimation | |
| CN109141411A (en) | Localization method, positioning device, mobile robot and storage medium | |
| CN116794640B (en) | A LiDAR-GPS/IMU self-calibration method for movable carriers | |
| JP7409037B2 (en) | Estimation device, estimation method, estimation program | |
| CN116728410A (en) | A method of error compensation for robot absolute positioning accuracy in a narrow working environment | |
| CN113741550A (en) | Mobile robot following method and system | |
| JP2009110249A (en) | Travel route determination map creation device and travel route determination map creation method for autonomous mobile body | |
| US11865724B2 (en) | Movement control method, mobile machine and non-transitory computer readable storage medium | |
| WO2022232747A1 (en) | Systems and methods for producing amodal cuboids | |
| SE2350556A1 (en) | A method for preparing an extrinsic calibration procedure | |
| JP2019191888A (en) | Unmanned flying object, unmanned flying method and unmanned flying program | |
| EP4269044B1 (en) | Slope location correction method and apparatus, robot and readable storage medium | |
| CN116559888A (en) | Indoor positioning method, device, electronic equipment and storage medium for robot | |
| CN120576781B (en) | Positioning and path planning method and system for unmanned vehicles in park | |
| CN114648577A (en) | Equipment detection method and equipment detection system | |
| US20240199065A1 (en) | Systems and methods for generating a training set for a neural network configured to generate candidate trajectories for an autonomous vehicle | |
| US20240203130A1 (en) | Systems and methods for detecting and tracking objects in an environment of an autonomous vehicle | |
| CN113777615B (en) | Positioning method and system of indoor robot and cleaning robot |