WO2020160861A1 - Calibration of a sensor for a vehicle based on object-side and image-side identification indices of a reference object - Google Patents
Calibration of a sensor for a vehicle based on object-side and image-side identification indices of a reference object
- Publication number
- WO2020160861A1 (PCT/EP2020/050343)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- sensor
- indices
- index
- reference parts
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/40—Means for monitoring or calibrating
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/52004—Means for monitoring or calibrating
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Definitions
- the present invention relates to a device for calibrating a sensor for a vehicle according to claim 1. Furthermore, the invention relates to a sensor device for detecting the surroundings of a vehicle according to claim 13. The invention also relates to the use of such a device or such a sensor device in a vehicle according to claim 14. The invention also relates to a method for calibrating a sensor for a vehicle according to claim 15. Finally, the invention relates to a computer program product according to claim 17.
- Sensor systems that include one or more sensors are increasingly being used in vehicles, especially in their driver assistance systems and safety systems, in order to detect the surroundings of the vehicle.
- sensors can, for example, be cameras, radars and/or light detection and ranging devices (hereinafter: lidars).
- such a sensor is used to take pictures of the surroundings of the vehicle and to use an image evaluation to recognize objects that represent a potential obstacle for the vehicle when driving.
- the distance between the vehicle and the object can be determined on the basis of such an object detection in order to take a countermeasure if the distance falls below a critical threshold, for example to brake the vehicle or to trigger a warning signal. In this way, collisions between the vehicle and objects, such as people, buildings, other vehicles or vegetation, can be avoided or at least the risk thereof can be reduced.
- Calibration is a process in which the correlation between an image coordinate system, in which the images recorded by the respective sensor are represented and which is specific to each sensor, and a world coordinate system, which is sensor-independent and generally valid for all sensors, is determined.
- the respective image coordinate system is therefore placed in a transformation relationship with the world coordinate system, based on which each image coordinate can be represented by means of a corresponding world coordinate.
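For a camera-type sensor, this transformation relationship can be illustrated with the standard pinhole model, in which a world coordinate is mapped to an image coordinate via extrinsic parameters (sensor pose) and intrinsic parameters. The following minimal NumPy sketch shows the idea; all numerical values are illustrative assumptions, not values from the patent:

```python
import numpy as np

# World coordinate of a reference point (homogeneous form).
X_world = np.array([2.0, 0.5, 10.0, 1.0])

# Extrinsic parameters: rotation R and translation t of the sensor pose.
R = np.eye(3)
t = np.zeros(3)

# Intrinsic parameters: focal lengths and principal point, in pixels.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

Rt = np.hstack([R, t.reshape(3, 1)])  # 3x4 extrinsic matrix [R|t]
x = K @ Rt @ X_world                  # project into the image plane
u, v = x[:2] / x[2]                   # normalize homogeneous coordinates
print(f"image coordinate: ({u:.1f}, {v:.1f})")
```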
- Calibration devices and methods are known, for example, from WO 2018/000037 A1, which discloses a system and method for determining a camera position within a vehicle scene.
- the method there includes a first step in which an image of the vehicle scene is recorded by the camera.
- the method further comprises a second step in which reference data are loaded which indicate the vehicle scene, the reference data comprising positions and orientations of known features within the vehicle scene.
- the known method also includes a third step in which the geometric appearance of one or more of the known features within the image is identified.
- the known method includes a fourth step in which the three-dimensional position and orientation of the camera relative to the known features that were identified in the third step are determined based on the geometric appearance and a camera position is calculated within the vehicle scene.
- the object of the present invention is therefore to improve the known calibration devices and methods in such a way that their accuracy and reliability are increased.
- the object is achieved by a device for calibrating a sensor for a vehicle having the features of claim 1.
- the object is achieved by a sensor device for detecting the surroundings of a vehicle with the features of claim 13.
- the object is achieved by the use of such a device in a vehicle with the features of claim 14.
- the object is achieved by a method for calibrating a sensor for a vehicle with the features of claim 15.
- the object is achieved by a computer program product with the features of claim 16.
- the sensor, which is preferably attached to the vehicle, is designed to record an image of the reference object.
- the sensor is, for example, a camera, a radar sensor, a lidar sensor, an ultrasonic sensor or another imaging device of the device.
- the sensor can be integrated in a sensor system comprising a plurality of sensors that are, for example, placed in different positions of the vehicle.
- the multiple sensors include a camera, a radar sensor, a lidar sensor, an ultrasound sensor and / or another imaging device.
- the device according to the invention is preferably designed to calibrate the sensor system.
- the image data generated by the sensor contain at least one image of the predefined reference object.
- the reference object has a certain geometric shape, for example a two-dimensional or three-dimensional shape, and / or a certain two-dimensional or three-dimensional pattern.
- the reference object is preferably circular, elliptical, square, hexagonal or sickle-shaped.
- the reference object preferably has a circular, elliptical or square core area which comprises a zigzag-shaped edge area with several corner areas which are each radially delimited by two sides intersecting at a corner point.
- the reference object can be wholly or partially contained in the image (or in the images). This means that the respective images show the reference object completely or partially from a certain perspective of the respective sensor.
- the object-side reference parts are spatially or flatly distributed on the reference object (in the case of a two-dimensional reference object).
- the object-side reference parts are, for example, several reference areas of the surface of the three-dimensional reference object or several reference areas of the two-dimensional reference object.
- at least one of the object-side reference parts can comprise a reference point which is arranged in one of the reference areas. This also includes the case that a reference point is arranged in an edge section of a reference area or on an edge of a reference area. At least one reference point preferably comprises a corner point of the aforementioned zigzag shape.
- the world coordinate index comprises coordinate information of the respective object-side reference parts.
- the coordinate information relates to a world coordinate system.
- the world coordinate system is a reference system that differs from the image coordinate system of the sensor.
- the world coordinate system can be, for example, a coordinate system of the vehicle, the surroundings of the vehicle, the sensor housing or another housing.
- the world coordinate system can be a Cartesian or a spherical coordinate system.
- the world coordinate index includes, for example, coordinate information, in the world coordinate system, of a point located in the respective reference area.
- the world coordinate index includes coordinate information of the reference point in the world coordinate system.
- the object-side identification index is used to specify the object-side reference parts in addition to their world coordinate index.
- the object-side identification index relates to one or more properties of the respective object-side reference parts.
- the object-side identification index can relate to a position, orientation, shape and / or an appearance property of the respective object-side reference parts.
- the world coordinate index and / or the object-side identification index can at least partially belong to the image data.
- the world coordinate index and / or the object-side identification index can belong to a data packet which is separate from the image data and which can be fed to the image processing unit, for example, via the input interface.
- the object-side reference parts are mapped into the image-side reference parts.
- the reference parts on the image side correspond to the reference parts on the object side.
- the image processing unit is designed to detect the predefined reference object, in particular the image-side reference parts, from the image data.
- the object-side reference parts are determined by the design of the reference object.
- the image-side reference parts can be different from image to image depending on the perspective of the sensor used and external parameters such as brightness.
- the image coordinate index comprises coordinate information of the respective reference parts on the image side.
- the coordinate information relates to an image coordinate system.
- the image coordinate system is a reference system relative to which the image points of the image recorded by the sensor, for example pixels and / or voxels, are spatially or flatly defined on an image display surface.
- the coordinate information in the image coordinate system can include one or more coordinates of the image coordinate system.
- the image coordinate index includes, for example, coordinate information, in the image coordinate system, of a point lying in the respective reference area.
- the image coordinate index includes coordinate information of the reference point in the image coordinate system.
- the correlation between the world coordinate indices and the image coordinate indices corresponds to the transformation relationship between the world coordinate system and the image coordinate system.
- the calibration device according to the invention is thus designed to determine a correlation between the world coordinate system and the image coordinate system, based on which coordinates of one of the two coordinate systems can be translated into the other of the two coordinate systems and/or vice versa.
- the correlation can be determined by the image processing unit.
- the correlation can alternatively be determined by an external processing unit, for example a cloud system, to which the object-side and image-side identification indices can be supplied and which determines the correlation based on these.
- the determined correlation can be used to determine spatial parameters such as positions and / or orientations of the sensor or the sensors in the world coordinate system (so-called extrinsic parameters).
- the determined correlation can be further optimized in a further information processing step by means of a non-linear optimization to be carried out by the image processing unit or an additional processing unit.
- so-called intrinsic parameters can be used by the image processing unit or the additional computing unit in order to determine the extrinsic parameters of the sensor.
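As an illustration, determining the extrinsic parameters from known intrinsic parameters and the determined correspondences amounts to a perspective-n-point problem. The sketch below uses OpenCV's solvePnP; the correspondences, intrinsic matrix and zero distortion are assumptions for demonstration only:

```python
import numpy as np
import cv2

# Object-side reference points (world coordinate indices) of a planar target,
# in metres; values are illustrative.
object_points = np.array([[0.0, 0.0, 0.0],
                          [0.5, 0.0, 0.0],
                          [0.5, 0.5, 0.0],
                          [0.0, 0.5, 0.0],
                          [0.25, 0.25, 0.0]], dtype=np.float32)

# Matching image-side reference points (image coordinate indices), in pixels.
image_points = np.array([[320.0, 240.0],
                         [420.0, 242.0],
                         [418.0, 338.0],
                         [322.0, 336.0],
                         [370.0, 289.0]], dtype=np.float32)

# Intrinsic parameters, assumed known from a prior intrinsic calibration.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assumption: negligible lens distortion

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
R, _ = cv2.Rodrigues(rvec)  # rotation matrix of the sensor pose
print("sensor orientation:\n", R, "\nsensor translation:", tvec.ravel())
```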
- the calibration of the sensor is improved by taking into account the identification indices of the reference parts.
- calibration can be carried out with sufficient accuracy and reliability even if the reference object is partially covered or only a small part of the reference parts of the reference object can be detected by the sensor.
- the hidden reference parts cannot be taken into account in the calibration.
- This information deficit can advantageously be compensated by the additional information contained in the identification indices of the non-covered reference parts.
- the calibration device according to the invention is designed to be used in an offline calibration.
- in an offline calibration, a calibration platform, for example a vehicle, with a sensor system to be calibrated is arranged at a location in which at least one reference object is in the field of view of the sensor system.
- the image data generated by the sensor system can be fed to the device according to the invention for the purpose of calibration.
- the calibration device can be attached to the calibration platform. Alternatively, the calibration device can be arranged externally.
- the calibration device according to the invention can be used in an online calibration.
- the calibration platform is the vehicle on which the sensor system to be calibrated is attached.
- the vehicle can travel along a route in the vicinity of which at least one reference object is arranged in the field of view of the sensor system.
- the object-side identification index comprises an object-side distance, measured in the world coordinate system, of the respective object-side reference parts to a predefined object-side reference point, the image-side identification index comprising an image-side distance measured in the image coordinate system between the respective image-side reference parts and an image-side reference point determined by the image processing unit.
- the predefined object-side reference point is a reference point in the world coordinate system from which the object-side distance for the respective object-side reference parts is measured.
- the object-side reference point is preferably marked on the reference object, so that the image-side reference point can be determined in a simplified manner by the image processing unit in the image of the reference object. If the reference object assumes a circular shape, an elliptical shape or a square shape (for example with zigzag-shaped edge areas), the object-side reference point is preferably the center of the circular, elliptical or square shape.
- the object-side distance is preferably the distance from the reference point to the object-side reference point. If an object-side reference part is a reference area, the object-side distance is preferably the distance from a center point, for example the geometric center point, or a centroid of the reference area to the object-side reference point.
- the image-side reference point is a reference point in the image coordinate system from which the image-side distance for the respective image-side reference parts is measured.
- the image-side reference point can be determined, for example, by comparing the image of the reference object with a circular, elliptical or square shape generated in the image coordinate system by the image processing unit or a computing unit that interacts with it. This is advantageous if the reference object has a circular, elliptical or square shape, the outer edge of which is, for example, zigzag-shaped.
- the image-side reference point can be identified particularly easily.
- the image-side reference point can be identified as the center point of the circular, elliptical or square shape, in which the circular, elliptical or square shape corresponds most to the shape of the image of the reference object.
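One way to realize such a best-fit determination, assuming a circular reference object, is an algebraic least-squares circle fit over edge points of the imaged target; its center is then taken as the image-side reference point. This is a sketch of one possible implementation, not the method prescribed by the patent:

```python
import numpy as np

def fit_circle_center(points):
    """Algebraic (Kasa) least-squares circle fit; returns center and radius."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    b = x**2 + y**2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    return (cx, cy), np.sqrt(c + cx**2 + cy**2)

# Illustrative edge points sampled from the imaged outline of the target.
theta = np.linspace(0.0, 2.0 * np.pi, 12, endpoint=False)
pts = np.column_stack([370 + 50 * np.cos(theta), 290 + 48 * np.sin(theta)])
center, radius = fit_circle_center(pts)
print("image-side reference point:", center, "radius:", radius)
```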
- the object-side and image-side distances provide geometric information about the reference object that is used to calibrate the sensor to increase the accuracy.
- the image processing unit is designed to carry out a homographic transformation on the captured image-side reference parts.
- the homographic transformation of the reference parts on the image side serves to obtain a new image of the reference object, specifically in such a way as if the image had been recorded from a different perspective.
- the image-side reference parts are projected onto a plane to be determined by the image processing unit. In this way, unwanted visual effects such as distortions and / or rotations of the reference object in the image can at least be reduced.
- the homographic transformation can preferably be carried out before the image-side identification indices and / or the image coordinates of the image-side reference parts are detected. This is particularly advantageous because, for example, the definition of the reference point on the image side is simplified in that the shape of the reference object is retained in the new image due to the elimination of the distortions or rotations.
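Such a homographic transformation can be sketched with OpenCV, for example: a homography is estimated from a few image-side reference points and their desired positions in a rectified, fronto-parallel view, and the image (or the points themselves) are re-projected. Point values and image size are illustrative assumptions:

```python
import numpy as np
import cv2

# Image-side reference points of the distorted target, in pixels (illustrative).
src = np.array([[310, 220], [430, 235], [440, 350], [300, 340]], dtype=np.float32)
# Where these points should lie in the rectified ("new perspective") view.
dst = np.array([[300, 220], [440, 220], [440, 360], [300, 360]], dtype=np.float32)

H, _ = cv2.findHomography(src, dst, method=0)

image = np.zeros((480, 640, 3), dtype=np.uint8)   # stand-in for the sensor image
rectified = cv2.warpPerspective(image, H, (640, 480))

# Individual image-side reference parts can be re-projected the same way.
rectified_pts = cv2.perspectiveTransform(src.reshape(-1, 1, 2), H)
```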
- the predefined reference object has several, preferably at least four, colors.
- the color information of the reference object can advantageously be used to calibrate the sensor to increase the accuracy.
- the use of four colors enables calibration with reduced computing effort while maintaining reliability.
- the object-side identification index comprises an object-side color index of the respective object-side reference parts, the image-side identification index including an image-side color index of the respective image-side reference parts determined in a color index system of the sensor.
- the reference object has several colors so that a specific color is assigned to each object-side reference part.
- the reference object or its surface is divided into several reference areas, each reference area being assigned a color.
- the object-side color index of this reference part is the color of the associated reference area.
- the object-side color index of this reference part is the color of the reference area in which the reference point (for example a corner point) is located.
- the object-side color index can, for example, be a color designation (for example “red”, “green” or “blue”).
- the object-side color index can be based on an object-side color index system (such as the RGB system).
- the colors are to be understood in such a way that two different configurations of the same superordinate color with different contrast values (for example color configurations “dark blue” and “light blue” for the superordinate color “blue”) are to be regarded as two different colors.
- the image-side color indices are preferably created when the object-side color indices are detected by the sensor.
- the color indices on the image side are thus preferably based on a color index system of the sensor, for example an RGB system. The calibration of the sensor is therefore more precise thanks to the color information.
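A minimal sketch of such a color index assignment, assuming four nominal target colors and a nearest-neighbor match in the sensor's RGB color index system (the concrete color names and values are assumptions, not taken from the patent):

```python
import numpy as np

# Object-side color index system: nominal RGB values of the target's colors.
OBJECT_COLORS = {
    "red":        (200, 30, 30),
    "green":      (30, 180, 60),
    "dark blue":  (20, 30, 120),    # two contrast configurations of "blue"
    "light blue": (120, 160, 230),  # count as two different colors
}

def image_side_color_index(pixel_rgb):
    """Assign the closest object-side color index to a measured pixel value."""
    pixel = np.asarray(pixel_rgb, dtype=float)
    return min(OBJECT_COLORS,
               key=lambda name: np.linalg.norm(pixel - OBJECT_COLORS[name]))

print(image_side_color_index((25, 40, 110)))  # -> "dark blue"
```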
- the image processing unit is designed to determine an image-side sequence of the image-side identification indices along a course of the image-side reference parts.
- the image processing unit is designed, for example, to first define the course of the image-side reference parts and then to determine the image-side sequence of the image-side identification indices.
- the image coordinate indices of the image-side reference parts define the course in the image coordinate system.
- the course can be in the form of a numbering of the image-side reference parts.
- the image processing unit is designed to receive an object-side sequence of the object-side identification indices along a course of the object-side reference parts.
- the world coordinate indices of the object-side reference parts define the course in the world coordinate system.
- the course can be present in the form of a numbering of the object-side reference parts.
- the image processing unit is designed to compare the image-side sequence with the object-side sequence in order to determine a correlation value between the image-side and object-side sequence.
- the correlation value indicates the extent to which the object-side identification indices of the object-side sequence correspond to the image-side identification indices of the image-side sequence.
- the correlation value is based, for example, on the number of object-side identification indices in the object-side sequence, the sequence of which corresponds to a sequence of the image-side identification indices of the image-side sequence.
- the correlation value indicates the extent to which the image-side sequence of the image-side distances can be obtained in terms of amount (i.e. dimensionless) by multiplying the object-side sequence of the object-side distances by a common factor.
- the image processing unit is preferably designed to display the object-side and image-side distances in each case as a unit of an object-side or image-side distance. In this case, a comparison between the object-side distances and the image-side distances is dimensionless and simplified.
- the image processing unit is designed to determine the correlation value for several object-side sequences in each case along an associated course of the object-side reference parts in order to identify a maximum and / or an extreme of the correlation value.
- the number of reference parts on the image side can be less than the number of reference parts on the object side. This is the case, for example, when the reference object is only partially in the field of view of the sensor during the image acquisition or the reference object is partially covered by an obstacle.
- the correlation value can vary depending on the object-side sequence. The maximum or extreme indicates a high degree of correspondence between the image-side sequence and the object-side sequence. This reduces the risk of an incorrect choice of the object-side sequence on which the correlation to be determined between the world coordinate system and the image coordinate system is based. The calibration is therefore more reliable.
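The search for the maximum of the correlation value can be sketched as follows, assuming color indices that have already been translated into a common index system: every contiguous, cyclically continued object-side course is tried against the shorter image-side sequence, and the course with the highest correlation value is kept (index names are illustrative):

```python
def correlation_value(image_seq, object_seq):
    """Number of positions at which the two index sequences agree."""
    return sum(a == b for a, b in zip(image_seq, object_seq))

def best_object_course(image_seq, object_indices):
    """Try every cyclically continued object-side course of the same length
    as the image-side sequence and keep the best-matching one."""
    n, m = len(object_indices), len(image_seq)
    candidates = [(object_indices[i:] + object_indices[:i])[:m] for i in range(n)]
    best = max(candidates, key=lambda c: correlation_value(image_seq, c))
    return best, correlation_value(image_seq, best)

# Illustrative color-index sequences (cf. FB1..FB5 and FO1..FO11 in the text);
# the image-side indices are assumed already translated to the object-side system.
object_side = ["FO1", "FO2", "FO3", "FO4", "FO5", "FO6",
               "FO7", "FO8", "FO9", "FO10", "FO11"]
image_side = ["FO3", "FO4", "FO5", "FO6", "FO7"]  # only five parts visible
course, score = best_object_course(image_side, object_side)
print(course, score)  # -> ['FO3', 'FO4', 'FO5', 'FO6', 'FO7'] 5
```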
- the image processing unit is designed to determine the correlation between the image coordinate indices and the world coordinate indices based on an odometric correction factor.
- Odometric data are preferably fed in parallel to the image data or as part of the image data via the input interface of the image processing unit.
- the detection of the reference object from the image data can also take into account the odometric data, so that the calibration is fail-safe from an early phase on.
- the image data contain a plurality of images which are each recorded synchronously by the sensor and by at least one further sensor at a plurality of times.
- the sensors take an image synchronously.
- the image processing unit is preferably designed to process the plurality of images in such a way that the calibration is performed taking into account the temporal sequence of the images.
- the temporal sequence of the images provides additional information that can be used when recognizing the reference parts on the image side and their spatial and temporal information.
- the image processing unit comprises an artificial neural network module.
- the artificial neural network module (ANN module) is based on an artificial neural network (ANN), which comprises one or more algorithms.
- the algorithms are trained with data in such a way that the ANN is suitable to be used in the image processing unit. This enables a reliable calibration method.
- the artificial neural network module is designed to classify the reference object against a background in the image.
- the classification preferably takes place by means of image segmentation, preferably semantic segmentation, which is particularly simple and precise.
- the artificial neural network module is designed to detect the image-side identification indices and / or the image coordinate indices and / or to determine the correlation between the image coordinate indices and the world coordinate indices.
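Purely as an illustration, such a segmentation step could be built on a standard encoder-decoder network; the sketch below instantiates a torchvision DeepLabV3 model with two classes (background vs. reference object). The architecture choice and tensor shapes are assumptions, the patent does not prescribe a specific network:

```python
import torch
from torchvision.models.segmentation import deeplabv3_resnet50

# Two classes: background (0) vs. reference object (1). The weights are
# untrained here; in practice the ANN would be trained as described above.
model = deeplabv3_resnet50(weights=None, num_classes=2).eval()

image = torch.rand(1, 3, 480, 640)   # stand-in for a sensor image
with torch.no_grad():
    logits = model(image)["out"]     # shape (1, 2, 480, 640): per-class scores
mask = logits.argmax(dim=1)          # per-pixel class label
print(mask.shape, mask.unique())
```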
- the training device is designed to train an ANN so that it is able, based on image data that contain at least one sensor-generated image of a predefined reference object, to capture several image-side reference parts, each with an image-side identification index and an image coordinate index, and/or to determine, based on the object-side and image-side identification indices, a correlation between the image coordinate indices and the world coordinate indices.
- the computer program product according to the invention is designed to be loaded into a memory of a computer and comprises software code sections with which the method steps of the method according to the invention for calibrating a sensor or the method steps of the method according to the invention for training an artificial neural network are carried out when the computer program product is running on the computer.
- a program is part of the software of a data processing system, for example an evaluation device or a computer.
- Software is a collective term for programs and associated data.
- the complement to software is hardware.
- Hardware describes the mechanical and electronic equipment of a data processing system.
- a computer is an evaluation device.
- Computer program products usually comprise a sequence of instructions which, when the program is loaded, cause the hardware to carry out a specific method that leads to a specific result. If the program in question is used on a computer, the computer program product produces a technical effect, namely to increase the accuracy and reliability of the calibration of the sensor.
- the computer program product according to the invention is platform independent. That means it can run on any computing platform.
- the computer program product is preferably executed on a device according to the invention for calibrating a sensor for a vehicle or on a device according to the invention for training an artificial neural network.
- the software code sections are written in any programming language, for example in Python.
- the invention is illustrated by way of example in the figures. In the figures:
- FIG. 1 shows a schematic representation of an inventive device for calibrating a sensor for a vehicle according to an exemplary embodiment
- FIG. 2 shows a schematic representation of a sensor system comprising a plurality of sensors
- FIG. 3 shows a schematic representation of a reference object comprising a plurality of reference points
- FIG. 4 shows a schematic representation of an image of the reference object from FIG. 3 taken by one of the sensors from FIG. 2;
- FIG. 5 shows a schematic representation of a reference object comprising a plurality of reference points
- FIG. 6 shows a schematic illustration of an image of the reference object from FIG. 5 taken by one of the sensors from FIG. 2;
- FIG. 7 shows a schematic representation of a method for calibrating the sensor according to an exemplary embodiment
- Fig. 8 is a schematic representation of a sensor device according to an exemplary embodiment.
- FIG. 1 shows a schematic representation of a device 10 according to the invention for calibrating a sensor 22, 24, 26, 28 for a vehicle 21 according to an exemplary embodiment.
- the sensor 22, 24, 26, 28 and the vehicle 21 are shown schematically in FIG. 2.
- the device 10 has an input interface 12 for inputting image data 18 which contain at least one image, but here, for example, several images 19-1, 19-2, 19-3,... 19-n.
- the image 19-1, 19-2, 19-3, ... 19-n is recorded by the sensor 22, 24, 26, 28 and comprises a predefined colored reference object 34, 42 with several object-side reference parts P1-P11, J1-J9, as shown schematically in FIGS. 3 and 5 by way of example.
- Each of the object-side reference parts P1-P11, J1-J9 is assigned a world coordinate index and an object-side identification index.
- the device 10 also has an image processing unit 14 which is designed to detect several image-side reference parts Q1-Q5, K1-K7 (see FIGS. 4 and 6) going back to the object-side reference parts P1-P11, J1-J9.
- the image processing unit 14 is designed to detect an image coordinate index and an image-side identification index for each of the image-side reference parts Q1-Q5, K1-K7 in order to determine, based on the object-side identification indices and the image-side identification indices, a correlation between the image coordinate indices of the image-side reference parts Q1-Q5, K1-K7 and the world coordinate indices of the associated object-side reference parts P1-P11, J1-J9.
- the device 10 furthermore has an output interface 16 for outputting the determined correlation to an external entity.
- downstream of the output interface 16, a further unit for performing further information processing steps can be provided.
- the further unit can be designed to carry out a non-linear optimization of the determined correlation and / or to obtain extrinsic parameters for the sensor 22, 24, 26, 28 to be calibrated from the determined correlation using intrinsic parameters.
- FIG. 2 shows a schematic representation of a sensor system 25 comprising a plurality of sensors 22, 24, 26, 28.
- the sensors 22, 24, 26, 28 can comprise a camera, a radar sensor, a lidar sensor and / or an ultrasonic sensor.
- the reference object 34 is partially in the field of view of the sensor 22.
- the detection of the reference object 34, 42 results in an image 19 of the reference object 34, 42, which is shown schematically in more detail in FIG. 4.
- FIG. 3 shows a schematic representation of a reference object 34.
- the object-side reference parts are shown as reference points P1-P11 by way of example.
- the reference points P1-P11 are each located in a reference area R1-R11 into which the surface of the reference object 34 is divided.
- the reference areas R1-R11 are shown by way of example. At least four colors can be used, whereby adjacent reference areas R1-R11 can have the same or different colors.
- An associated world coordinate index is assigned to each of the reference points P1-P11 based on their position in a world coordinate system 36.
- the world coordinate system 36 is shown by way of example in FIG. 3 as a Cartesian coordinate system. Alternatively, another, for example a spherical, coordinate system can also be used for this purpose.
- an object-side color index is assigned to each of the reference points P1-P11.
- the color of a reference area R1-R11 is assigned to the associated reference point P1-P11.
- FIG. 4 shows a schematic representation of the image 19 of the reference object 34 from FIG. 3 recorded by the sensor 22, 24, 26, 28 from FIG. 2.
- a part of the image-side reference parts can be seen in image 19 as reference points Q1-Q5 with their associated reference areas T1-T5.
- Each of the image-side reference points Q1-Q5 corresponds to one of the object-side reference points P1-P11, and each of the image-side reference areas T1-T5 corresponds to one of the object-side reference areas R1-R11.
- an image coordinate index is identified in an image coordinate system 38 when the reference object 34 is detected by the sensor 22.
- a coordinate system with three image axes for the image coordinates u, v and w is shown as the image coordinate system.
- any desired, for example a two-dimensional image coordinate system can be used.
- an image-side color index that goes back to the object-side color indices of the object-side reference points P1-P11 is identified when the reference object 34 is detected by the sensor 22. This is done on the basis of the sensor's own color index system, such as the RGB system.
- the image processing unit 14 is designed to extract (detect) the image coordinate indices and the image-side color indices of the image-side reference points Q1-Q5 from the image data 18.
- FIG. 5 shows a schematic representation of a further reference object 42.
- the object-side reference parts are shown by way of example as reference points J1-J9.
- the reference points J1-J9 each lie in a reference area M1-M9 into which the surface of the reference object 42 is divided.
- the reference areas M1-M9 are shown by way of example in FIG. 5.
- An associated world coordinate index is assigned to each of the reference points J1-J9 based on their position in the world coordinate system 36.
- the world coordinate system 36 is shown in Fig. 3 by way of example as a Cartesian coordinate system. Alternatively, a different, for example a spherical, coordinate system can be used for this purpose.
- an object-side distance d1, d5, d7 is assigned to each of the reference points J1-J9. For reasons of clarity, not all object-side distances are marked with a reference symbol in FIG. 5.
- the respective object-side distance d1, d5, d7 is measured in the world coordinate system 36, this being the distance between an object-side center point A and the respective object-side reference point J1-J9.
- the object-side center point A is here, for example, the center of a circle 44 from which the reference areas M1-M9 extend radially outward to the respective reference point J1-J9 with decreasing width.
- FIG. 6 shows a schematic representation of the image 27 of the reference object 42 from FIG. 5 recorded by the sensor 22, 24, 26, 28 from FIG. 2.
- the image 27 shows the part of the reference object 42 which is in the field of view of the sensor 22.
- a part of the image-side reference parts can be seen in image 27 as reference points K1-K7 with their associated reference areas N1-N7.
- Each of the image-side reference points K1-K7 corresponds to one of the object-side reference points J1-J9, and each of the image-side reference areas N1-N7 corresponds to one of the object-side reference areas M1-M9.
- an image coordinate index is identified in an image coordinate system 38 when the reference object 42 is detected by the sensor 22.
- a coordinate system with three image axes for the image coordinates u, v and w is shown as the image coordinate system.
- this is not limitative of the present invention. Any desired, for example a two-dimensional image coordinate system can be used.
- an image-side distance l1, l5, l7 that goes back to the object-side distances of the object-side reference points J1-J9 is detected by the image processing unit 14.
- the respective image-side distance l1, l5, l7 is measured in the image coordinate system 38, this being the distance between an image-side center point B and the respective reference point K1-K7.
- the image-side center point B is shown by way of example as the center point of the circle 46 of the image of the reference object 42.
- the circle 46 here goes back to the circle 44 of the reference object 42 in FIG. 5.
- the image processing unit 14 is designed to extract (detect) the image coordinate indices and the image-side distances of the image-side reference points K1-K7 from the image data 18.
- FIG. 7 schematically shows a method 100 according to the invention for calibrating the sensor 22, 24, 26, 28.
- image data 18 are entered which contain at least one image 19-1, 19-2, 19-3, ..., 19-n, recorded by a sensor 22, 24, 26, 28, of a predefined reference object 34, 42.
- the reference object 34, 42 comprises several object-side reference parts P1-P11, J1-J9, each of which is assigned a world coordinate index.
- several image-side reference parts Q1-Q5, K1-K7 are acquired from the image data, which go back to the several object-side reference parts P1-P11, J1-J9.
- an image coordinate index and an image-side identification index are detected from the image data for each of the captured image-side reference parts Q1-Q5, K1-K7.
- the image-side identification indices are output.
- a correlation between the image coordinate indices of the image-side reference parts Q1-Q5, K1-K7 and the world coordinate indices of the object-side reference parts P1-P11, J1-J9 is then determined.
- the image processing unit 14 is preferably designed to determine an image-side sequence of the image-side identification indices.
- This image-side sequence is the sequence of the image-side identification indices of the image-side reference points.
- this image-side sequence corresponds, for example, to the (image-side) course of the image-side reference points Q1-Q2-Q3-Q4-Q5 (in this order).
- the associated image-side identification indices here are image-side color indices FB1, FB2, FB3, FB4, FB5. The image-side sequence for the example shown in FIG. 4 is therefore the color index sequence FB1-FB2-FB3-FB4-FB5.
- the image-side sequence corresponds, for example, to the image-side course K1-K2-K3-K4-K5-K6-K7 (in this order).
- the associated image-side identification indices here are image-side distances l1, l2, l3, l4, l5, l6, l7. Therefore, the image-side sequence for the example shown in FIG. 6 is the distance sequence l1-l2-l3-l4-l5-l6-l7.
- the image processing unit 14 is also preferably designed to determine an object-side sequence of the object-side identification indices.
- the object-side sequence for the reference object 34 shown in FIG. 3 corresponds, for example, to any (object-side) course of the object-side reference points P1-P11, the number of object-side reference points in the course preferably corresponding to the number of image-side reference points Q1-Q5, i.e. five.
- the object-side sequence can correspond to the object-side course P1-P2-P3-P4-P5 (in this order).
- the associated object-side identification indices are object-side color indices FO1, FO2, FO3, FO4, FO5.
- the exemplary object-side sequence for the example shown in FIG. 3 is therefore the color index sequence FO1-FO2-FO3-FO4-FO5.
- the object-side sequence for the reference object 42 shown in FIG. 5 corresponds, for example, to any (object-side) course of the object-side reference points J1-J9, the number of object-side reference points in the course preferably corresponding to the number of image-side reference points K1-K7 and thus being seven.
- the object-side sequence can correspond to the object-side course J1-J2-J3-J4-J5-J6-J7 (in this order).
- the associated object-side identification indices are object-side distances d1, d2, d3, d4, d5, d6, d7.
- the object-side sequence for the example shown in FIG. 3 can also be FO6-FO7-FO8-FO9-FO10.
- the object-side sequence for the example shown in FIG. 5 can be d3-d4-d5-d6-d7-d8-d9.
- the image processing unit 14 is preferably designed to compare the image-side sequence with each of the multiple possible object-side sequences and to determine a correlation value in each case.
- the correlation value is preferably measured by the extent to which an object-side sequence corresponds to the image-side sequence.
- the image-side color index sequence FB1-FB2-FB3-FB4-FB5 is compared with the object-side color index sequence FO1-FO2-FO3-FO4-FO5.
- the image-side and object-side color indices are preferably translated into a common color index system so that a direct comparison is possible. If the order of the two color index sequences is the same for all five color indices, the correlation value is the highest. If the order of the two color index sequences is the same for four of the five color indices, the correlation value is lower. The higher the number of color indices in which the two color index sequences match one another, the higher the correlation value.
- the image-side distance sequence l1-l2-l3-l4-l5-l6-l7 is compared with the object-side distance sequence d3-d4-d5-d6-d7-d8-d9.
- the two distance sequences are preferably compared with one another in terms of absolute value or dimensionlessly, so that a direct comparison between the distance sequences is possible. If the two distance sequences agree with one another in their order for all seven distances, except for a factor common to all seven values, the correlation value is the highest. If the two distance sequences agree in their order in six of the seven distances, except for the common factor, the correlation value is lower. The higher the number of distances in which the two distance sequences match except for a common factor, the higher the correlation value.
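Such a dimensionless comparison can be sketched by estimating the common scale factor from the measured ratios and counting how many distances agree with the correspondingly scaled object-side values; tolerance and numerical values are illustrative assumptions:

```python
import numpy as np

def scale_invariant_match(image_dists, object_dists, rtol=0.05):
    """Count positions where the two distance sequences agree up to one
    common factor, estimated here as the median of the element-wise ratios."""
    img = np.asarray(image_dists, dtype=float)
    obj = np.asarray(object_dists, dtype=float)
    factor = np.median(img / obj)  # common image/object scale factor
    return int(np.sum(np.isclose(img, factor * obj, rtol=rtol)))

image_side = [12.0, 15.1, 18.0, 21.2, 24.0, 26.9, 30.0]   # l1..l7, in pixels
object_side = [0.40, 0.50, 0.60, 0.70, 0.80, 0.90, 1.00]  # d3..d9, in metres
print(scale_invariant_match(image_side, object_side))     # -> 7 (full match)
```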
- the image processing unit 14 is preferably designed to change the object-side course by one or more object-side reference parts after determining the correlation value and then to determine a correlation value corresponding to the changed object-side course. This process can continue iteratively until a maximum or extreme of the correlation value is determined.
- the object-side sequence with the highest correlation value is preferably used as the basis for determining the correlation.
- the image coordinate indices of the image-side reference parts are related to the world coordinate indices of the object-side reference parts of this particular object-side sequence in order to obtain a transformation relationship between the two coordinate systems.
- FIG. 8 schematically shows a sensor device 50 comprising the sensor system 25 shown in FIG. 2 with the multiple sensors 22, 24, 26, 28 and the calibration device 10 shown in FIG. 1 for calibrating the sensors 22, 24, 26, 28.
- the sensor device 50 is attached to a calibration platform 23, which preferably includes a vehicle. An offline and / or online calibration is thus possible by means of the calibration device 10 according to the invention.
Abstract
The invention relates to a device (10) for calibrating a sensor (22, 24, 26, 28) for a vehicle (21, 23). The device according to the invention comprises an input interface (12) for inputting image data (18) which contain at least one image, recorded by a sensor (22, 24, 26, 28), of a predefined reference object (34, 42). The reference object (34, 42) comprises several object-side reference parts (P1-P11, J1-J9), to each of which a world coordinate index and an object-side identification index are assigned. The device also comprises an image processing unit (14) for detecting several image-side reference parts (Q1-Q5, K1-K7) corresponding to the object-side reference parts (P1-P11, J1-J9). The image processing unit (14) is designed to detect, for each of the image-side reference parts (Q1-Q5, K1-K7), an image coordinate index and an image-side identification index linked to the respective object-side identification index, in order to determine, on the basis of the object-side and image-side identification indices, a correlation between the image coordinate indices and the world coordinate indices. The device according to the invention also comprises an output interface (16) for outputting the detected image-side identification indices.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE102019201633.2 | 2019-02-08 | ||
| DE102019201633.2A DE102019201633A1 (de) | 2019-02-08 | 2019-02-08 | Kalibrierung eines Sensors für ein Fahrzeug basierend auf objektseitigen und bildseitigen Identifikationsindizes eines Referenzobjektes |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020160861A1 (fr) | 2020-08-13 |
Family
ID=69156429
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2020/050343 (WO2020160861A1, ceased) | Calibration of a sensor for a vehicle based on object-side and image-side identification indices of a reference object | 2019-02-08 | 2020-01-09 |
Country Status (2)
| Country | Link |
|---|---|
| DE (1) | DE102019201633A1 (fr) |
| WO (1) | WO2020160861A1 (fr) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2004003586A1 (fr) * | 2002-06-29 | 2004-01-08 | Robert Bosch Gmbh | Procede et dispositif d'etalonnage de capteurs dans un vehicule automobile |
| WO2016054004A1 (fr) * | 2014-09-30 | 2016-04-07 | Sikorsky Aircraft Corporation | Système de vérification d'étalonnage de capteur en ligne |
| WO2018000037A1 (fr) | 2016-06-29 | 2018-01-04 | Seeing Machines Limited | Systèmes et procédés d'identification de la pose de caméras dans une scène |
| WO2018182737A1 (fr) * | 2017-03-31 | 2018-10-04 | Airbus Group Hq, Inc. | Systèmes et procédés permettant d'étalonner des capteurs de véhicules |
- 2019-02-08: DE DE102019201633.2A patent/DE102019201633A1/de not_active Withdrawn
- 2020-01-09: WO PCT/EP2020/050343 patent/WO2020160861A1/fr not_active Ceased
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112815979A (zh) * | 2020-12-30 | 2021-05-18 | 联想未来通信科技(重庆)有限公司 | 一种传感器的标定方法及装置 |
| CN112815979B (zh) * | 2020-12-30 | 2023-11-21 | 联想未来通信科技(重庆)有限公司 | 一种传感器的标定方法及装置 |
Also Published As
| Publication number | Publication date |
|---|---|
| DE102019201633A1 (de) | 2020-08-13 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20700358; Country of ref document: EP; Kind code of ref document: A1 |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 20700358; Country of ref document: EP; Kind code of ref document: A1 |