
WO2019188820A1 - Information transmission device, data structure, control method, program, and storage medium - Google Patents


Info

Publication number
WO2019188820A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
accuracy
vehicle
estimated
measurement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2019/012176
Other languages
English (en)
Japanese (ja)
Inventor
加藤 正浩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pioneer Corp
Original Assignee
Pioneer Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pioneer Corp filed Critical Pioneer Corp
Publication of WO2019188820A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/38 - Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3863 - Structures of map data
    • G01C21/387 - Organisation of map data, e.g. version management or database structures
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00 specially adapted for navigation in a road network
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/38 - Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 - Creation or updating of map data
    • G01C21/3859 - Differential updating map data
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/09 - Arrangements for giving variable traffic instructions
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/123 - Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles; Managing passenger vehicles circulating according to a fixed timetable, e.g. buses, trains, trams
    • G08G1/127 - Traffic control systems for road vehicles indicating the position of vehicles to a central station; Indicators in a central station
    • G08G1/13 - Traffic control systems for road vehicles indicating the position of vehicles to a central station, the indicator being in the form of a map
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 - Maps; Plans; Charts; Diagrams, e.g. route diagram

Definitions

  • The present invention relates to a technique for updating a map.
  • In Patent Literature 1, when a change point of a partial map is detected based on the output of a sensor mounted on a moving body such as a vehicle, a driving support device transmits change point information regarding the change point to a server device.
  • Patent Literature 2 discloses a vehicle position estimation technique using a Kalman filter.
  • Non-Patent Literature 1 discloses specifications for a data format used to collect data detected by vehicle-side sensors on a cloud server.
  • The present invention has been made to solve the above problems, and its main purpose is to provide an information transmission apparatus capable of transmitting data suitable for the processing that generates position information of an object to be registered in a map, together with a data structure for such data.
  • The invention described in the claims is an information transmission device including: a generation unit that generates object measurement information including object position information of an object measured by a measurement device mounted on a moving body, based on estimated position information indicating a position estimated for the moving body; an acquisition unit that acquires accuracy information indicating the estimation accuracy of the estimated position information; and a transmission unit that transmits the object measurement information and the accuracy information to an information processing device.
  • The invention described in the claims is also a data structure of data transmitted to an information processing apparatus that collects object measurement information on an object measured by a measurement apparatus mounted on a moving body. The data include the object measurement information, which contains object position information of the object generated based on estimated position information indicating a position estimated for the moving body, together with accuracy information indicating the estimation accuracy of the estimated position information, and are used by the information processing apparatus to determine the position of the object on the map.
  • The invention described in the claims is also an information transmission device including: a generation unit that generates object measurement information including object position information of an object measured by a measurement device mounted on a moving body; an acquisition unit that acquires accuracy information indicating the estimation accuracy of the position of the moving body estimated when the object was measured; and a transmission unit that transmits the object measurement information and the accuracy information to the information processing apparatus.
  • The invention described in the claims is also a control method executed by the information transmission device, including: a generating step of generating object measurement information including object position information of an object measured by a measurement device mounted on the moving body, based on estimated position information indicating a position estimated for the moving body; an acquiring step of acquiring accuracy information indicating the estimation accuracy of the estimated position information; and a transmitting step of transmitting the object measurement information and the accuracy information to the information processing apparatus.
  • Functional blocks of the own vehicle position estimation unit in landmark-based position estimation are shown.
  • Functional blocks of the own vehicle position estimation unit in point cloud-based position estimation are shown.
  • An example of the schematic data structure of voxel data is shown.
  • A graph showing the transition of the accuracy information value and the transition of the standardized accuracy information value is shown.
  • In one embodiment, the information transmission device has: a generation unit that generates object measurement information including object position information of an object measured by a measurement device mounted on the moving body, based on estimated position information indicating a position estimated for the moving body; an acquisition unit that acquires accuracy information indicating the estimation accuracy of the estimated position information; and a transmission unit that transmits the object measurement information and the accuracy information to an information processing device.
  • When the information processing apparatus updates the map based on object measurement information measured by the moving body, the information transmission apparatus can suitably transmit to the information processing apparatus, together with the object measurement information, accuracy information that serves as an index of the accuracy and reliability of the object measurement information.
  • In one aspect, the acquisition unit generates standardized accuracy information in which the estimation accuracy is standardized based on the average and standard deviation of the estimation accuracy over a predetermined period including the time point when the position of the moving body is estimated, and the transmission unit transmits the object measurement information and the standardized accuracy information to the information processing apparatus.
  • In this aspect, even when accuracy information having different averages and variations is acquired depending on the position estimation method or the like, the information transmission apparatus can suitably transmit to the information processing apparatus standardized accuracy information whose average and variation are uniform.
  • In another aspect, the position of the moving body is estimated by a weighted average of position estimation results obtained by a plurality of methods. The acquisition unit generates standardized accuracy information for each of the position estimation results, averages the standardized accuracy information with the same weighting as the weighted average, and the transmission unit transmits the object measurement information and the averaged standardized accuracy information to the information processing apparatus.
  • In this aspect, the information transmission device can suitably generate standardized accuracy information and transmit it to the information processing device even when the position of the moving body is estimated by weighted averaging of position estimation results obtained by a plurality of methods.
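As a concrete illustration of the weighted averaging described above, the following sketch fuses per-method standardized accuracy values with the same weights used for the position fusion. All names and numbers are illustrative assumptions, not taken from the patent text.

```python
# Hedged sketch: averaging standardized accuracy values from several
# position-estimation methods with the weights used for the position fusion.
def fuse_standardized_accuracy(std_accuracies, weights):
    """std_accuracies[i]: standardized accuracy of method i;
    weights[i]: weight of method i in the weighted-average position fusion."""
    return sum(w * z for w, z in zip(weights, std_accuracies)) / sum(weights)

# e.g. landmark-based, point cloud-based, and GNSS-based results
fused = fuse_standardized_accuracy([-1.0, -0.5, 1.0], [0.5, 0.3, 0.2])  # about -0.45
```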
  • Another embodiment is a data structure of data transmitted to an information processing apparatus that collects object measurement information on an object measured by a measurement apparatus mounted on the moving body. The data include the object measurement information, which contains object position information of the object generated based on estimated position information indicating a position estimated for the moving body, together with accuracy information indicating the estimation accuracy of the estimated position information, and are used by the information processing apparatus to determine the position of the object on the map.
  • With this data structure, the information processing apparatus can refer to the accuracy information, which serves as an index of the accuracy and reliability of the received object measurement information, and can thereby suitably determine the position of the object on the map.
  • In another embodiment, the information transmission device has: a generation unit that generates object measurement information including object position information of an object measured by a measurement device mounted on a moving body; an acquisition unit that acquires accuracy information indicating the estimation accuracy of the position of the moving body estimated when the object was measured; and a transmission unit that transmits the object measurement information and the accuracy information to an information processing device.
  • In this embodiment as well, the information transmission device can suitably transmit to the information processing device, together with the object measurement information, accuracy information that serves as an index of the accuracy and reliability of the object measurement information.
  • Another embodiment is a control method executed by the information transmission apparatus, including: a generating step of generating object measurement information including object position information of an object measured by a measurement device mounted on the moving body, based on estimated position information indicating a position estimated for the moving body; an acquiring step of acquiring accuracy information indicating the estimation accuracy of the estimated position information; and a transmitting step of transmitting the object measurement information and the accuracy information to the information processing apparatus.
  • By executing this control method, the information transmission apparatus can suitably transmit accuracy information, which serves as an index of the accuracy and reliability of the object measurement information, to the information processing apparatus together with the object measurement information.
  • In another embodiment, a program causes a computer to execute the control method described above.
  • By executing this program, the computer functions as the information transmission device described above.
  • Preferably, the program is stored in a storage medium.
  • FIG. 1 shows a schematic configuration of a map update system according to the present embodiment.
  • The map update system includes in-vehicle devices 1, each moving together with a vehicle that is a moving body, and a server device 6 that communicates with each in-vehicle device 1 via a network. The map update system updates the distribution map DB 20, a map for distribution held by the server device 6, based on the information transmitted from each in-vehicle device 1.
  • Here, the "map" includes not only data referred to by a conventional in-vehicle device for route guidance but also data used for ADAS (Advanced Driver Assistance Systems) and automatic driving.
  • The in-vehicle device 1 is electrically connected to the lidar 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5. Based on their outputs, it detects predetermined objects and estimates the position of the vehicle in which it is mounted (also referred to as "the vehicle position"); the in-vehicle device 1 may also perform automatic driving control of the vehicle based on these outputs.
  • The in-vehicle device 1 stores a map DB (database) 10, and estimates the own vehicle position by collating the output of the lidar 2 and the like against this map DB 10.
  • When a predetermined object is detected, the in-vehicle device 1 transmits upload information "Iu" including information on the detected object to the server device 6.
  • The in-vehicle device 1 is an example of an information transmission device.
  • The lidar 2 discretely measures distances to objects in the outside world by emitting pulsed laser light over a predetermined angular range in the horizontal and vertical directions, and generates three-dimensional point cloud information indicating the positions of those objects.
  • The lidar 2 includes an irradiation unit that irradiates laser light while changing the irradiation direction, a light receiving unit that receives the reflected light (scattered light) returned by an object, and an output unit that outputs scan data based on the light reception signal output by the light receiving unit.
  • The scan data is point cloud data, generated from the irradiation direction corresponding to the laser light received by the light receiving unit and the distance to the object in that irradiation direction, which is specified based on the light reception signal.
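Each scan point is thus formed from an irradiation direction and a measured distance. A minimal sketch of that conversion (sensor-frame polar coordinates to a 3D point), assuming a simple range/azimuth/elevation parameterization that the text does not spell out:

```python
import math

def polar_to_cartesian(r, azimuth, elevation):
    """r: measured distance [m]; azimuth, elevation: irradiation direction [rad]."""
    x = r * math.cos(elevation) * math.cos(azimuth)
    y = r * math.cos(elevation) * math.sin(azimuth)
    z = r * math.sin(elevation)
    return (x, y, z)

# one return at 10 m, 30 degrees to the side, level with the sensor
point = polar_to_cartesian(10.0, math.radians(30), 0.0)
```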
  • The lidar 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5 each supply their output data to the in-vehicle device 1.
  • The server device 6 receives the upload information Iu from each in-vehicle device 1 and stores it. The server device 6 updates the distribution map DB 20 based on the collected upload information Iu, and transmits download information Id including update information of the distribution map DB 20 to each in-vehicle device 1.
  • The server device 6 is an example of a map data generation device.
  • FIG. 2A is a block diagram showing a functional configuration of the in-vehicle device 1.
  • The in-vehicle device 1 mainly includes an interface 11, a storage unit 12, a communication unit 13, an input unit 14, a control unit 15, and an information output unit 16. These elements are connected to one another via a bus line.
  • The interface 11 acquires output data from sensors such as the lidar 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5, and supplies the data to the control unit 15.
  • The storage unit 12 stores the program executed by the control unit 15 and information necessary for the control unit 15 to execute predetermined processing.
  • The storage unit 12 stores the map DB 10, which includes object information IO, landmark information IL, and voxel data IB.
  • The object information IO is information on obstacles, holes, broken-down vehicles, and the like existing on or around the road, and includes attribute information such as the position, size, and shape of each object.
  • The landmark information IL is information on features on or around the road that serve as landmarks.
  • Examples of landmarks include a kilometer post, a 100 m post, a delineator, and a traffic infrastructure facility (e.g., a sign or direction board).
  • The landmark information IL is used in the landmark-based position estimation described later.
  • A landmark whose position or shape has changed due to an accident or the like is also registered in the object information IO.
  • The voxel data IB is information on point cloud data indicating measured positions of stationary structures, held for each unit region (also referred to as a "voxel") obtained by dividing three-dimensional space into a plurality of regions.
  • The voxel data IB is used in the point cloud-based position estimation described later.
  • The communication unit 13 transmits the upload information Iu and receives the download information Id under the control of the control unit 15.
  • The input unit 14 is a button, touch panel, remote controller, voice input device, or the like for user operation.
  • The information output unit 16 is, for example, a display or speaker that produces output under the control of the control unit 15.
  • The control unit 15 includes a CPU that executes programs, and controls the entire in-vehicle device 1.
  • The control unit 15 includes an own vehicle position estimation unit 17 and an upload control unit 18.
  • The own vehicle position estimation unit 17 estimates the own vehicle position with high accuracy by using a plurality of position estimation methods selectively or in combination.
  • The own vehicle position estimation unit 17 performs position estimation using landmark measurement results and the landmark position information recorded in the landmark information IL (also referred to as "landmark-based position estimation"), position estimation using the voxel data (also referred to as "point cloud-based position estimation"), and position estimation using a global navigation satellite system (also referred to as "GNSS-based position estimation").
  • The own vehicle position estimation unit 17 performs the landmark-based and point cloud-based position estimation based on the output of the lidar 2, and the GNSS-based position estimation based on the output of the GPS receiver 5.
  • The own vehicle position estimation unit 17 also generates information on the estimation accuracy of the own vehicle position (also referred to as "accuracy information") and stores it in the storage unit 12 or the like. Details of the landmark-based and point cloud-based position estimation and of the method of generating the accuracy information will be described later.
  • When a predetermined object is detected based on the output of an external sensor such as the lidar 2, the upload control unit 18 generates upload information Iu including information on the detected object and transmits it to the server device 6.
  • The information on the object includes at least the position information of the object, and may further include attribute information such as identification information, shape, and size of the object.
  • The upload control unit 18 also generates standardized accuracy information by standardizing the accuracy information calculated by the own vehicle position estimation unit 17 so that it has a predetermined average value and standard deviation value, includes the standardized accuracy information in the upload information Iu, and transmits it to the server device 6. The standardized accuracy information will be described later.
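The patent does not fix a wire format for the upload information Iu; as a purely hypothetical sketch, a JSON encoding of one upload message carrying object measurement information plus standardized accuracy information might look like this (all field names are invented for illustration):

```python
import json

upload_info = {
    "object_measurement": {
        "object_id": "obstacle-001",                            # identification info (optional)
        "position": {"lat": 35.0, "lon": 139.0, "alt": 12.3},   # object position info
        "shape": "box",                                         # attribute info (optional)
        "size_m": [1.2, 0.8, 0.5],
    },
    # standardized accuracy of the vehicle position estimate, per axis
    "standardized_accuracy": {"x": -0.45, "y": 0.10, "z": 0.02, "yaw": -0.30},
}
payload = json.dumps(upload_info)   # serialized message sent to the server
decoded = json.loads(payload)       # what the server device 6 would parse
```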
  • The upload control unit 18 is an example of a "generation unit", a "transmission unit", and a "computer" that executes a program.
  • FIG. 2B is a block diagram showing a functional configuration of the server device 6.
  • The server device 6 mainly includes a communication unit 61, a storage unit 62, and a control unit 65. These elements are connected to one another via a bus line.
  • The communication unit 61 receives the upload information Iu and transmits the download information Id under the control of the control unit 65.
  • The storage unit 62 stores the program executed by the control unit 65 and information necessary for the control unit 65 to execute predetermined processing.
  • The storage unit 62 stores the distribution map DB 20, which has the same data structure as the map DB 10, and the upload information DB 27, a database of the upload information Iu received from each in-vehicle device 1.
  • The control unit 65 includes a CPU that executes programs, and controls the entire server device 6.
  • The control unit 65 updates the distribution map DB 20 based on the upload information DB 27, in which the upload information Iu received from each in-vehicle device 1 by the communication unit 61 is accumulated, and transmits download information Id including the generated map update information to each in-vehicle device 1 through the communication unit 61.
  • The control unit 65 is an example of a "reception unit", a "generation unit", and a "computer" that executes a program.
  • In the landmark-based position estimation, the own vehicle position estimation unit 17 corrects the own vehicle position estimated from the output data of the gyro sensor 3, the vehicle speed sensor 4, and/or the GPS receiver 5, based on the distance and angle measurements of a landmark obtained by the lidar 2 and the landmark position information extracted from the map DB 10.
  • To do so, the own vehicle position estimation unit 17 alternately executes a prediction step of predicting the own vehicle position from the output data of the gyro sensor 3, the vehicle speed sensor 4, and the like, and a measurement update step of correcting the predicted value calculated in the immediately preceding prediction step.
  • Various filters developed for Bayesian estimation can be used as the state estimation filter in these steps; examples include an extended Kalman filter, an unscented Kalman filter, and a particle filter.
  • Below, an example in which the own vehicle position estimation unit 17 performs position estimation using an extended Kalman filter is described.
  • FIG. 3 is a diagram showing the position of the vehicle to be estimated in two-dimensional orthogonal coordinates.
  • The vehicle position on a plane defined in the x-y two-dimensional orthogonal coordinates is represented by the coordinates "(x, y)" and the direction (yaw angle) "ψ" of the vehicle.
  • The yaw angle ψ is defined as the angle formed by the traveling direction of the vehicle and the x-axis.
  • In this embodiment, four state variables (x, y, z, ψ), which add the z-axis coordinate perpendicular to the x-axis and the y-axis, are used to estimate the vehicle position. Since a general road has only a gentle slope, the pitch angle and roll angle of the vehicle are basically ignored in this embodiment.
  • FIG. 4 is a diagram illustrating a schematic relationship between the prediction step and the measurement update step.
  • FIG. 5 shows an example of the functional blocks of the own vehicle position estimation unit 17. As shown in FIG. 4, by repeating the prediction step and the measurement update step, calculation and update of the estimated value of the state variable vector "X" indicating the own vehicle position are executed sequentially. As shown in FIG. 5, the own vehicle position estimation unit 17 has a position prediction unit 21 that performs the prediction step and a position estimation unit 22 that performs the measurement update step.
  • The position prediction unit 21 includes a dead reckoning block 23 and a position prediction block 24, and the position estimation unit 22 includes a landmark search/extraction block 25 and a position correction block 26.
  • The state variable vector at the reference time (i.e., current time) "k" to be calculated is represented as "X^-(k)" or "X^(k)".
  • Here, the provisional estimate (predicted value) computed in the prediction step is marked with the superscript "-" on the character representing the value, and the more accurate estimate updated in the measurement update step is marked with the superscript "^".
  • The position prediction block 24 of the control unit 15 adds the obtained movement distance and azimuth change to the state variable vector X^(k-1) at time k-1 calculated in the immediately preceding measurement update step, thereby calculating the predicted value of the state variable vector at time k (also referred to as the "predicted position") X^-(k).
  • Next, the landmark search/extraction block 25 associates a landmark position vector registered in the landmark information IL of the map DB 10 with the scan data of the lidar 2. When the association succeeds, the landmark search/extraction block 25 acquires the measurement value "Z(k)" of the associated landmark by the lidar 2 and the landmark measurement prediction value "Z^-(k)" obtained by modeling the measurement process of the lidar 2 using the predicted position X^-(k) and the landmark position vector registered in the landmark information IL.
  • The measurement value Z(k) is a vector whose components are obtained by converting the landmark distance and scan angle measured by the lidar 2 at time k into the vehicle coordinate system, whose axes are the vehicle traveling direction and the lateral direction. The position correction block 26 then multiplies the difference between the measurement value Z(k) and the measurement prediction value Z^-(k) by the Kalman gain "K(k)" and adds the result to the predicted position X^-(k), as shown in equation (1), thereby calculating the updated state variable vector (also referred to as the "estimated position") X^(k).
  • The position correction block 26 also obtains the covariance matrix P^(k) (simply denoted P(k)) corresponding to the error distribution of the estimated position X^(k) from the covariance matrix P^-(k) of the predicted position. Parameters such as the Kalman gain K(k) can be calculated in the same manner as in known self-position estimation techniques using the extended Kalman filter.
  • By repeatedly performing the prediction step and the measurement update step and sequentially calculating the predicted position X^-(k) and the estimated position X^(k), the most likely own vehicle position is calculated.
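The alternating prediction and measurement-update steps can be sketched with a minimal extended Kalman filter over the state (x, y, ψ). The motion and measurement models below are simplified stand-ins (constant speed v and yaw rate ω; a direct linear observation with matrix H), not the patent's exact formulation:

```python
import numpy as np

def predict(x_est, P_est, v, omega, dt, Q):
    """Prediction step: dead reckoning with speed v and yaw rate omega."""
    x, y, psi = x_est
    x_pred = np.array([x + v * dt * np.cos(psi),
                       y + v * dt * np.sin(psi),
                       psi + omega * dt])
    F = np.array([[1.0, 0.0, -v * dt * np.sin(psi)],   # motion-model Jacobian
                  [0.0, 1.0,  v * dt * np.cos(psi)],
                  [0.0, 0.0,  1.0]])
    return x_pred, F @ P_est @ F.T + Q

def update(x_pred, P_pred, z, H, R):
    """Measurement update: x_est = x_pred + K (z - z_pred), as in equation (1)."""
    z_pred = H @ x_pred                      # modeled measurement prediction
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain K(k)
    x_est = x_pred + K @ (z - z_pred)
    P_est = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x_est, P_est

# one prediction/update cycle with illustrative numbers
x_pred, P_pred = predict(np.zeros(3), np.eye(3) * 0.5,
                         v=1.0, omega=0.0, dt=1.0, Q=np.eye(3) * 0.01)
x_est, P_est = update(x_pred, P_pred, z=np.array([1.1, 0.0, 0.0]),
                      H=np.eye(3), R=np.eye(3) * 0.1)
```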
  • The accuracy of position estimation can be judged from the values of the diagonal elements of the covariance matrix P.
  • With the covariance matrix calculated based on the landmark measurement at time k denoted P(k), the covariance matrix P(k) is expressed by equation (2).
  • The own vehicle position estimation unit 17 regards the square roots "σ_x(k)", "σ_y(k)", "σ_z(k)", "σ_ψ(k)" of the diagonal elements of the covariance matrix P(k) as the accuracy information value "d(k)" for each state variable x, y, z, ψ.
  • Alternatively, the own vehicle position estimation unit 17 regards the square roots σ_X(k), σ_Y(k), σ_Z(k), σ_Ψ(k) of the diagonal elements after conversion into the vehicle coordinate system (X, Y, Z) as the accuracy information value d(k) for each state variable.
  • The own vehicle position estimation unit 17 may instead calculate the accuracy information value d(k) for each coordinate axis of the global coordinate system adopted in the map (i.e., latitude, longitude, and altitude, or x-y-z coordinates from a certain reference point).
  • In this case, the own vehicle position estimation unit 17 converts the covariance matrix P(k) into the global coordinate system using the matrix C(k) described above, and derives the accuracy information value d(k) from the diagonal elements after conversion.
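The derivation of d(k) from the covariance diagonal can be sketched as follows; the rotation C here is an illustrative axis swap, standing in for whichever frame conversion (vehicle or global) is applied:

```python
import numpy as np

def accuracy_from_covariance(P, C=None):
    """d(k) per axis: square roots of the covariance diagonal, optionally
    after rotating the covariance into a target frame (P -> C P C^T)."""
    if C is not None:
        P = C @ P @ C.T
    return np.sqrt(np.diag(P))

P = np.diag([0.04, 0.09, 0.25])          # covariance of (x, y, z)
d = accuracy_from_covariance(P)          # one sigma per axis
C = np.array([[0.0, -1.0, 0.0],          # illustrative 90-degree rotation about z
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
d_rot = accuracy_from_covariance(P, C)   # sigma in the rotated frame
```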
  • The voxel data IB used in the point cloud-based position estimation includes data representing, by a normal distribution, the point cloud measured for the stationary structure in each voxel, and is used for scan matching by NDT (Normal Distributions Transform).
  • FIG. 6 shows an example of the own vehicle position estimation unit 17 in the point cloud-based position estimation.
  • It differs from the own vehicle position estimation unit 17 in the landmark-based position estimation shown in FIG. 5 in that, in place of the landmark search/extraction block 25, a point cloud data association block 27 is provided to associate the point cloud data obtained from the lidar 2 with the voxel data acquired from the map DB 10.
  • FIG. 7 shows an example of a schematic data structure of the voxel data IB.
  • The voxel data IB includes information on the parameters used when the point cloud in each voxel is expressed by a normal distribution: as shown in FIG. 7, it includes a voxel ID, voxel coordinates, a mean vector, and a covariance matrix.
  • The "voxel coordinates" indicate the absolute three-dimensional coordinates of a reference position such as the center position of each voxel.
  • Each voxel is a cube obtained by dividing space into a lattice; since the shape and size of voxels are determined in advance, the space of each voxel can be identified by its voxel coordinates.
  • The voxel coordinates may themselves be used as the voxel ID.
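Since voxel shape and size are fixed in advance, addressing can be as simple as integer grid indices derived from coordinates, which can then double as the voxel ID. A sketch under that assumption (the 1 m size is illustrative):

```python
import math

VOXEL_SIZE = 1.0  # [m]; assumed fixed in advance

def voxel_id(x, y, z, size=VOXEL_SIZE):
    """Grid index of the voxel containing point (x, y, z); usable as voxel ID."""
    return (math.floor(x / size), math.floor(y / size), math.floor(z / size))

voxels = {}  # voxel ID -> {"mean": mu_n, "cov": V_n}
vid = voxel_id(2.3, -0.7, 5.1)
voxels[vid] = {"mean": None, "cov": None}  # placeholders for mu_n and V_n
```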
  • The mean vector "μ_n" and covariance matrix "V_n" of voxel n are expressed by equations (5) and (6), respectively, and the mean value "L'_n" is expressed by equation (7).
  • Using the coordinate-transformed point cloud and the mean vector μ_n and covariance matrix V_n included in the voxel data, the in-vehicle device 1 calculates the evaluation function value "E_n" of voxel n expressed by equation (9), and the overall evaluation function value "E(k)" for all the voxels to be matched, expressed by equation (10).
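Equations (9) and (10) are not reproduced in this text; the sketch below therefore uses the standard NDT score, exp(-e^T V^(-1) e / 2) summed over the points in a voxel, as a stand-in for E_n, with E(k) as the sum over matched voxels:

```python
import numpy as np

def voxel_score(points, mean, cov):
    """NDT-style E_n for one voxel: points is an Nx3 array of transformed
    scan points; mean and cov are the voxel's mu_n and V_n."""
    e = points - mean
    cov_inv = np.linalg.inv(cov)
    m = np.einsum("ni,ij,nj->n", e, cov_inv, e)   # Mahalanobis terms
    return float(np.sum(np.exp(-0.5 * m)))

def total_score(per_voxel):
    """E(k): sum of E_n over all matched voxels."""
    return sum(voxel_score(p, mu, V) for p, mu, V in per_voxel)

# a single point exactly at the voxel mean scores exp(0) = 1
score = voxel_score(np.array([[0.0, 0.0, 0.0]]), np.zeros(3), np.eye(3))
```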
  • the in-vehicle device 1 calculates an estimation parameter P that maximizes the overall evaluation function value E (k) by an arbitrary root finding algorithm such as Newton's method.
  • The in-vehicle device 1 then applies the estimation parameter P to the own vehicle position X⁻(k) predicted by the position prediction unit 21 shown in FIG. 6, thereby estimating an accurate own vehicle position X̂(k).
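The exact equations (9) and (10) are not reproduced in this text, so the sketch below uses the common NDT-style form, in which each coordinate-transformed point is scored against the voxel's normal distribution and the per-voxel values En are summed into E(k). The function names and the Gaussian scoring form are assumptions for illustration, not the patent's definitive formulas.

```python
import numpy as np

def voxel_score(points, mu, cov):
    """Evaluation value E_n for one voxel: sum of Gaussian scores of the
    coordinate-transformed points against the voxel's normal distribution
    (common NDT-style form, assumed here; the patent's equation (9) may differ)."""
    inv = np.linalg.inv(cov)
    diffs = points - mu                                  # shape (m, 3)
    # exp(-1/2 * d^T V^-1 d) for each point difference d
    exponents = -0.5 * np.einsum('ij,jk,ik->i', diffs, inv, diffs)
    return float(np.exp(exponents).sum())

def overall_score(voxel_points, voxel_params):
    """Overall evaluation value E(k): sum of E_n over all matched voxels
    (assumed form of equation (10))."""
    return sum(voxel_score(pts, mu, cov)
               for pts, (mu, cov) in zip(voxel_points, voxel_params))
```

The estimation parameter P would then be chosen to maximize `overall_score`, e.g. by Newton iteration as the text states.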
  • The accuracy information value d(k) of the point cloud based position estimation is defined so that it decreases as the position estimation accuracy improves, like the accuracy information value d(k) calculated in the landmark based position estimation.
  • In the GNSS based position estimation, the own vehicle position estimation unit 17 estimates the own vehicle position based on the output of the GPS receiver 5. When executing the GNSS based position estimation, the own vehicle position estimation unit 17 acquires, for example, the DOP (Dilution Of Precision) obtained from the GPS receiver 5 as the accuracy information value d(k). In another example, the own vehicle position estimation unit 17 acquires the standard deviations of the latitude, longitude, and altitude values obtained from the GPS receiver 5 within a predetermined past period as the accuracy information values d(k) for latitude, longitude, and altitude, respectively.
  • The GPS receiver 5 may be a receiver capable of positioning using not only GPS but also GLONASS, Galileo, the Quasi-Zenith Satellite System (QZSS), and the like.
  • At the transmission timing of the upload information Iu, the upload control unit 18 of the in-vehicle device 1 calculates standardized accuracy information based on the accuracy information generated by the own vehicle position estimation unit 17 at that timing, and on the average and standard deviation of the accuracy information values calculated by the own vehicle position estimation unit 17 within the past predetermined time.
  • When the accuracy information calculated by the own vehicle position estimation unit 17 at the transmission timing of the upload information Iu is “d(k)”, and the average and standard deviation of the accuracy information values calculated by the own vehicle position estimation unit 17 within the past predetermined time are “μ(k)” and “σ(k)” respectively, the upload control unit 18 calculates the standardized accuracy information value “S(k)” by the following equation (12).
  • The standardized accuracy information value S(k) is negative when the accuracy information d(k) is smaller than the average μ(k), and positive when the accuracy information d(k) is larger than the average μ(k).
  • The standardized accuracy information value S(k) approaches 0 as the standard deviation σ(k) increases, and its magnitude increases as the standard deviation σ(k) decreases. Therefore, the standardized accuracy information value S(k) is a standardized expression that accounts for the distribution of the accuracy information values d(k). The upload control unit 18 calculates the standardized accuracy information value S(k) for each of latitude, longitude, and altitude.
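Equation (12) is the familiar z-score, S(k) = (d(k) − μ(k)) / σ(k), computed over a sliding window of recent accuracy values. The following minimal sketch illustrates this; the class name and the window length are illustrative choices, not from the patent.

```python
from collections import deque

class AccuracyStandardizer:
    """Turns raw accuracy values d(k) into standardized values S(k) using
    the mean and standard deviation over a sliding window of recent values,
    as in equation (12): S(k) = (d(k) - mu(k)) / sigma(k)."""
    def __init__(self, window=100):
        self.history = deque(maxlen=window)  # accuracy values within the past predetermined time

    def update(self, d):
        self.history.append(d)
        n = len(self.history)
        mu = sum(self.history) / n
        sigma = (sum((x - mu) ** 2 for x in self.history) / n) ** 0.5
        # Negative S(k): better than the recent average; positive: worse.
        return 0.0 if sigma == 0 else (d - mu) / sigma
```

A value below the running average (better accuracy, since smaller d(k) is better) yields a negative S(k), matching the text.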
  • FIG. 8(A) shows the transitions of the accuracy information values dA(k) and dB(k) obtained when in-vehicle devices 1 mounted on different vehicles at a certain point perform own vehicle position estimation using different position estimation methods.
  • The accuracy information value dA(k) indicated by the graph G1 has a distribution with average “μA(k)” and standard deviation “σA(k)”, and the accuracy information value dB(k) indicated by the graph G2 has a distribution with average “μB(k)” and standard deviation “σB(k)”.
  • In this way, the average and standard deviation of the accuracy information differ depending on the position estimation method executed and the accuracy of the sensors used.
  • FIG. 8(B) shows the transition of the standardized accuracy information value SA(k) obtained by standardizing the accuracy information value dA(k) based on equation (12), and the transition of the standardized accuracy information value SB(k) obtained by standardizing the accuracy information value dB(k) based on equation (12).
  • The standardized accuracy information value SA(k) indicated by the graph G3 and the standardized accuracy information value SB(k) indicated by the graph G4 both have distributions with average 0 and standard deviation 1.
  • In this way, by applying equation (12), accuracy information having different distributions can be handled on the same axis.
  • Next, a case is described in which the own vehicle position estimation unit 17 estimates the own vehicle position by weighting a position estimation method A whose accuracy information value is dA(k) (see the graph G1 in FIG. 8(A)) and a position estimation method B whose accuracy information value is dB(k) (see the graph G2 in FIG. 8(A)) with a ratio of α:β (for example, 7:3).
  • In this case, the upload control unit 18 first calculates the standardized accuracy information SA (see the graph G3 in FIG. 8(B)) based on the accuracy information of position estimation method A, and the standardized accuracy information SB (see the graph G4 in FIG. 8(B)) based on the accuracy information of position estimation method B. Then, the upload control unit 18 calculates the standardized accuracy information SAB obtained by weighting the calculated standardized accuracy information SA and SB with the ratio α:β.
  • FIG. 9(A) is a graph showing the transition of the standardized accuracy information SAB calculated by weighting the standardized accuracy information SA and SB shown in FIG. 8(B) with α:β = 7:3. In this case, the standardized accuracy information SAB is calculated by adding the standardized accuracy information SA multiplied by α and the standardized accuracy information SB multiplied by β.
  • Furthermore, the upload control unit 18 standardizes the standardized accuracy information SAB to a distribution with average 0 and standard deviation 1, using the average “μAB” and the standard deviation “σAB” of the standardized accuracy information SAB within the past predetermined time.
  • FIG. 9B shows the transition of the value S (k) obtained by standardizing the standardized accuracy information SAB .
  • Specifically, the upload control unit 18 subtracts the average μAB from the standardized accuracy information SAB, divides the result by the standard deviation σAB, and uses the resulting value as the standardized accuracy information value S(k) included in the upload information Iu.
  • In this way, even when the own vehicle position estimation unit 17 executes a plurality of position estimation methods, the upload control unit 18 can calculate standardized accuracy information whose average and standard deviation are standardized and suitably include it in the upload information Iu.
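The two-method combination described above can be sketched in two steps: a weighted linear combination SAB = α·SA + β·SB (the linear form is an assumption, since the combining equation is not reproduced in this text), followed by re-standardization of SAB using its own mean and standard deviation as described for FIG. 9(B).

```python
def combine_standardized(s_a, s_b, alpha=0.7, beta=0.3):
    """Weighted combination S_AB = alpha*S_A + beta*S_B over two series of
    standardized accuracy values (assumed linear form)."""
    return [alpha * a + beta * b for a, b in zip(s_a, s_b)]

def restandardize(series):
    """Re-standardize S_AB to a mean-0 / std-1 distribution using its own
    mean and standard deviation (mu_AB, sigma_AB)."""
    n = len(series)
    mu = sum(series) / n
    sigma = (sum((x - mu) ** 2 for x in series) / n) ** 0.5
    return [(x - mu) / sigma for x in series] if sigma else [0.0] * n
```

The re-standardized output is what would be carried in the upload information Iu as S(k).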
  • FIG. 10 is a diagram showing an outline of the data structure of the upload information Iu transmitted by the in-vehicle device 1.
  • the upload information Iu includes header information, travel route information, event information, and media information.
  • the header information includes items of “version”, “transmission source”, and “vehicle metadata”.
  • The in-vehicle device 1 specifies, in “Version”, information on the version of the data structure used for the upload information Iu, and specifies, in “Sender”, information on the name of the transmission source (for example, the OEM name or system vendor name of the vehicle that transmits the upload information Iu). Further, the in-vehicle device 1 specifies vehicle attribute information (for example, vehicle type, vehicle ID, vehicle width, vehicle height, etc.) in “vehicle metadata”.
  • The travel route information includes an item “position estimation”. For this “position estimation”, the in-vehicle device 1 specifies time stamp information indicating the position estimation time, latitude, longitude, and altitude information indicating the estimated own vehicle position, and information regarding the estimation accuracy.
  • Event information includes an item of “object recognition event”.
  • When the in-vehicle device 1 detects an object recognition event, it specifies information on the detection result in the “object recognition event”.
  • the “object recognition event” includes elements of “time stamp”, “object ID”, “offset position”, “object type”, “object size”, “object size accuracy”, and “media ID”.
  • the media information is a data type used when transmitting raw data that is output data (detection information) of an external sensor such as the lidar 2.
  • the in-vehicle device 1 transmits the upload information Iu including at least the position information of the detected object and the standardization accuracy information to the server device 6.
  • the in-vehicle device 1 specifies the standardization accuracy information in the item “position estimation”, and also specifies the position information of the detected object in the “object recognition event” of the event information.
  • As the position information of the object specified in the “object recognition event”, the in-vehicle device 1 specifies the relative position from the own vehicle position in the sub-item “offset position” of the “object recognition event”.
  • the absolute position of the vehicle is specified by the item “position estimation”.
  • Note that, instead of including information on the relative position of the object in the upload information Iu, the in-vehicle device 1 may include in the upload information Iu information on the absolute position of the object generated based on the estimated own vehicle position and the measured relative position of the object.
  • The upload information Iu is an example of “object measurement information”, and the information on the absolute position of the vehicle specified by the item “position estimation” or the like is an example of “estimated position information”.
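The data structure described above (header, travel route, event, and media information) can be sketched as a nested record. The field names follow the items named in the text; all concrete values, and the exact nesting, are illustrative assumptions rather than the patent's normative format.

```python
# Illustrative sketch of one piece of upload information Iu (values are made up).
upload_info_iu = {
    "header": {
        "version": "1.0",                       # version of the data structure
        "sender": "ExampleOEM",                 # OEM name or system vendor name
        "vehicle_metadata": {"type": "sedan", "id": "V123",
                             "width_m": 1.8, "height_m": 1.5},
    },
    "travel_route": {
        "position_estimation": {
            "timestamp": "2019-03-22T10:00:00Z",
            "latitude": 35.0, "longitude": 135.0, "altitude": 30.0,
            # standardized accuracy information S(k), per axis
            "standardized_accuracy": {"lat": -0.4, "lon": 0.1, "alt": 0.8},
        }
    },
    "event": {
        "object_recognition": {
            "timestamp": "2019-03-22T10:00:01Z",
            "object_id": 42,
            "offset_position": {"dx": 5.2, "dy": -1.1, "dz": 0.0},  # relative to the vehicle
            "object_type": "obstacle",
            "object_size": {"w_m": 0.5, "h_m": 1.0, "d_m": 0.5},
            "media_id": None,
        }
    },
    "media": None,  # raw external-sensor output (e.g. lidar), omitted here
}
```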
  • the server device 6 calculates the position information of the object detected by each in-vehicle device 1 by statistical processing based on the upload information Iu received from each in-vehicle device 1. At this time, the server device 6 weights the position information of the corresponding object on the basis of the weighting value “w (k)” calculated based on the standardization accuracy information S (k), so that the object to be registered in the distribution map DB 20 The position information of is calculated.
  • The server device 6 sets the weighting value w(k) to be larger as the standardized accuracy information S(k) is smaller, since a smaller value indicates better position estimation accuracy at that time. Conversely, the server device 6 sets the weighting value w(k) to be smaller as the standardized accuracy information S(k) is larger, since a larger value indicates lower position estimation accuracy at that time.
  • For example, by calculating the weighting value w(k) with reference to any one of the following equations (13) to (15), the server device 6 can generate the weighting value w(k) so that it approaches 0 as the standardized accuracy information S(k) increases, and approaches 2 as the standardized accuracy information S(k) increases in the negative direction.
  • FIG. 11A shows the correspondence between the standardized accuracy information S (k) based on the equations (13) to (15) and the weighting value w (k).
  • The graph G5 shows the correspondence between the standardized accuracy information S(k) based on equation (13) and the weighting value w(k), the graph G6 shows the correspondence based on equation (14), and the graph G7 shows the correspondence based on equation (15).
  • The server device 6 may also use the following equations (16) to (18), in which a coefficient c is introduced into equations (13) to (15), to suitably adjust the conversion ratio from the standardized accuracy information S(k) to the weighting value w(k).
  • FIG. 11B shows the correspondence between the standardized accuracy information S (k) based on the equations (16) to (18) and the weighting value w (k) when the coefficient c is set to “0.5”.
  • The graph G8 shows the correspondence between the standardized accuracy information S(k) based on equation (16) and the weighting value w(k), the graph G9 shows the correspondence based on equation (17), and the graph G10 shows the correspondence based on equation (18).
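Equations (13) to (18) themselves are not reproduced in this text, so the sketch below uses an assumed logistic form that matches the described behavior: w(k) approaches 0 as S(k) grows, approaches 2 as S(k) grows in the negative direction, equals 1 at S(k) = 0, and the coefficient c adjusts the conversion ratio (steepness).

```python
import math

def weighting_value(s, c=1.0):
    """Weight w(k) from standardized accuracy information S(k).
    Logistic form, assumed for illustration: in (0, 2), monotonically
    decreasing in S, with w(0) = 1; c scales the conversion ratio."""
    return 2.0 / (1.0 + math.exp(c * s))
```

Any of the three curve shapes in FIG. 11 could be approximated by varying `c`, e.g. `weighting_value(s, c=0.5)` for a gentler slope.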
  • The upload control unit 18 calculates the weighting value w(k) based on the standardized accuracy information for each of latitude, longitude, and altitude, or for each xyz coordinate value from a certain reference point.
  • When the accuracy information value d(k) is not obtained for each direction (that is, when only one accuracy information value d(k) is calculated), as in the point cloud based position estimation, the upload control unit 18 regards the accuracy information values d(k) for latitude, longitude, and altitude as the same value and calculates the weighting value w(k) for each of latitude, longitude, and altitude.
  • When the absolute position of the object indicated by each piece of upload information Iu (also referred to as the “object measurement position”) is [px(k), py(k), pz(k)]T and the weighting value w(k) is determined from the standardized accuracy information of that upload information Iu, the server device 6 can suitably calculate the estimated object position to be reflected in the distribution map DB 20 by weighted averaging processing.
  • In addition to the estimated object position, the server device 6 further calculates the number of object measurement positions used to calculate the estimated object position (also referred to as the “number of samples”) and an index value indicating the variation of the object measurement positions used to calculate the estimated object position (also referred to as the “variation index value”).
  • The variation index value is, for example, a variance or a standard deviation.
  • The server device 6 registers, in the distribution map DB 20, the estimated object position associated with the number of samples and the variation index value.
  • Generally, the greater the number of samples, the higher the reliability of the calculated estimated object position.
  • By registering the information on the number of samples and the variation index value together with the estimated position information of the object in the distribution map DB 20, the server device 6 can suitably add information serving as a reliability index of the registered object position information to the distribution map DB 20.
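The aggregation described above can be sketched as a weighted average over the object measurement positions, returning the estimated position along with the number of samples and a variation index. The function name is illustrative, and using a weighted per-axis standard deviation as the variation index is one of the choices the text allows (variance or standard deviation).

```python
def aggregate_object_position(measurements):
    """Weighted average of object measurement positions.
    measurements: list of (position, weight) with position = (px, py, pz)
    and weight = w(k) derived from the standardized accuracy information.
    Returns (estimated position, number of samples, per-axis std deviation)."""
    total_w = sum(w for _, w in measurements)
    est = tuple(sum(p[i] * w for p, w in measurements) / total_w
                for i in range(3))
    n = len(measurements)  # number of samples
    var = tuple(sum(w * (p[i] - est[i]) ** 2 for p, w in measurements) / total_w
                for i in range(3))
    return est, n, tuple(v ** 0.5 for v in var)
```

The returned triple corresponds to the estimated object position, number of samples, and variation index value registered in the distribution map DB 20.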
  • FIG. 12 is an example of a flowchart showing an outline of processing related to transmission / reception of upload information Iu and download information Id.
  • the vehicle-mounted device 1 estimates its own vehicle position, calculates accuracy information, and stores it in the storage unit 12 (step S101).
  • Next, the in-vehicle device 1 determines whether or not a predetermined object has been detected based on the output of an external sensor such as the lidar 2 (step S102). That is, the in-vehicle device 1 determines whether an object whose position information should be notified by the upload information Iu has been detected. When the in-vehicle device 1 detects the predetermined object (step S102; Yes), it calculates the standardized accuracy information based on the accuracy information generated at the current time and the average and standard deviation of the accuracy information values within the past predetermined time stored in step S101 (step S103).
  • Then, the in-vehicle device 1 transmits to the server device 6 the upload information Iu including the standardized accuracy information calculated in step S103 and information regarding the detected object, such as its position information and identification information (step S104). On the other hand, when the predetermined object is not detected (step S102; No), the in-vehicle device 1 returns the process to step S101.
  • the server device 6 that has received the upload information Iu from the in-vehicle device 1 stores the upload information Iu in the upload information DB 27 (step S201).
  • the server device 6 receives, for the same object, upload information Iu at a plurality of times from the in-vehicle devices 1 of the plurality of vehicles.
  • Next, the server device 6 determines whether or not it is the update timing of the distribution map DB 20 (step S202). When it is not the update timing of the distribution map DB 20 (step S202; No), the server device 6 continues to receive the upload information Iu from the in-vehicle devices 1 and performs the process of step S201.
  • When the server device 6 determines that it is the update timing of the distribution map DB 20 (step S202; Yes), it refers to the upload information DB 27 and calculates, for each object detected by the in-vehicle devices 1, the estimated object position by weighted averaging processing based on the standardized accuracy information. In this case, the server device 6 determines the weighting value w(k) from the standardized accuracy information S(k) with reference to, for example, any one of equations (13) to (18).
  • Next, the server device 6 calculates, for each object to be updated, the number of samples of the object measurement positions used for calculating the estimated object position and the variation index value indicating the variation of the object measurement positions, and registers them in the distribution map DB 20 together with the estimated object position (step S204). Then, the server device 6 transmits download information Id including, as map update information, the combination of the estimated object position, the number of samples, and the variation index value for each object to each in-vehicle device 1 (step S205).
  • When the in-vehicle device 1 receives the download information Id (step S105; Yes), it updates the map DB 10 using the download information Id (step S106). For example, the in-vehicle device 1 updates the object information IO based on the download information Id. As a result, the number of samples and the variation index value are associated with the position information of each object included in the object information IO. Thereafter, the in-vehicle device 1 may, for example, select the landmarks to be used for the landmark based position estimation with reference to the number of samples and the variation index values included in the object information IO.
  • In this case, the in-vehicle device 1 regards an object whose number of samples is equal to or greater than a predetermined value and whose variation indicated by the variation index value is equal to or less than a predetermined degree as being registered with highly reliable position information, and selects it as a landmark.
  • When the download information Id is not received from the server device 6 (step S105; No), the in-vehicle device 1 returns the process to step S101.
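The landmark selection criterion described above (enough samples, low enough variation) reduces to a simple filter. The threshold values below are illustrative assumptions; the patent only says "a predetermined value" and "a predetermined degree".

```python
def is_reliable_landmark(obj, min_samples=10, max_variation=0.3):
    """Select objects registered with highly reliable position information:
    number of samples at or above a threshold AND variation index value at
    or below a threshold (threshold values are illustrative)."""
    return obj["samples"] >= min_samples and obj["variation"] <= max_variation
```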
  • FIGS. 13A to 13C are diagrams showing the positions of the vehicles A to D and the vehicle position estimation accuracy when the obstacle 30 is detected by the lidar 2 at different times.
  • the point 31a indicates the estimated position coordinates of the vehicle
  • the ellipse 32a indicates the vehicle position estimation accuracy of the vehicle.
  • the size of the ellipse 32a is expressed as being smaller as the vehicle position estimation accuracy is higher, and larger as the vehicle position estimation accuracy is lower.
  • The length of the ellipse 32a in each direction indicates the own vehicle position estimation accuracy in that direction; the better the estimation accuracy, the shorter the length.
  • The in-vehicle device 1 of the vehicle A detects the obstacle 30 at the position shown in FIG. 13(A) with the lidar 2, calculates the coordinates of the obstacle 30 based on the estimated position coordinates of its own vehicle, and transmits the upload information Iu indicating the detection result of the obstacle 30 to the server device 6.
  • Similarly, the in-vehicle device 1 of the vehicle B detects the obstacle 30 at the position shown in FIG. 13(B) at a time different from that in FIG. 13(A), and transmits the upload information Iu indicating the detection result to the server device 6.
  • the plurality of vehicles A to D transmit the upload information Iu indicating the detection results to the server device 6 for the same object (here, the obstacle 30).
  • the vehicles A to D have different own vehicle position estimation accuracy (see the ellipse 32a).
  • In this way, pieces of upload information Iu for the same object are transmitted to the server device 6 at a plurality of times from the in-vehicle devices 1 of a plurality of vehicles having different own vehicle position estimation accuracies.
  • In the present embodiment, for each object to be updated, the server device 6 calculates the estimated object position by weighted averaging processing based on the standardized accuracy information included in the upload information Iu received from the in-vehicle devices 1 of the plurality of vehicles. Accordingly, regardless of the position estimation method, the weight of an object measurement position based on the upload information Iu from the in-vehicle device 1 of a vehicle with high own vehicle position estimation accuracy is increased, and the weight of an object measurement position based on the upload information Iu from the in-vehicle device 1 of a vehicle with low own vehicle position estimation accuracy is decreased. Thereby, the server device 6 can accurately determine the estimated object position to be registered in the distribution map DB 20.
  • The server device 6 may further set a weighting value (also referred to as a “vehicle weighting value”) for each in-vehicle device 1 that is the transmission source of the upload information Iu (that is, for each vehicle in which the in-vehicle device 1 is mounted), and calculate the estimated object position [p̂x, p̂y, p̂z]T of the object based on the vehicle weighting values.
  • This vehicle weighting value is set by a method to be described later so as to be a value according to the accuracy and performance of the sensor used for detecting the object.
  • In this case, the server device 6 calculates the estimated object position [p̂x, p̂y, p̂z]T based on the following equations (22) to (24).
  • The server device 6 stores information in which a vehicle weighting value is associated with each vehicle ID (also referred to as “vehicle weighting information IV”), and calculates the estimated object position of each object based on equations (22) to (24) with reference to the vehicle weighting information IV.
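Equations (22) to (24) are not reproduced in this text; a plausible form, assumed here for illustration, weights each object measurement position by the product of the transmitting vehicle's weighting value and the accuracy-based weight w(k) before normalizing.

```python
def estimate_with_vehicle_weights(measurements):
    """Estimated object position [p^x, p^y, p^z] when each measurement also
    carries the transmitting vehicle's weighting value.
    measurements: list of (position, w_accuracy, v_vehicle).
    Assumed form: weight each measurement by v * w, then normalize."""
    total = sum(v * w for _, w, v in measurements)
    return tuple(sum(p[i] * v * w for p, w, v in measurements) / total
                 for i in range(3))
```

A vehicle whose weighting value has decayed to 0 then contributes nothing to the estimate, matching the behavior described below for unreliable senders.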
  • Further, the server device 6 updates the vehicle weighting values recorded in the vehicle weighting information IV during the map update process.
  • For example, the vehicle weighting value takes 11 steps from 0 to 1 in increments of 0.1, and when the deviation amount between the calculated estimated object position and the object measurement position indicated by a certain piece of upload information Iu (that is, the distance between the indicated positions) is equal to or greater than a predetermined threshold, the vehicle weighting value of the transmission source vehicle of that upload information Iu is lowered by 0.1 from the value recorded in the vehicle weighting information IV.
  • Conversely, if the deviation amount between the calculated estimated object position and the object measurement position indicated by a certain piece of upload information Iu (that is, the distance between the indicated positions) is less than the predetermined threshold, the server device 6 raises the vehicle weighting value of the transmission source vehicle of that upload information Iu by 0.1 from the value recorded in the vehicle weighting information IV. Therefore, for a vehicle whose object measurement positions often deviate greatly from the calculated estimated object positions, the vehicle weighting value approaches 0 and the vehicle contributes little to the calculation of the estimated object position. On the other hand, for a vehicle whose object measurement positions often deviate little from the calculated estimated object positions, the vehicle weighting value approaches 1 and the vehicle contributes greatly to the calculation of the estimated object position.
  • When the server device 6 receives the upload information Iu from a vehicle for which no vehicle weighting value is recorded in the vehicle weighting information IV, it sets the vehicle weighting value of that vehicle to a predetermined initial value (for example, 0.5), and updates the initial value based on the deviation amount between the estimated object position and the object measurement position indicated by the upload information Iu.
  • By calculating the estimated object position by averaging processing using the vehicle weighting values, it is possible to reduce the influence of measurement values having large errors and to increase the estimation accuracy of the object position.
  • In particular, the weights of object measurement positions measured by a vehicle whose external sensor such as the lidar 2 is defective, so that its object measurement positions constantly deviate, or by a vehicle having an external sensor with low measurement accuracy, are reduced. Therefore, according to this modification, even when the upload information Iu is received from a vehicle with a defective external sensor, a vehicle whose external sensor has low measurement accuracy, or the like, the calculation of the estimated object position is less affected.
  • Conversely, when the object measurement positions of such a vehicle subsequently deviate less, its vehicle weighting value gradually increases and is again reflected in the calculation of the estimated object position.
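The vehicle weighting value update described above (0.1 decrements on large deviation, 0.1 increments otherwise, held in [0, 1]) can be sketched as follows; the function name and the clamping via rounding are illustrative choices.

```python
def update_vehicle_weight(weight, deviation, threshold, step=0.1):
    """Update a vehicle weighting value held in 0.1 steps on [0, 1]:
    decrease by 0.1 when the deviation between the estimated object position
    and this vehicle's object measurement position is at or above the
    threshold, increase by 0.1 when it is below, clamped to [0, 1]."""
    weight += -step if deviation >= threshold else step
    return min(1.0, max(0.0, round(weight, 1)))
```

Repeatedly deviating vehicles thus decay toward weight 0 (no contribution), while consistently agreeing vehicles converge toward weight 1.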
  • FIG. 14 is a flowchart showing a procedure related to map update processing of the server device 6 according to the present modification.
  • the server device 6 that has received the upload information Iu from the in-vehicle device 1 stores the upload information Iu in the upload information DB 27 (step S211).
  • When the server device 6 determines that it is the update timing of the distribution map DB 20 (step S212; Yes), the server device 6 refers to the upload information DB 27 and the vehicle weighting information IV and, for each object detected by the in-vehicle devices 1, calculates the estimated object position [p̂x, p̂y, p̂z]T from equations (22) to (24) using the vehicle weighting values and the weighting values based on the standardized accuracy information (step S213).
  • Next, the server device 6 calculates, for each object, the number of samples indicating the number of object measurement positions used to calculate the estimated object position and the variation index value indicating the variation of the object measurement positions, and registers them in the distribution map DB 20 together with the estimated object position (step S214).
  • Further, the server device 6 updates the vehicle weighting values recorded in the vehicle weighting information IV based on the deviation amounts between the estimated object position of each object calculated in step S213 and the respective object measurement positions used for calculating that estimated object position (step S215).
  • the server device 6 transmits download information Id including a combination of the estimated object position, the number of samples, and the variation index value for each object to each vehicle-mounted device 1 (step S216).
  • The server device 6 may calculate the estimated object position by weighted averaging processing using only the vehicle weighting values, without using the weighting values based on the standardized accuracy information. In this case, the server device 6 calculates the estimated object position [p̂x, p̂y, p̂z]T based on the following equations (25) to (27).
  • Also in this case, the server device 6 can suitably estimate the position information of the objects to be registered in the distribution map DB 20 based on the reliability of the object detection results of each vehicle.
  • the in-vehicle device 1 may not include the standardization accuracy information in the upload information Iu.
  • the server device 6 may execute the standardization accuracy information calculation process instead of the in-vehicle device 1.
  • In this case, when the in-vehicle device 1 detects an object in step S102 of FIG. 12, it transmits to the server device 6 the upload information Iu including the accuracy information of the current time and the average and standard deviation of the accuracy information within the past predetermined time.
  • The server device 6 then calculates the standardized accuracy information for each piece of received upload information Iu, either after receiving the upload information Iu in step S201 or after determining the update timing of the distribution map DB 20 in step S202.
  • the configuration of the map update system shown in FIG. 1 is an example, and the configuration of the map update system to which the present invention is applicable is not limited to the configuration shown in FIG.
  • For example, instead of the vehicle having the in-vehicle device 1, an electronic control device of the vehicle may execute the processes of the own vehicle position estimation unit 17, the upload control unit 18, and the automatic driving control unit 19 of the in-vehicle device 1. In this case, the map DB 10 is stored in, for example, a storage unit in the vehicle, and the electronic control device of the vehicle may exchange the upload information Iu and the download information Id with the server device 6 via a communication unit (not shown) of the vehicle.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Navigation (AREA)

Abstract

An in-vehicle device 1 estimates the own vehicle position, and calculates accuracy information and stores it in a storage unit 12 (step S101). Next, the in-vehicle device 1 determines whether or not a prescribed object has been detected based on the output of an external sensor such as a lidar 2 (step S102). When the prescribed object is detected (step S102: Yes), the in-vehicle device 1 calculates standardized accuracy information based on the accuracy information generated at the current time and the average and standard deviation of the accuracy information values within a prescribed past time stored in step S101 (step S103). Then, the in-vehicle device 1 transmits, to a server device 6, upload information Iu that includes the standardized accuracy information calculated in step S103 and information regarding the object, such as position information and identification information of the object (step S104).
PCT/JP2019/012176 2018-03-28 2019-03-22 Information transmission device, data structure, control method, program, and storage medium Ceased WO2019188820A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018063279 2018-03-28
JP2018-063279 2018-03-28

Publications (1)

Publication Number Publication Date
WO2019188820A1 true WO2019188820A1 (fr) 2019-10-03

Family

ID=68061873

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/012176 Ceased WO2019188820A1 (fr) 2018-03-28 2019-03-22 Information transmission device, data structure, control method, program, and storage medium

Country Status (1)

Country Link
WO (1) WO2019188820A1 (fr)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002036990A (ja) * 2000-07-26 2002-02-06 Honda Motor Co Ltd Vehicle lane departure warning device
JP2004198997A (ja) * 2002-12-20 2004-07-15 Denso Corp Map evaluation system, matching device, and map evaluation device
JP2016156973A (ja) * 2015-02-25 2016-09-01 Pioneer Corp Map data storage device, control method, program, and storage medium
JP2016180980A (ja) * 2015-03-23 2016-10-13 Toyota Central R&D Labs Inc Information processing device, program, and map data update system


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021149567A (ja) * 2020-03-19 2021-09-27 株式会社Soken Moving object detection system
JP7328170B2 (ja) 2020-03-19 2023-08-16 株式会社Soken Moving object detection system
CN116034420A (zh) 2020-09-01 2023-04-28 Mitsubishi Electric Corp Guidance device, program, and guidance method
CN114526744A (zh) 2020-11-05 2022-05-24 Toyota Motor Corp Map update device and map update method
CN114526744B (zh) 2020-11-05 2024-03-22 Toyota Motor Corp Map update device and map update method

Similar Documents

Publication Publication Date Title
JP2022113746A (ja) Determination device
CN106352867B (zh) Method and device for determining a vehicle's own position
JP7155284B2 (ja) Measurement accuracy calculation device, self-position estimation device, control method, program, and storage medium
JP6806891B2 (ja) Information processing device, control method, program, and storage medium
JP2023164553A (ja) Position estimation device, estimation device, control method, program, and storage medium
JP6980010B2 (ja) Self-position estimation device, control method, program, and storage medium
JP2025015843A (ja) Information processing device, control method, program, and storage medium
JP2024161585A (ja) Self-position estimation device
JP2024161130A (ja) Self-position estimation device, control method, program, and storage medium
JP2025083454A (ja) Information processing device, map data generation device, method, and program
JP2019174675A (ja) Data structure, map data generation device, control method, program, and storage medium
JP2023076673A (ja) Information processing device, control method, program, and storage medium
JP2020046411A (ja) Data structure, storage device, terminal device, server device, control method, program, and storage medium
JP2019174191A (ja) Data structure, information transmission device, control method, program, and storage medium
WO2019188820A1 (fr) Information transmission device, data structure, control method, program, and storage medium
WO2019188886A1 (fr) Terminal device, information processing method, and storage medium
WO2019188874A1 (fr) Data structure, information processing device, and map data generation device
Verentsov et al. Bayesian framework for vehicle localization using crowdsourced data
WO2019188877A1 (fr) Information transmission device, data structure, control method, program, and storage medium
CN120703803A (zh) Cooperative positioning method based on a spatio-temporal graph model
CN120322765A (zh) Method and device for determining road curvature

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 19775550; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 19775550; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: JP