WO2012086029A1 - Système à mouvement autonome (Autonomous Mobile System) - Google Patents
- Publication number: WO2012086029A1
- Application number: PCT/JP2010/073131
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- prediction error
- unit
- autonomous mobile
- self
- shape model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
Definitions
- the present invention relates to an autonomous mobile system that moves an autonomous mobile body to a destination while estimating a self-position based on data of a measuring device.
- Autonomous mobile systems in the prior art are provided with a map storing the shapes of real-environment objects that can serve as landmarks (marks) for estimating the self-position.
- Self-position estimation is realized by matching (aligning) this map against the shapes of surrounding objects in the actual environment, measured with a measuring device such as a laser scanner.
- In practice, however, the autonomous mobile system may deviate from its planned route.
- As a result, it may collide with obstacles (static obstacles such as buildings, roadside trees, and power poles, as well as dynamic obstacles such as people and other moving vehicles), or may turn at an intersection onto the wrong path.
- In the worst case, the autonomous mobile system completely loses its own position, making it difficult to continue the subsequent movement.
- An object of the present invention is to provide an autonomous mobile system that plans a route along which it can move appropriately without losing sight of its own position, and thereby reaches the destination.
- The above object is achieved, in an autonomous mobile system in which an autonomous mobile body travels to a destination while estimating its own position by matching measurement data from a peripheral object shape measurement unit against a map, by providing: a traveling environment shape model input unit for inputting a shape model of the traveling environment; a self-position error prediction unit that calculates a prediction error of self-position estimation based on information from the traveling environment shape model input unit; a prediction error map generation storage unit that generates and stores a prediction error map associating the prediction error with the area in which the autonomous mobile body can travel; and a route planning unit that refers to the prediction error map and plans a route based on the prediction error along the route the autonomous mobile body travels.
- Preferably, the traveling environment shape model input unit, the self-position error prediction unit, and the prediction error map generation storage unit are installed in a management system, the route planning unit is installed in a terminal system, and the management system and the terminal system are connected by a wireless network.
- Preferably, the prediction error map generated and stored by the prediction error map generation storage unit divides the travelable area into a plurality of small areas and holds the prediction error corresponding to each small area.
- Preferably, the map also holds a prediction error for each of a plurality of directions within each small area.
- Preferably, the prediction error map generation storage unit extracts shapes above a predetermined height from the shape model input to the traveling environment shape model input unit, and the self-position error prediction unit calculates the prediction error using the extracted shapes.
- According to the present invention, there is provided an autonomous mobile system that reaches its destination by planning a route along which it can move appropriately without losing sight of its own position.
- A plan view of road structures such as buildings, roadside trees, and utility poles, and a travelable area (road), according to one example.
- A plan view of coordinate-transformed landmark data and the environmental shape model, and a top view of the measurement range from which the landmark data was extracted at a certain node, according to the present embodiment.
- FIG. 1 is a block diagram of an autonomous mobile system according to an embodiment of the present invention.
- An autonomous mobile system 1 is mounted on a vehicle v, such as an autonomous mobile robot or an autonomous vehicle, and provides an autonomous movement function.
- The autonomous mobile system 1 is composed of a traveling environment shape model input unit 2, a prediction error map generation storage unit 3, a self-position error prediction unit 4, a landmark map storage unit 5, a peripheral object shape measurement unit 6, a self-position estimation unit 7, a destination input unit 8, a route planning unit 9, and a traveling unit 10.
- The traveling environment shape model input unit 2 is the part that inputs a shape model of the traveling environment, i.e., road structures in a city such as buildings, roadside trees, and power poles.
- The environmental shape model m to be input may be one used in recent car navigation systems, or CityGML (http://www.citygml.org/), which is standardized worldwide by the Open Geospatial Consortium (OGC).
- These environmental shape models m carry information such as the shape and texture of road structures represented by polygons, as well as attribute information such as sidewalk/roadway designations.
- the prediction error map generation storage unit 3 is a part that generates and stores a prediction error map in the travelable region using the environment shape model m input from the travel environment shape model input unit 2.
- processing is performed in cooperation with the self-position error prediction unit 4 (specific processing contents will be described later).
- the self-position error prediction unit 4 is a part that predicts in advance a self-position estimation error that occurs at each position in the travelable area on the map by using a geometric calculation method. Data necessary for the calculation is acquired from the prediction error map generation storage unit 3, and the calculated prediction error of the self-position estimation is returned to the prediction error map generation storage unit 3 (specific processing contents will be described later).
- the landmark map storage unit 5 is a part that extracts a portion that can be used as a landmark (mark) from the environmental shape model m input from the traveling environment shape model input unit 2 and stores it as a landmark map. If a landmark map can be acquired separately in advance, the separately acquired landmark map may be stored without extracting the landmark from the environmental shape model m.
- The peripheral object shape measurement unit 6 is the part that measures the shapes of objects existing around the vehicle v (a building, a roadside tree, a telephone pole, a person, or another vehicle).
- For example, a laser scanner, a stereo camera, a Time-of-Flight (ToF) distance image camera, or the like can be used.
- The self-position estimation unit 7 detects portions that can serve as landmarks (marks) from the shapes of surrounding objects measured by the peripheral object shape measurement unit 6 at the current position, and estimates the current self-position on the map by matching (aligning) them against the landmark map stored in the landmark map storage unit 5.
- the method described in the book “Probabilistic Robotics” (author: Sebastian Thrun, Wolfram Burgard, and Dieter Fox, publisher: The MIT Press, publication year: 2005) can be used.
- the destination input unit 8 is a part where the user of the vehicle v inputs a destination to the autonomous mobile system 1. For example, as in the car navigation system, the user inputs a destination by operating a touch panel displaying a map.
- The route planning unit 9 is the part that plans a route from the current self-position estimated by the self-position estimation unit 7 to the destination input from the destination input unit 8, using the prediction error of self-position estimation as a cost.
- the prediction error map stored in the prediction error map generation storage unit 3 is used (specific processing contents will be described later).
- The traveling unit 10 is the part that drives the wheels of the vehicle v on which the autonomous mobile system 1 is mounted. It is controlled so that the vehicle v travels along the route planned by the route planning unit 9. Movement by crawlers or legs may be used instead of wheels.
- FIG. 2 is a block diagram of another form of the autonomous mobile system according to one embodiment.
- In this form, the traveling environment shape model input unit 2, the prediction error map generation storage unit 3, the self-position error prediction unit 4, and the landmark map storage unit 5 of the autonomous mobile system 1 are installed not on the vehicle v but in a management system 1a.
- A terminal system 1b is installed in the vehicle v, and the autonomous mobile system 1 is configured by combining the management system 1a and the terminal system 1b.
- Each component of the autonomous mobile system 1 is the same as in the embodiment of FIG. 1, except that communication between the prediction error map generation storage unit 3 and the route planning unit 9, and between the landmark map storage unit 5 and the self-position estimation unit 7, is performed via a wireless network.
- FIG. 3 is a plan view in which a travelable area is extracted from an environmental shape model according to an embodiment and divided by a grid.
- a travelable region (road) is extracted from the environment shape model m input from the travel environment shape model input unit 2.
- the top view of FIG. 3 shows a plan view of the result of extracting the travelable area from the environmental shape model m.
- The environmental shape model m has not only shape and texture but also attribute information such as sidewalk/roadway designations. For example, when targeting an autonomous mobile robot that travels on sidewalks, the travelable area can be obtained by extracting the portions attributed as sidewalks.
- The travelable area is divided by a grid 11 or the like so that it can be represented discretely.
- The lower part of FIG. 3 shows an example in which the area is divided by a grid 11 of uniform size; in practice, for example, the travelable area is divided into 1 m square cells.
- The division of the travelable area is not limited to a uniform grid 11; the methods described in the book "Introduction to Intelligent Robots - Solving Motion Planning Problems" (authors: Jun Ota, Daisuke Kurabayashi, Tamio Arai; publisher: Corona Publishing; publication year: 2001) may also be used.
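The uniform grid division described above can be sketched as follows. This is an illustrative simplification, not code from the patent; the function name and the rectangular-area assumption are our own.

```python
# Sketch (our own, not from the patent): discretize a rectangular travelable
# area into uniform square cells, as in the lower part of FIG. 3, and return
# the (x, y) centers of the cells.

def make_grid(width_m, height_m, cell_size=1.0):
    """Return the (x, y) centers of the cells covering a width_m x height_m area."""
    nx = int(width_m / cell_size)
    ny = int(height_m / cell_size)
    centers = []
    for ix in range(nx):
        for iy in range(ny):
            centers.append((ix * cell_size + cell_size / 2.0,
                            iy * cell_size + cell_size / 2.0))
    return centers

cells = make_grid(4.0, 3.0)  # a 4 m x 3 m area -> 12 cells of 1 m square
```

A real implementation would instead rasterize the sidewalk polygons extracted from the environmental shape model m, but the cell-center data structure is the same.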
- FIG. 4 is a plan view showing a result of constructing a graph structure with nodes and links by dividing a travelable area according to an embodiment by a grid.
- a node 12 is generated at the center of each divided grid 11.
- At each grid cell, the azimuth is also discretized: one full revolution of 360 deg is divided into sectors, and a node 12 is generated for each sector.
- For example, 360 deg is divided into eight sectors of 45 deg each, generating eight nodes 12 per cell. Each node 12 is therefore a node in the three-dimensional space (x, y, θ).
- the measurement range 14 of the peripheral object shape measurement unit 6 cannot cover the entire circumference, and the measurable region changes depending on the azimuth ⁇ .
- Some peripheral object shape measurement units 6, such as an omnidirectional stereo camera, can cover the entire circumference, but many, such as a laser scanner, cannot; it is therefore effective for each node 12 to carry information on the azimuth θ.
- a graph structure is constructed by connecting nodes corresponding to 8 neighbors as viewed in the two-dimensional plane of (x, y) with links 13.
- This graph structure is the basic data structure used by the route planning unit 9.
- There is also a form of the route planning unit 9 that does not use the links 13 (details are given in the description of the route planning unit 9).
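The node and link construction of FIG. 4 can be sketched as follows. This is a hypothetical rendering under our own assumptions: cells are identified by integer grid indices, and links connect 8-neighbouring cells in the (x, y) plane.

```python
# Hypothetical sketch of the graph structure of FIG. 4: each cell gets eight
# nodes (azimuth theta in 45 deg steps), and cells are linked to their
# 8-neighbours as seen in the two-dimensional (x, y) plane.

def build_graph(cells):
    """cells: set of (ix, iy) grid indices. Returns (nodes, links)."""
    # Eight (x, y, theta) nodes per cell, one per 45 deg azimuth sector
    nodes = [(ix, iy, theta) for (ix, iy) in cells for theta in range(0, 360, 45)]
    links = []
    for (ix, iy) in cells:
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                if (dx, dy) != (0, 0) and (ix + dx, iy + dy) in cells:
                    links.append(((ix, iy), (ix + dx, iy + dy)))
    return nodes, links

# 2 x 2 block of cells: 4 cells x 8 azimuths = 32 nodes,
# each cell has 3 neighbours -> 12 directed links
nodes, links = build_graph({(0, 0), (1, 0), (0, 1), (1, 1)})
```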
- the prediction error map generation storage unit 3 generates data for passing to the self-position error prediction unit 4.
- the self-position error prediction unit 4 generates data necessary for calculation for each node 12 in order to predict an error of self-position estimation for each node 12.
- FIG. 5 is a plan view of an environmental shape model having information on road structures such as buildings, roadside trees, utility poles, and travelable areas (roads) according to an embodiment.
- FIG. 6 is a diagram illustrating an example of extracting landmark data from measurement data in a region higher than a person according to the present embodiment.
- the node 12a in FIG. 5 will be described as an example.
- Measurement data is generated by simulating the objects that exist within the measurement range 14 of the peripheral object shape measurement unit 6 when it is placed at the position (x, y, θ) of the node 12a.
- The measurement data can be simulated by performing ray casting, a computer graphics technique, to obtain the reflection points of the laser beams emitted from the laser scanner. The simulated measurement data may be thinned out evenly; for example, thinning is performed at intervals of 0.1 m.
- Next, landmark data z to be used as landmarks is extracted from the simulated measurement data.
- Specifically, the landmark data z is extracted by selecting, using the height information of each measurement point, the points above a predetermined height (e.g., higher than a person, as in FIG. 6).
- data used as a landmark may be extracted from the result of clustering measurement data or extracting features.
- the above processing is performed for each node 12 and the landmark data z is extracted for each node 12.
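The height-based extraction can be sketched in a few lines. The 2.0 m threshold below is our own assumption; the patent only says "a predetermined height" (e.g., above person height).

```python
# Sketch of the landmark extraction of FIG. 6. The threshold value is an
# assumption of ours: keep only points above a height where people and other
# dynamic obstacles rarely appear.

def extract_landmarks(points, min_height=2.0):
    """points: list of (x, y, z). Returns the points with z >= min_height."""
    return [p for p in points if p[2] >= min_height]

z = extract_landmarks([(1.0, 0.0, 0.5),   # ground clutter / person height
                       (2.0, 0.0, 3.0),   # building facade
                       (3.0, 0.0, 2.5)])  # roadside tree canopy
```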
- the landmark data z extracted here and the environment shape model m input from the traveling environment shape model input unit 2 are data used by the self-position error prediction unit 4 for calculation.
- FIG. 7 is a flowchart illustrating a processing procedure of the self-position error prediction unit according to the embodiment.
- The self-position error prediction unit 4 obtains, for each node 12, the prediction error of self-position estimation as a covariance matrix Σ by geometric calculation using the landmark data z and the environmental shape model m.
- The basic idea is to shift the landmark data z by small movement amounts and evaluate the degree of overlap with the environmental shape model m at each shift, thereby obtaining the nature of the error (deviation) that occurs during matching; the covariance matrix Σ is then calculated from it.
- If the degree of overlap with the environmental shape model m changes greatly when the landmark data z is moved, the shape of the landmark data z is highly distinctive, and the matching error tends to be small. Conversely, if the degree of overlap changes little, the shape is not distinctive, and the matching error becomes large. Moreover, because how the overlap changes depends on the direction of movement, the nature of the error can be obtained not as a single value but as a distribution, represented by the covariance matrix Σ. First, the movement amount (x_T, y_T, θ_T) by which the landmark data z is shifted is determined (S11).
- The maximum and minimum values of the movement amount are set in advance, and the range between them is divided equally at regular intervals.
- The maximum and minimum values may be determined based on the magnitude of the error of the self-position estimation unit 7. For example, x_T and y_T each cover the range from -2.5 m to +2.5 m in 10 steps of 0.5 m, and θ_T covers the range from -60 deg to +60 deg in 12 steps of 10 deg. In this case there are 10 × 10 × 12 = 1200 combinations of movement amounts.
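The S11 enumeration can be sketched as follows. Note an interpretation of ours: 10 values with 0.5 m spacing cannot include both endpoints of [-2.5, +2.5], so the sketch uses the half-open binning -2.5, -2.0, ..., +2.0 (and likewise for θ_T), which reproduces the stated 1200 combinations.

```python
# Sketch of the movement-amount enumeration of S11 (binning is our own
# half-open interpretation of the stated ranges and step counts).

def movement_amounts():
    xs = [-2.5 + 0.5 * i for i in range(10)]     # -2.5 .. +2.0 m, 10 values
    thetas = [-60 + 10 * i for i in range(12)]   # -60 .. +50 deg, 12 values
    return [(x, y, th) for x in xs for y in xs for th in thetas]

moves = movement_amounts()  # 10 x 10 x 12 = 1200 (x_T, y_T, theta_T) triples
```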
- Next, the landmark data z is coordinate-transformed by the movement amount (x_T, y_T, θ_T) (S12).
- Let T be the homogeneous transformation matrix representing the coordinate transformation by the movement amount (x_T, y_T, θ_T).
- The landmark data after the coordinate transformation is then expressed as Tz.
- Next, the degree of overlap between the coordinate-transformed landmark data Tz and the environmental shape model m is calculated as the matching evaluation value D_T (S13).
- For this calculation, Equation (1) and Equation (2) are used.
- The correspondence c between the point group of the landmark data Tz and the polygon vertices of the environmental shape model m is obtained by the nearest-neighbor search of Equation (1).
- The number of points in the landmark data Tz is N.
- Using the obtained correspondence c, the matching evaluation value D_T is obtained in Equation (2) as the root mean square of the distances between corresponding points.
- Here n is the unit normal vector of the polygon surface of the environmental shape model m.
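Equations (1) and (2) are not reproduced in this text, but the description pins them down: a nearest-neighbour correspondence, followed by the RMS of point-to-plane distances n · (p − q). The sketch below is our reading of that, in 2D and with per-vertex normals as a data-layout assumption.

```python
# Sketch of Equations (1) and (2) as described in the text (our reading, 2D):
# Equation (1) pairs each transformed landmark point with its nearest model
# vertex; Equation (2) is the root mean square of the point-to-plane
# distances n . (p - q) over the N points.
import math

def matching_evaluation(Tz, model_pts, model_normals):
    total = 0.0
    for p in Tz:
        # Equation (1): nearest-neighbour correspondence c
        c = min(range(len(model_pts)),
                key=lambda j: (p[0] - model_pts[j][0]) ** 2
                            + (p[1] - model_pts[j][1]) ** 2)
        q, n = model_pts[c], model_normals[c]
        # signed distance from p to the model surface through q with normal n
        d = n[0] * (p[0] - q[0]) + n[1] * (p[1] - q[1])
        total += d * d
    return math.sqrt(total / len(Tz))   # Equation (2): RMS over N points

# A wall along the y axis with outward normal (1, 0): two points 1 m and 3 m away
D_T = matching_evaluation([(1.0, 0.0), (3.0, 1.0)],
                          [(0.0, 0.0), (0.0, 1.0)],
                          [(1.0, 0.0), (1.0, 0.0)])
```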
- Next, the likelihood L_T is calculated from the matching evaluation value D_T (S14).
- The likelihood L_T approximates, by a normal distribution, how likely the landmark data z is to overlap the environmental shape model m when coordinate-transformed by the movement amount (x_T, y_T, θ_T).
- The likelihood L_T is calculated, for example, by Equation (3).
- Here σ is a constant representing the error characteristics of the peripheral object shape measurement unit 6.
- a specific value may be determined based on the magnitude of the error of the peripheral object shape measurement unit 6.
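Equation (3) is likewise only described, not shown; given the "normal distribution" remark, a plausible reading is a Gaussian kernel over D_T, sketched below. The exact form in the patent drawings may differ.

```python
# Equation (3) as we read it (an assumption, not the patent's exact formula):
# a Gaussian kernel over the matching evaluation value,
#   L_T = exp(-D_T^2 / (2 * sigma^2)),
# where sigma models the error of the peripheral object shape measurement unit.
import math

def likelihood(D_T, sigma=0.5):
    return math.exp(-D_T ** 2 / (2.0 * sigma ** 2))

# A small D_T (good overlap) yields a likelihood near 1;
# a large D_T (poor overlap) yields a likelihood near 0.
```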
- The calculation of the likelihood L_T described above is performed for all combinations of movement amounts (x_T, y_T, θ_T) (S15). In the example above, there are 1200 combinations.
- Then, by Equation (4), the covariance matrix Σ is calculated by weighted least squares, weighting each movement amount (x_T, y_T, θ_T) by its likelihood L_T over all the combinations.
- Here Σ_T is the mathematical symbol indicating the sum over the movement amounts.
- This completes the processing of the self-position error prediction unit 4 (S17, S18). The prediction error of self-position estimation at each node 12 in the travelable area is thereby obtained as a covariance matrix Σ, and the obtained prediction errors are returned to the prediction error map generation storage unit 3.
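Equation (4) can be sketched as a likelihood-weighted second-moment estimate of the movement amounts around zero displacement. The normalization by the likelihood sum is our own assumption about the "weighted least squares" step.

```python
# Sketch of Equation (4) as described (our reading): the covariance of the
# movement amounts (x_T, y_T, theta_T), each weighted by its likelihood L_T.
# Normalizing by the likelihood sum is an assumption of ours.

def weighted_covariance(moves, likelihoods):
    """moves: list of (x_T, y_T, theta_T); returns a 3x3 covariance matrix."""
    W = sum(likelihoods)
    cov = [[0.0] * 3 for _ in range(3)]
    for mu, L in zip(moves, likelihoods):
        for i in range(3):
            for j in range(3):
                cov[i][j] += L * mu[i] * mu[j] / W
    return cov

# Distinctive landmarks concentrate the likelihood near zero displacement,
# giving a small covariance; ambiguous ones spread it out.
cov = weighted_covariance([(0.1, 0.0, 0.0), (-0.1, 0.0, 0.0)], [1.0, 1.0])
```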
- The prediction error map generation storage unit 3 can thus generate and store a prediction error map whose data structure consists of the nodes 12 in the travelable area and the prediction error of self-position estimation associated with each node 12.
- FIG. 8 is a plan view of a measurement range and an environmental shape model obtained by extracting landmark data at a certain node according to one embodiment.
- the processing from S11 to S14 in the flowchart of the self-position error prediction unit 4 shown in FIG. 7 in such a situation will be described as an example.
- Suppose the movement amount (x_T, y_T, θ_T) is determined to be (+1.0 m, +2.0 m, 0 deg) (S11).
- the landmark data z1 extracted from the measurement range 14 is coordinate-converted with the movement amount (S12).
- FIG. 9 is a plan view of the landmark data subjected to coordinate conversion and the environmental shape model according to the present embodiment.
- The coordinate-transformed landmark data Tz1 and the environmental shape model m do not overlap (the deviation is large).
- That is, moving the landmark data z1 greatly changes its degree of overlap with the environmental shape model m. When the degree of overlap changes greatly under movement, the shape of the landmark data z1 is highly distinctive, and the matching error in that direction of movement becomes small.
- Next, the matching evaluation value D_T is calculated to quantify the degree of overlap between the coordinate-transformed landmark data Tz1 and the environmental shape model m (S13).
- In this case, the matching evaluation value D_T, the root mean square of the distances between corresponding points, is large.
- Since the likelihood L_T is obtained by approximating the degree of overlap with a normal distribution, a large D_T (low degree of overlap) yields a small likelihood L_T.
- FIG. 10 shows a plan view of the measurement range 14b from which the landmark data z is extracted and the environmental shape model m, as seen from above, in a certain node 12c in a situation different from that in FIG.
- FIG. 11 is a plan view of the landmark data subjected to coordinate conversion and the environmental shape model according to one embodiment.
- The coordinate-transformed landmark data Tz2 and the environmental shape model m overlap relatively well (the deviation is small). That is, there is a deviation in the x direction, but there appears to be no deviation in the y direction.
- In other words, moving the landmark data z2 changes its degree of overlap with the environmental shape model m only slightly. When the degree of overlap changes little under movement, the shape of the landmark data z2 is not distinctive, and the matching error in that direction of movement becomes large.
- Next, the matching evaluation value D_T is calculated to quantify the degree of overlap between the coordinate-transformed landmark data Tz2 and the environmental shape model m (S13).
- In this case, the matching evaluation value D_T, the root mean square of the distances between corresponding points, is small.
- Since the likelihood L_T is obtained by approximating the degree of overlap with a normal distribution, a small D_T (high degree of overlap) yields a large likelihood L_T.
- the route plan unit 9 plans a route using the prediction error map stored in the prediction error map generation storage unit 3.
- the prediction error map has a data structure of each node 12 obtained by dividing the travelable area shown in FIG. 4 by the grid 11 and a prediction error of self-position estimation associated with each node 12.
- Each node 12 is a node in the three-dimensional space (x, y, θ), so multiple nodes 12 with different azimuths θ exist at the same location as seen in the two-dimensional (x, y) plane.
- At least two forms of the route planning unit 9 are conceivable.
- The first is a form that performs route planning with a shortest-path search algorithm from graph theory on the graph structure constructed by connecting the nodes 12 with links 13, as described above.
- The shortest-path search uses the prediction error of self-position estimation at each node 12 as the cost.
- With this form, the travelable area is represented discretely in the three-dimensional (x, y, θ) configuration space, and a target route within the area can be obtained.
- The current self-position estimated by the self-position estimation unit 7 is set as the start position, the destination input from the destination input unit 8 is set as the goal position, and route planning is performed using the prediction error of self-position estimation at each node 12 as the cost.
- When the route planning unit 9 calculates the cost, the covariance matrices Σ held by the plurality of nodes 12 through which the route passes are fused by maximum likelihood estimation to obtain the covariance at the time of passing each node 12.
- Specifically, the covariance while traveling along the route is simulated using the motion model described in the book "Probabilistic Robotics" (authors: Sebastian Thrun, Wolfram Burgard, and Dieter Fox; publisher: The MIT Press; publication year: 2005), and the cost is obtained by fusing in the covariance matrix Σ of each node 12 by maximum likelihood estimation.
- Using the prediction error represented by the fused covariance as the cost of the route plan, it is determined whether the covariance at the time of passing each node 12 exceeds a predetermined threshold; a route to the destination can thus be planned so as not to pass through places where the prediction error becomes large. To determine whether the covariance exceeds the threshold, the elements of the matrix may be compared, or the eigenvalues of the matrix may be compared.
- Rather than each node 12 holding its cost as a scalar value and evaluating a route by the sum of the costs of the nodes 12 it passes, each node 12 represents its cost as a covariance matrix Σ, and a route is evaluated by the covariance at the time of passing each node 12. This makes it possible to plan routes that account for differences in the nature (magnitude and direction of deviation) of the self-position estimation error at each position.
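The first form of the route planner can be sketched as below, under strong simplifying assumptions of our own: each node's prediction error is collapsed to a scalar proxy (e.g. the largest covariance eigenvalue), nodes above a threshold are treated as impassable, and Dijkstra's algorithm finds the shortest remaining path. The patent's full scheme fuses covariance matrices along the route instead.

```python
# Simplified sketch (our own) of threshold-based planning over the node graph:
# exclude nodes whose scalar error proxy exceeds the threshold, then run
# Dijkstra's shortest-path search over the remaining links.
import heapq

def plan_route(links, error, start, goal, threshold=1.0):
    """links: dict node -> list of (neighbour, length); error: dict node -> scalar."""
    if error.get(start, 0.0) > threshold:
        return None
    dist, prev, heap = {start: 0.0}, {}, [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:                         # reconstruct the path
            path = [u]
            while u in prev:
                u = prev[u]
                path.append(u)
            return path[::-1]
        if d > dist.get(u, float("inf")):
            continue
        for v, w in links.get(u, []):
            if error.get(v, 0.0) > threshold:  # do not pass high-error nodes
                continue
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    return None

# A has two equal-length routes to C; when B has a large prediction error,
# the planner detours through D.
links = {'A': [('B', 1.0), ('D', 1.0)], 'B': [('C', 1.0)], 'D': [('C', 1.0)]}
```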
- As described above, the error of self-position estimation at each position in the travelable area is predicted in advance, and a route to the destination can be planned so as not to pass through places where the prediction error becomes large.
- The present invention thus achieves the excellent effect of realizing an autonomous mobile system that reaches its destination by planning routes along which it can move appropriately without losing sight of its own position.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Aviation & Aerospace Engineering (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The present invention relates to an autonomous movement system for traveling to a destination while estimating its own position by referring to a map on the basis of measurement data from a peripheral object shape measurement unit. The system of the present invention is characterized in that it comprises: a travel environment shape model input unit for inputting a shape model of the travel environment; a self-position error prediction unit for calculating the prediction error of the self-position estimation on the basis of the shape model; a prediction error map generation and storage unit for generating and storing, over the region in which the autonomous movement system can travel, a prediction error map with which the prediction error is associated; and a route planning unit for referring to the prediction error map and planning a route on the basis of the prediction error corresponding to the route along which the autonomous movement system travels.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2010/073131 WO2012086029A1 (fr) | 2010-12-22 | 2010-12-22 | Système à mouvement autonome |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2010/073131 WO2012086029A1 (fr) | 2010-12-22 | 2010-12-22 | Système à mouvement autonome |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2012086029A1 true WO2012086029A1 (fr) | 2012-06-28 |
Family
ID=46313333
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2010/073131 Ceased WO2012086029A1 (fr) | 2010-12-22 | 2010-12-22 | Système à mouvement autonome |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2012086029A1 (fr) |
Cited By (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2014139821A1 (fr) * | 2013-03-15 | 2014-09-18 | Volkswagen Aktiengesellschaft | Application de planification d'itinéraire pour conduite automatique |
| WO2015025599A1 (fr) * | 2013-08-21 | 2015-02-26 | シャープ株式会社 | Corps mobile autonome |
| JP2016162013A (ja) * | 2015-02-27 | 2016-09-05 | 株式会社日立製作所 | 自己位置推定装置および移動体 |
| CN106444756A (zh) * | 2016-09-22 | 2017-02-22 | 纳恩博(北京)科技有限公司 | 一种信息处理方法及电子设备 |
| WO2017060947A1 (fr) * | 2015-10-05 | 2017-04-13 | パイオニア株式会社 | Appareil d'estimation, procédé de commande, programme et support de stockage |
| JP2017101944A (ja) * | 2015-11-30 | 2017-06-08 | パイオニア株式会社 | 速度算出装置、制御方法、プログラム及び記憶媒体 |
| WO2017199333A1 (fr) * | 2016-05-17 | 2017-11-23 | パイオニア株式会社 | Dispositif de sortie d'informations, dispositif de terminal, procédé de commande, programme et support de stockage |
| JP2018504650A (ja) * | 2014-12-26 | 2018-02-15 | ヘーレ グローバル ベスローテン フェンノートシャップ | 装置の位置特定のための幾何学的指紋法 |
| EP3106836B1 (fr) * | 2015-06-16 | 2018-06-06 | Volvo Car Corporation | Unité et procédé pour régler une limite de route |
| JP2019192206A (ja) * | 2018-04-27 | 2019-10-31 | 深セン市優必選科技股▲ふん▼有限公司Ubtech Poboticscorp Ltd | 充電台の認識方法、装置及びロボット |
| JP2020098196A (ja) * | 2019-10-23 | 2020-06-25 | パイオニア株式会社 | 推定装置、制御方法、プログラム及び記憶媒体 |
| JP2021044008A (ja) * | 2020-12-01 | 2021-03-18 | パイオニア株式会社 | 情報出力装置、端末装置、制御方法、プログラム及び記憶媒体 |
| JP2022025118A (ja) * | 2019-10-23 | 2022-02-09 | パイオニア株式会社 | 推定装置、制御方法、プログラム及び記憶媒体 |
| WO2022070324A1 (fr) * | 2020-09-30 | 2022-04-07 | 日本電気株式会社 | Dispositif de commande de corps mobile, procédé de commande de corps mobile, système de commande de corps mobile et support d'enregistrement non transitoire lisible par ordinateur avec programme de commande de corps mobile enregistré dans celui-ci |
| JP2023105152A (ja) * | 2021-11-01 | 2023-07-28 | パイオニア株式会社 | 推定装置、制御方法、プログラム及び記憶媒体 |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH02259913A (ja) * | 1989-03-31 | 1990-10-22 | Glory Ltd | Self-localization method for a mobile body |
| JP2003015739A (ja) * | 2001-07-02 | 2003-01-17 | Yaskawa Electric Corp | External environment map, self-position identification device, and guidance control device |
| WO2007069726A1 (fr) * | 2005-12-16 | 2007-06-21 | Ihi Corporation | Self-position identification method and device, and three-dimensional shape measurement method and device |
| JP2007322138A (ja) * | 2006-05-30 | 2007-12-13 | Toyota Motor Corp | Mobile device and self-position estimation method for a mobile device |
| JP2010152787A (ja) * | 2008-12-26 | 2010-07-08 | Fujitsu Ltd | Environment map generation program, environment map generation method, and mobile robot |
| JP2010238217A (ja) * | 2009-03-09 | 2010-10-21 | Yaskawa Electric Corp | Self-position identification method for a mobile robot, and mobile robot |
- 2010-12-22: WO application PCT/JP2010/073131 (WO2012086029A1), not active, ceased
Cited By (32)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105229422A (zh) * | 2013-03-15 | 2016-01-06 | Volkswagen AG | Automatic driving route planning application |
| US10451428B2 (en) | 2013-03-15 | 2019-10-22 | Volkswagen Aktiengesellschaft | Automatic driving route planning application |
| WO2014139821A1 (fr) * | 2013-03-15 | 2014-09-18 | Volkswagen Aktiengesellschaft | Automatic driving route planning application |
| WO2015025599A1 (fr) * | 2013-08-21 | 2015-02-26 | Sharp Corporation | Autonomous mobile body |
| JP2015041203A (ja) * | 2013-08-21 | 2015-03-02 | Sharp Corporation | Autonomous mobile body |
| CN105247431A (zh) * | 2013-08-21 | 2016-01-13 | Sharp Corporation | Autonomous mobile body |
| CN105247431B (zh) * | 2013-08-21 | 2017-09-19 | Sharp Corporation | Autonomous mobile body |
| US9804598B2 (en) | 2013-08-21 | 2017-10-31 | Sharp Kabushiki Kaisha | Autonomous mobile body |
| JP2020101833A (ja) * | 2014-12-26 | 2020-07-02 | Here Global B.V. | Geometric fingerprinting for localization of a device |
| JP2021060602A (ja) * | 2014-12-26 | 2021-04-15 | Here Global B.V. | Geometric fingerprinting for localization of a device |
| JP7111794B2 (ja) | 2014-12-26 | 2022-08-02 | Here Global B.V. | Geometric fingerprinting for localization of a device |
| JP2018504650A (ja) * | 2014-12-26 | 2018-02-15 | Here Global B.V. | Geometric fingerprinting for localization of a device |
| US9881379B2 (en) | 2015-02-27 | 2018-01-30 | Hitachi, Ltd. | Self-localization device and movable body |
| JP2016162013A (ja) * | 2015-02-27 | 2016-09-05 | Hitachi, Ltd. | Self-position estimation device and mobile body |
| EP3106836B1 (fr) * | 2015-06-16 | 2018-06-06 | Volvo Car Corporation | Unit and method for setting a road boundary |
| US11199850B2 (en) | 2015-10-05 | 2021-12-14 | Pioneer Corporation | Estimation device, control method, program and storage medium |
| JPWO2017060947A1 (ja) * | 2015-10-05 | 2018-08-02 | Pioneer Corporation | Estimation device, control method, program, and storage medium |
| WO2017060947A1 (fr) * | 2015-10-05 | 2017-04-13 | Pioneer Corporation | Estimation device, control method, program, and storage medium |
| JP2017101944A (ja) * | 2015-11-30 | 2017-06-08 | Pioneer Corporation | Speed calculation device, control method, program, and storage medium |
| JPWO2017199333A1 (ja) * | 2016-05-17 | 2019-03-14 | Pioneer Corporation | Information output device, terminal device, control method, program, and storage medium |
| WO2017199333A1 (fr) * | 2016-05-17 | 2017-11-23 | Pioneer Corporation | Information output device, terminal device, control method, program, and storage medium |
| CN106444756A (zh) * | 2016-09-22 | 2017-02-22 | Ninebot (Beijing) Tech Co., Ltd. | Information processing method and electronic device |
| JP2019192206A (ja) * | 2018-04-27 | 2019-10-31 | Shenzhen Ubtech Robotics Corp Ltd | Charging dock recognition method and device, and robot |
| JP2020098196A (ja) * | 2019-10-23 | 2020-06-25 | Pioneer Corporation | Estimation device, control method, program, and storage medium |
| JP2022025118A (ja) * | 2019-10-23 | 2022-02-09 | Pioneer Corporation | Estimation device, control method, program, and storage medium |
| WO2022070324A1 (fr) * | 2020-09-30 | 2022-04-07 | NEC Corporation | Mobile body control device, mobile body control method, mobile body control system, and non-transitory computer-readable recording medium storing a mobile body control program |
| JPWO2022070324A1 (fr) * | 2020-09-30 | 2022-04-07 | ||
| JP7601105B2 (ja) | 2020-09-30 | 2024-12-17 | NEC Corporation | Mobile body control device, mobile body control method, mobile body control system, and mobile body control program |
| US12332647B2 (en) | 2020-09-30 | 2025-06-17 | Nec Corporation | Mobile body control apparatus, mobile body control method, mobile body control system, and non-transitory computer-readable storage medium storing mobile body control program |
| JP2021044008A (ja) * | 2020-12-01 | 2021-03-18 | Pioneer Corporation | Information output device, terminal device, control method, program, and storage medium |
| JP2023105152A (ja) * | 2021-11-01 | 2023-07-28 | Pioneer Corporation | Estimation device, control method, program, and storage medium |
| JP2024161108A (ja) * | 2021-11-01 | 2024-11-15 | Pioneer Corporation | Estimation device, control method, program, and storage medium |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2012086029A1 (fr) | Autonomous mobile system | |
| JP6813703B2 (ja) | Geometric fingerprinting for localization of a device | |
| Gwon et al. | Generation of a precise and efficient lane-level road map for intelligent vehicle systems | |
| EP3237923B1 (fr) | Localizing a device by multilateration | |
| RU2756439C1 (ru) | Determining localization for operation of a vehicle | |
| CN104914865B (zh) | Substation inspection robot positioning and navigation system and method | |
| EP3238494B1 (fr) | Selection of feature geometries for localization of a device | |
| Kim et al. | Updating point cloud layer of high definition (hd) map based on crowd-sourcing of multiple vehicles installed lidar | |
| CN108089572A (zh) | Algorithm and infrastructure for robust and efficient vehicle localization | |
| Huang et al. | Task-specific performance evaluation of UGVs: Case studies at the IVFC | |
| CN109059942A (zh) | Underground high-precision navigation map construction system and construction method | |
| Singh et al. | Evaluating the performance of map matching algorithms for navigation systems: an empirical study | |
| WO2016103026A2 (fr) | Extraction of feature geometries for localization of a device | |
| US20210003415A1 (en) | Submap geographic projections | |
| CN113741503B (zh) | Autonomous-positioning unmanned aerial vehicle and indoor autonomous path planning method therefor | |
| JP2016149090A (ja) | Autonomous mobile device, autonomous mobile system, autonomous mobile method, and program | |
| Li et al. | Hybrid filtering framework based robust localization for industrial vehicles | |
| JP2023126893A (ja) | Method for generating a universally usable feature map | |
| WO2021112078A1 (fr) | Information processing device, control method, program, and storage medium | |
| JP2011059043A (ja) | Route search device and movement system | |
| JPWO2012160630A1 (ja) | Trajectory correction method, trajectory correction device, and mobile body device | |
| CN120178933B (zh) | Unmanned aerial vehicle route planning method and system based on deep reinforcement learning | |
| Zhou et al. | An autonomous navigation approach for unmanned vehicle in off-road environment with self-supervised traversal cost prediction | |
| Basiri | Open area path finding to improve wheelchair navigation | |
| Zakaria et al. | Autonomous shuttle development at Universiti Malaysia Pahang: LiDAR point cloud data stitching and mapping using iterative closest point cloud algorithm |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10861072; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 10861072; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: JP |