
WO2024232249A1 - Self-position estimation method, information processing device, and program


Info

Publication number
WO2024232249A1
WO2024232249A1
Authority
WO
WIPO (PCT)
Prior art keywords
self
sensor
distance measuring
variance
moving body
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2024/015672
Other languages
English (en)
Japanese (ja)
Inventor
裕崇 田中
希彰 町中
遼 高橋
雄大 湯口
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Group Corp
Original Assignee
Sony Group Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corp filed Critical Sony Group Corp
Publication of WO2024232249A1 publication Critical patent/WO2024232249A1/fr
Anticipated expiration legal-status Critical
Pending legal-status Critical Current


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/02: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models, related to ambient conditions
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00, specially adapted for navigation in a road network
    • G01C21/28: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00, specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88: Lidar systems specially adapted for specific applications
    • G01S17/93: Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931: Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/40: Control within particular dimensions
    • G05D1/43: Control of position or course in two dimensions

Definitions

  • The present disclosure relates to a self-location estimation method, an information processing device, and a program, and in particular to a self-location estimation method, an information processing device, and a program that improve the accuracy of self-location estimation.
  • A technology is known that allows a moving body moving within a specified movement area to estimate its own position within that area.
  • Patent Document 1 discloses a self-position estimation device that, when estimating the final self-position of a moving object moving within a specified moving area for each step, adds one step's worth of movement information acquired by odometry to the final self-position in the previous step.
  • This disclosure was made in light of these circumstances, and aims to improve the accuracy of self-location estimation.
  • A self-location estimation method of the present disclosure is a self-location estimation method that: performs self-location estimation of a moving body using a particle filter based on sensor data from a first distance measuring sensor that senses the front of the moving body and a second distance measuring sensor that senses the side of the moving body; sets a weighting factor related to the reliability of the sensor data from the first distance measuring sensor and the second distance measuring sensor based on the variance in the moving direction of the moving body and the variance in an orthogonal direction perpendicular to the moving direction in the presence distribution of the estimated self-position of the moving body; and sets the sampling periods of the first distance measuring sensor and the second distance measuring sensor in accordance with the set weighting factor.
  • An information processing device of the present disclosure includes: a self-position estimation unit that performs self-position estimation of a moving body using a particle filter based on sensor data from a first distance measuring sensor that senses the front of the moving body and a second distance measuring sensor that senses the side of the moving body; a weighting factor setting unit that sets a weighting factor related to the reliability of the sensor data from the first distance measuring sensor and the second distance measuring sensor based on the variance in the moving direction of the moving body and the variance in an orthogonal direction perpendicular to the moving direction in the presence distribution of the estimated self-position of the moving body; and a sampling period setting unit that sets the sampling periods of the first distance measuring sensor and the second distance measuring sensor in accordance with the set weighting factor.
  • A program of the present disclosure causes a computer to execute a process of: estimating the self-position of a moving body using a particle filter based on sensor data from a first distance measuring sensor that senses the front of the moving body and a second distance measuring sensor that senses the side of the moving body; setting a weighting factor related to the reliability of the sensor data from the first distance measuring sensor and the second distance measuring sensor based on the variance in the moving direction of the moving body and the variance in an orthogonal direction perpendicular to the moving direction in the presence distribution of the estimated self-position of the moving body; and setting the sampling periods of the first distance measuring sensor and the second distance measuring sensor in accordance with the set weighting factor.
  • In the present disclosure, a particle filter is thus used to estimate the self-position of a moving body based on sensor data from a first distance measuring sensor that senses the front of the moving body and a second distance measuring sensor that senses the side of the moving body; a weighting factor related to the reliability of the sensor data from the two sensors is set based on the variance in the moving direction of the moving body and the variance in an orthogonal direction perpendicular to the moving direction in the presence distribution of the estimated self-position; and the sampling periods of the two sensors are set in accordance with the set weighting factor.
  • FIG. 1 is a block diagram showing an example of the configuration of a moving body using conventional technology.
  • FIG. 2 is a flowchart illustrating the flow of a self-position estimation process.
  • FIG. 3 is a diagram illustrating an example of a moving body that moves in an environment in which changes in feature amount are small.
  • FIG. 4 is a diagram illustrating accumulated self-location error.
  • FIG. 5 is a block diagram showing an example configuration of a moving body to which the technology according to the present disclosure is applied.
  • FIG. 6 is a flowchart illustrating the flow of a self-position estimation process.
  • FIG. 7 is a block diagram showing another example of the configuration of a moving body.
  • FIG. 8 is a block diagram showing yet another example configuration of a moving body.
  • FIG. 9 is a block diagram showing an example of the hardware configuration of a computer.
  • FIG. 1 is a block diagram showing an example of the configuration of a moving object 1 using a conventional technique.
  • The mobile unit 1 is configured for autonomous movement by dead reckoning (also known as autonomous navigation), which estimates its own position from the route traveled and the distance traveled.
  • In dead reckoning, errors accumulate with each repetition of processing, so the mobile unit 1 maintains the accuracy of its self-position estimate by periodically correcting the accumulated error through localization using features obtained by sensing the surrounding environment.
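As a concrete illustration of the error accumulation described above, the following is a minimal sketch, not taken from the patent, of one dead-reckoning update under an assumed differential-drive model; the function name and parameters are illustrative.

```python
import math

def dead_reckoning_step(x, y, theta, d_left, d_right, wheel_base):
    """Advance a 2D pose by one odometry step (differential-drive model).

    d_left / d_right are the wheel travel distances reported by the
    encoders; the small per-step errors in them are what accumulates
    over repeated updates.
    """
    d_center = (d_left + d_right) / 2.0        # forward travel of the body
    d_theta = (d_right - d_left) / wheel_base  # change in heading
    # Integrate at the midpoint heading for a slightly better estimate
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta + math.pi) % (2.0 * math.pi) - math.pi
    return x, y, theta

# Example: 100 small steps with a tiny systematic error on one wheel
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = dead_reckoning_step(*pose, d_left=0.100, d_right=0.101, wheel_base=0.5)
print(pose)  # the heading and position error has grown step by step
```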
  • The mobile object 1 may be an autonomously moving transport robot or serving robot, an inspection robot that inspects the inside of pipes, or a vehicle capable of autonomous travel. Without being limited to these, the mobile object 1 may also be a drone that flies autonomously within a space.
  • The moving body 1 is equipped with a wheel encoder 10, an IMU (Inertial Measurement Unit) 20, a distance measurement sensor 30, and a processor 50.
  • The moving body 1 is also equipped with a motor, wheels, propellers, etc., as a movement mechanism.
  • The wheel encoder 10 is one of the internal sensors that sense the internal state of the moving body 1; it detects the direction and angle of rotation of the wheels of the moving body 1 and supplies this information to the processor 50. If the moving body 1 has no wheels, the wheel encoder 10 may be omitted.
  • The IMU 20 is one of the internal sensors that sense the internal state of the moving body 1; by detecting the inertial motion of the moving body 1, it supplies the angular velocities (yaw, pitch, roll) about the three axes of the body (moving body 1) and the accelerations along the three axes (Ax, Ay, Az) to the processor 50.
  • The distance measurement sensor 30 senses the surroundings (particularly the front) of the moving body 1 to generate sensor data (point cloud data) representing the distance to surrounding objects and their shapes, and supplies the data to the processor 50.
  • The distance measurement sensor 30 may be configured as a LiDAR (Light Detection and Ranging) sensor or a stereo camera.
  • The processor 50 performs self-position estimation of the moving body 1 based on the sensor data from the wheel encoder 10, the IMU 20, and the distance measurement sensor 30.
  • The processor 50 is configured as an information processing device that operates by executing a specific program, for example.
  • The processor 50 realizes the following functional blocks: a sensor data acquisition unit 51, a self-position estimation unit 52, a weighting coefficient setting unit 53, and a sampling period setting unit 54.
  • The sensor data acquisition unit 51 acquires sensor data from each of the sensors, namely the wheel encoder 10, the IMU 20, and the distance measurement sensor 30, and supplies it to the self-position estimation unit 52. If necessary, the sensor data acquisition unit 51 converts the sensor data from each sensor into a format that can be used by the self-position estimation unit 52. For example, the sensor data acquisition unit 51 converts the sensor data (point cloud data) from the distance measurement sensor 30 into angular velocities (yaw, pitch, roll) about the three axes of the moving body 1 and accelerations along the three axes (Ax, Ay, Az), and supplies them to the self-position estimation unit 52. This conversion may instead be performed by the distance measurement sensor 30.
  • The self-position estimation unit 52 performs self-position estimation of the mobile unit 1 based on the sensor data from each sensor supplied by the sensor data acquisition unit 51.
  • For example, the self-position estimation unit 52 obtains the self-position (x, y, z) of the moving body 1 by performing self-position estimation using a Kalman filter based on the rotation direction and rotation angle of the wheels detected by the wheel encoder 10.
  • The self-position estimation unit 52 also obtains the self-position (x, y, z) of the moving body 1 by performing self-position estimation using a Kalman filter based on the angular velocities (yaw, pitch, roll) about the three axes of the moving body 1 and the accelerations (Ax, Ay, Az) along its three axes detected by the IMU 20.
  • The self-position estimation unit 52 thus performs self-position estimation by dead reckoning using an extended Kalman filter (EKF).
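The patent does not spell out the EKF equations, so the sketch below shows only a generic EKF prediction step for a planar pose, assuming a unicycle motion model; it illustrates how the pose covariance, and with it the dead-reckoning uncertainty, grows by the process noise at every step.

```python
import numpy as np

def ekf_predict(state, P, v, omega, dt, Q):
    """One EKF prediction step for a planar pose [x, y, theta].

    v (forward speed) and omega (yaw rate) come from odometry/IMU.
    P is the pose covariance; the process noise Q added at every
    step is the accumulation that localization must later correct.
    """
    x, y, theta = state
    state_pred = np.array([
        x + v * dt * np.cos(theta),
        y + v * dt * np.sin(theta),
        theta + omega * dt,
    ])
    # Jacobian of the motion model with respect to the state
    F = np.array([
        [1.0, 0.0, -v * dt * np.sin(theta)],
        [0.0, 1.0,  v * dt * np.cos(theta)],
        [0.0, 0.0,  1.0],
    ])
    P_pred = F @ P @ F.T + Q
    return state_pred, P_pred
```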
  • The self-position estimation unit 52 further performs self-position estimation of the moving body 1 using a particle filter based on the sensor data generated by the distance measurement sensor 30, thereby determining the self-position (x, y, z) of the moving body 1 as well as the variance of the particles in the presence distribution of the self-position. For the particle filter, Adaptive Monte Carlo Localization (AMCL) is used, for example.
  • The self-position estimation unit 52 integrates the self-position estimation results obtained from the sensor data of each sensor and determines the final self-position of the moving body 1. Specifically, it corrects the self-position obtained by dead reckoning based on the self-position obtained by AMCL. The self-position estimation unit 52 supplies the finally determined self-position and the variance obtained by AMCL to the weighting coefficient setting unit 53.
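As a rough illustration of how a particle filter yields both a pose and a variance, the following minimal Monte Carlo localization sketch assumes a generic measurement-likelihood callback (AMCL proper scores LiDAR scans against a map and adapts the particle count); the weighted particle cloud gives the pose estimate, and its spread gives the variance that is later compared against a threshold.

```python
import numpy as np

rng = np.random.default_rng(0)

def mcl_update(particles, weights, measure_likelihood):
    """One measurement update of a 2D Monte Carlo localization filter.

    particles: (N, 3) array of [x, y, theta] pose hypotheses.
    measure_likelihood: callable returning p(z | pose) for one particle;
    in AMCL this would score a LiDAR scan against the map.
    Returns the weighted mean pose, the positional variance of the
    cloud, and a resampled particle set (weights reset to uniform).
    """
    weights = weights * np.array([measure_likelihood(p) for p in particles])
    weights /= weights.sum()

    mean_pose = weights @ particles           # note: naive averaging of theta
    diffs = particles[:, :2] - mean_pose[:2]
    variance = float((weights @ diffs**2).sum())  # spread of the cloud

    # Systematic resampling concentrates particles on likely poses
    positions = (rng.random() + np.arange(len(weights))) / len(weights)
    idx = np.searchsorted(np.cumsum(weights), positions)
    return mean_pose, variance, particles[idx]
```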
  • The weighting factor setting unit 53 sets a weighting factor relating to the reliability of the sensor data from the distance measurement sensor 30 based on the variance from the self-position estimation unit 52, supplies it to the sampling period setting unit 54, and also feeds it back to the self-position estimation unit 52.
  • The self-position estimation unit 52, to which the weighting factor has been fed back, performs self-position estimation of the mobile unit 1 based on the sensor data to which the weighting factor has been applied.
  • The weighting coefficient set here can be regarded as the reliability of the estimation result obtained by AMCL (self-position estimation using a particle filter). The self-position estimation unit 52 therefore weights the self-position calculated by AMCL according to the weighting coefficient fed back from the weighting coefficient setting unit 53.
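How the fed-back coefficient enters the correction is not specified in this summary; one simple reading, shown below as an assumption rather than as the patent's formula, is a convex blend in which the coefficient determines how strongly the AMCL pose pulls on the dead-reckoning pose.

```python
import numpy as np

def fuse_poses(pose_dead_reckoning, pose_amcl, w_amcl):
    """Blend the two position estimates with the fed-back coefficient
    w_amcl in [0, 1]: a high weight (low AMCL variance) lets the AMCL
    fix dominate the dead-reckoning pose, a low weight suppresses it.
    """
    pose_dr = np.asarray(pose_dead_reckoning, dtype=float)
    pose_a = np.asarray(pose_amcl, dtype=float)
    return (1.0 - w_amcl) * pose_dr + w_amcl * pose_a

print(fuse_poses([1.0, 2.0, 0.1], [1.2, 2.1, 0.1], w_amcl=0.9))
```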
  • The sampling period setting unit 54 sets the sampling period of the distance measurement sensor 30 based on the weighting coefficient from the weighting coefficient setting unit 53, and supplies information indicating the set sampling period to the distance measurement sensor 30 and the self-position estimation unit 52. The distance measurement sensor 30 then generates sensor data at the set sampling period, and the self-position estimation unit 52 executes AMCL at a timing synchronized with that period.
  • In step S1, the weighting coefficient setting unit 53 reads the particle variance threshold V_th for the presence distribution of the self-position obtained by AMCL from a memory area (not shown).
  • In step S2, the sensor data acquisition unit 51 acquires sensor data from the distance measurement sensor 30.
  • In step S3, the self-position estimation unit 52 performs self-position estimation based on the sensor data from the distance measurement sensor 30 to obtain the self-position of the moving body 1 and the variance V of the particles in the presence distribution of the self-position.
  • In step S4, the weighting factor setting unit 53 obtains the variance V calculated by the self-position estimation unit 52.
  • In step S5, the weighting coefficient setting unit 53 determines whether the variance V is greater than the threshold V_th.
  • If it is determined in step S5 that the variance V is greater than the threshold V_th, the process proceeds to step S6, where the weighting factor setting unit 53 sets the AMCL weighting factor (for the sensor data from the distance measurement sensor 30) to a low value.
  • In step S7, the sampling period setting unit 54 lengthens the sampling period of the distance measurement sensor 30 based on the set weighting coefficient. The process then returns to step S2, and the subsequent processing is repeated.
  • If it is determined in step S5 that the variance V is not greater than the threshold V_th, the process proceeds to step S8, where the weighting factor setting unit 53 sets the AMCL weighting factor (for the sensor data from the distance measurement sensor 30) to a high value.
  • In step S9, the sampling period setting unit 54 shortens the sampling period of the distance measurement sensor 30 based on the set weighting coefficient. The process then returns to step S2, and the subsequent processing is repeated.
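Steps S5 to S9 amount to a single threshold rule; the sketch below restates it in code, with the concrete weight and period values being placeholders rather than values from the patent.

```python
def adjust_from_variance(variance, v_th, period_short=0.1, period_long=1.0):
    """Steps S5-S9 in code: a large particle variance means the AMCL
    estimate is unreliable, so its weight is lowered and the range
    sensor is sampled less often; otherwise the weight is raised and
    the sensor is sampled more often.
    """
    if variance > v_th:                       # S5 yes -> S6, S7
        return {"amcl_weight": 0.2, "sampling_period": period_long}
    return {"amcl_weight": 0.9, "sampling_period": period_short}  # S8, S9
```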
  • When the variance V is small, the reliability of AMCL is deemed high; the self-position determined by AMCL is given higher priority in correcting the self-position determined by dead reckoning, and the frequency of such corrections also increases.
  • Figure 3 shows a moving object 1 moving through a long, featureless passage, where the direction in which the passage extends is the Y-axis direction and the direction of the passage walls is the X-axis direction.
  • In this case, the error in the self-position is larger in the direction of travel (Y-axis direction) than in the orthogonal direction (X-axis direction) perpendicular to it. This is because the scenery (that is, the features) changes less along the direction of travel than along the orthogonal direction, making the distance traveled difficult to recognize.
  • In the X-axis direction, the variance V calculated by AMCL is small, and the deviation between the estimated self-position and the actual position is also small.
  • In the Y-axis direction, on the other hand, the self-position calculated by dead reckoning is given higher priority, but in dead reckoning errors accumulate with each repetition of processing.
  • As a result, the self-position relative to a wall at the end of a long, featureless passage deviates between the internal map held by the mobile unit 1 and the actual measurement by the amount of error accumulated through dead reckoning. In this case, the risk of the mobile unit 1 colliding with the wall increases.
  • FIG. 5 is a block diagram showing an example configuration of a moving body 1 to which the technology according to the present disclosure is applied.
  • The moving body 1 shown in FIG. 5 differs from the moving body 1 in FIG. 1 in that it has distance measurement sensors 30t and 30n and a processor 150 instead of the distance measurement sensor 30 and the processor 50.
  • The distance measurement sensor 30t senses the front (traveling direction) of the moving body 1 to generate sensor data (point cloud data) representing the distance to surrounding objects and their shapes, and supplies the data to the processor 150.
  • The distance measurement sensor 30t may be configured as LiDAR or a stereo camera.
  • The distance measurement sensor 30n senses the side of the moving body 1 (the direction perpendicular to the direction of travel) to generate sensor data (point cloud data) representing the distance to surrounding objects and their shapes, and supplies the data to the processor 150.
  • Like the sensor 30t, the distance measurement sensor 30n may be configured as LiDAR or a stereo camera.
  • The processor 150 performs self-position estimation of the moving body 1 based on the sensor data from the wheel encoder 10, the IMU 20, and the distance measurement sensors 30t and 30n.
  • The processor 150 is configured as an information processing device that operates by executing a specific program, for example.
  • The processor 150 realizes the following functional blocks: a sensor data acquisition unit 151, a self-position estimation unit 152, a weighting coefficient setting unit 153, and a sampling period setting unit 154.
  • The sensor data acquisition unit 151, self-position estimation unit 152, weighting factor setting unit 153, and sampling period setting unit 154 of the processor 150 basically have the same functions as the sensor data acquisition unit 51, self-position estimation unit 52, weighting factor setting unit 53, and sampling period setting unit 54 of the processor 50 in FIG. 1, respectively, and detailed description thereof is therefore omitted.
  • The self-position estimation unit 152 performs self-position estimation of the moving body 1 using a particle filter based on the sensor data generated by the distance measurement sensors 30t and 30n.
  • Specifically, the self-position estimation unit 152 performs self-position estimation by AMCL based on the sensor data generated by the distance measurement sensor 30t, and determines the self-position of the moving body 1 in the traveling direction (Y-axis direction) as well as the variance of the particles in the traveling direction in the presence distribution of the self-position.
  • The self-position estimation unit 152 also performs self-position estimation by AMCL based on the sensor data generated by the distance measurement sensor 30n, thereby determining the self-position of the moving body 1 in the orthogonal direction (X-axis direction) as well as the variance of the particles in the orthogonal direction in the presence distribution of the self-position.
  • The self-position estimation unit 152 can determine the traveling direction and the perpendicular direction of the moving body 1 by using the sensor data from the distance measurement sensor 30t and the distance measurement sensor 30n.
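One natural way to obtain the travel-direction variance and the orthogonal-direction variance (Vt and Vn in the flow described below) is to project the particle cloud onto the heading axis and its perpendicular; the sketch below assumes this projection, which the text does not prescribe.

```python
import numpy as np

def directional_variances(particles, weights, heading):
    """Split the particle spread into the travel-direction variance Vt
    and the orthogonal-direction variance Vn by projecting the cloud
    onto the heading axis and its perpendicular.

    particles: (N, 2) x/y positions; weights: (N,) normalized weights;
    heading: yaw of the moving body in radians.
    """
    forward = np.array([np.cos(heading), np.sin(heading)])
    lateral = np.array([-np.sin(heading), np.cos(heading)])
    mean = weights @ particles
    centered = particles - mean
    v_t = float(weights @ (centered @ forward) ** 2)  # along travel
    v_n = float(weights @ (centered @ lateral) ** 2)  # across travel
    return v_t, v_n
```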
  • The weighting factor setting unit 153 sets weighting factors related to the reliability of the sensor data from each of the distance measuring sensors 30t and 30n based on the variance in the travel direction and the variance in the perpendicular direction obtained by the self-position estimation unit 152, supplies them to the sampling period setting unit 154, and also feeds them back to the self-position estimation unit 152.
  • The self-position estimation unit 152, which has received the weighting factors as feedback, performs self-position estimation of the mobile unit 1 based on the sensor data to which the weighting factors have been applied.
  • The sampling period setting unit 154 sets the sampling period of each of the distance measurement sensors 30t and 30n based on the weighting coefficients set by the weighting coefficient setting unit 153, and supplies information representing the set sampling periods to the distance measurement sensors 30t and 30n and the self-position estimation unit 152. Each of the distance measurement sensors 30t and 30n then generates sensor data at its set sampling period, and the self-position estimation unit 152 executes AMCL at timings synchronized with each sampling period.
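Because the two sensors can end up with different sampling periods, the AMCL updates have to be triggered per sensor; the toy scheduler below illustrates such synchronization, with the period values and output purely illustrative.

```python
def run_sampling_loop(periods, horizon=1.0, tick=0.05):
    """Fire each range sensor (and its AMCL update) whenever its own
    sampling period elapses; the front and side periods may differ
    once the sampling period setting unit has adjusted them.
    """
    next_due = dict(periods)  # sensor -> time of its next sample
    steps = int(round(horizon / tick))
    for i in range(steps + 1):
        now = i * tick
        for sensor, period in periods.items():
            if now + 1e-9 >= next_due[sensor]:
                print(f"t={now:.2f}s: sample {sensor}, run AMCL update")
                next_due[sensor] += period

run_sampling_loop({"front": 0.2, "side": 0.5})
```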
  • In step S101, the self-position estimation unit 152 reads the self-position in the direction perpendicular to the traveling direction, which has been calculated and corrected by dead reckoning, and the weighting coefficient setting unit 153 reads the particle variance threshold V_th for the presence distribution of the self-position from a memory area (not shown).
  • In step S102, the sensor data acquisition unit 151 acquires sensor data from each of the distance measurement sensors 30t and 30n.
  • In step S103, the self-position estimation unit 152 performs self-position estimation based on the sensor data from the distance measurement sensors 30t and 30n to obtain the self-position of the moving body 1 in the traveling direction and in the perpendicular direction, as well as the variance Vt of the particles in the traveling direction and the variance Vn in the perpendicular direction in the presence distribution of the self-position.
  • In step S104, the weighting coefficient setting unit 153 obtains the variance Vt in the traveling direction and the variance Vn in the perpendicular direction calculated by the self-position estimation unit 152.
  • In step S105, the weighting coefficient setting unit 153 determines whether the variance Vt in the traveling direction is greater than the threshold V_th or the variance Vn in the perpendicular direction is greater than the threshold V_th (Vt > V_th or Vn > V_th).
  • If it is determined in step S105 that Vt > V_th or Vn > V_th, the process proceeds to step S106, where the weighting coefficient setting unit 153 determines whether the variance Vt in the traveling direction is greater than the variance Vn in the perpendicular direction (Vt > Vn).
  • If it is determined in step S106 that Vt > Vn, i.e., for example, that the walls of the passageway through which the moving body 1 moves have features, the process proceeds to step S107, where the weighting factor setting unit 153 maintains the AMCL weighting factor (for the sensor data from the distance measurement sensor 30t) of the front distance measurement sensor 30t.
  • In step S108, the sampling period setting unit 154 maintains (does not change) the sampling period of the distance measurement sensor 30t based on the maintained weighting coefficient. The process then returns to step S102, and the subsequent processing is repeated.
  • If it is determined in step S106 that Vt > Vn is not true, i.e., for example, that the walls of the passageway have no features but the wall ahead does, the process proceeds to step S109, where the weighting factor setting unit 153 maintains the AMCL weighting factor (for the sensor data from the distance measurement sensor 30t) of the front distance measurement sensor 30t.
  • In step S110, the sampling period setting unit 154 maintains (does not change) the sampling period of the distance measurement sensor 30t based on the maintained weighting coefficient. The process then returns to step S102, and the subsequent processing is repeated.
  • If it is determined in step S105 that neither Vt > V_th nor Vn > V_th holds, the process proceeds to step S111.
  • In step S111, the weighting coefficient setting unit 153 determines whether the variance Vt in the traveling direction is greater than the threshold V_th and the variance Vn in the perpendicular direction is greater than the threshold V_th (Vt > V_th and Vn > V_th).
  • If it is determined in step S111 that Vt > V_th and Vn > V_th, that is, for example, that the moving body 1 is moving in a room surrounded by featureless walls, the process proceeds to step S112, where the weighting factor setting unit 153 sets the AMCL weighting factor (for the sensor data from the distance measurement sensor 30t) of the front distance measurement sensor 30t to a low value.
  • In step S113, the sampling period setting unit 154 lengthens the sampling period of the front distance measurement sensor 30t based on the set weighting coefficient. The process then returns to step S102, and the subsequent processing is repeated.
  • If it is determined in step S111 that Vt > V_th and Vn > V_th is not true, i.e., if the scenery changes in both the direction of travel and the perpendicular direction, the process proceeds to step S114, where the weighting factor setting unit 153 sets the AMCL weighting factor (for the sensor data from the distance measurement sensors 30t and 30n) of each of the distance measurement sensors 30t and 30n to a high value.
  • In step S115, the sampling period setting unit 154 shortens the sampling period of each of the distance measurement sensors 30t and 30n based on the set weighting coefficients. The process then returns to step S102, and the subsequent processing is repeated.
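Collecting steps S105 to S115 into a single decision rule gives the sketch below. Read literally, the flow as translated would never reach step S112, since the AND test follows a failed OR test, so the sketch adopts the consistent reading in which the case of both variances being large is tested first; the returned actions are symbolic placeholders.

```python
def set_weights_and_periods(v_t, v_n, v_th):
    """Decision logic of steps S105-S115 under the consistent reading
    described above; actions are symbolic, not values from the patent.
    """
    if v_t > v_th and v_n > v_th:
        # S111 -> S112, S113: featureless in every direction; trust the
        # front sensor less and sample it less often.
        return {"front": ("lower weight", "lengthen period")}
    if v_t > v_th or v_n > v_th:
        # S105 -> S106-S110: features in at least one direction; keep the
        # front sensor's weight and sampling period unchanged (the S106
        # test Vt > Vn distinguishes which wall carries the features).
        return {"front": ("maintain weight", "keep period")}
    # S114, S115: scenery changes in both directions; raise both weights
    # and sample both sensors more often.
    return {"front": ("raise weight", "shorten period"),
            "side": ("raise weight", "shorten period")}
```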
  • The above processing makes it possible to continue correcting the self-position determined by dead reckoning, even when the variance in the direction of travel determined by AMCL is large, by maintaining the priority of the self-position in the direction of travel determined by AMCL. Therefore, even when a moving body moves through an environment with little change in features, such as a corridor, warehouse, tunnel, or the inside of a train, the accumulation of dead-reckoning errors can be suppressed, and the self-position recognized by the moving body can be prevented from deviating from the actual position. As a result, the accuracy of self-position estimation can be improved.
  • If a food delivery robot can move without its recognized position deviating from its actual position, it can give people around it a sense of security. Likewise, if a transport robot can move without its recognized position deviating from its actual position, it can prevent unintended abnormal stops and improve transport throughput.
  • The moving body 1 described above may have other configurations.
  • FIG. 7 is a block diagram showing another example of the configuration of the moving body 1.
  • The mobile body 1 shown in FIG. 7 differs from the mobile body 1 in FIG. 5 in that it further includes an environmental map DB (database) 210.
  • The environmental map DB 210 stores an environmental map of the environment in which the mobile unit 1 moves.
  • The self-position estimation unit 152 can determine the traveling direction and the perpendicular direction of the mobile unit 1 by using the environmental map stored in the environmental map DB 210 in addition to the sensor data from the distance measurement sensors 30t and 30n.
  • FIG. 8 is a block diagram showing still another example of the configuration of the moving body 1.
  • The moving body 1 shown in FIG. 8 differs from the moving body 1 in FIG. 5 in that it does not have the wheel encoder 10 and the IMU 20, which are internal sensors.
  • In this case, the self-position estimation unit 152 does not perform self-position estimation by dead reckoning, and performs self-position estimation only by AMCL.
  • The functional blocks realized by the processor 150 of the mobile body 1 may instead be realized by a device external to the mobile body 1, such as a computer capable of communicating with the mobile body 1 via a network, or a controller or terminal device for controlling the mobile body 1.
  • Example of computer hardware configuration: The series of processes described above can be executed by hardware or by software. When the series of processes is executed by software, the program constituting the software is installed from a program recording medium into a computer incorporated in dedicated hardware or into a general-purpose personal computer.
  • FIG. 9 is a block diagram showing an example of the hardware configuration of a computer that executes the above-mentioned series of processes by a program.
  • The processor 150 that constitutes the mobile unit 1 can be configured, for example, by a computer having a configuration similar to that shown in FIG. 9.
  • The CPU (Central Processing Unit) 301, ROM (Read Only Memory) 302, and RAM (Random Access Memory) 303 are interconnected by a bus 304.
  • An input/output interface 305 is further connected to the bus 304. Connected to the input/output interface 305 are an input unit 306 consisting of a keyboard, mouse, etc., and an output unit 307 consisting of a display, speakers, etc. Also connected to the input/output interface 305 are a storage unit 308 consisting of a hard disk or non-volatile memory, a communication unit 309 consisting of a network interface, etc., and a drive 310 that drives removable media 311.
  • The CPU 301, for example, loads a program stored in the storage unit 308 into the RAM 303 via the input/output interface 305 and the bus 304 and executes it, thereby performing the series of processes described above.
  • The programs executed by the CPU 301 are provided, for example, recorded on removable media 311 or via a wired or wireless transmission medium such as a local area network, the Internet, or digital broadcasting, and are installed in the storage unit 308.
  • The program executed by the computer may be a program in which processing is performed chronologically in the order described in this specification, or a program in which processing is performed in parallel or at the required timing, such as when called.
  • In this specification, a system refers to a collection of multiple components (devices, modules (parts), etc.), regardless of whether all the components are in the same housing. Multiple devices housed in separate housings and connected via a network, and a single device in which multiple modules are housed in one housing, are therefore both systems.
  • An embodiment of the present disclosure can be configured as cloud computing, in which a single function is shared and processed collaboratively by multiple devices over a network.
  • Each step described in the above flowcharts can be executed by a single device or shared and executed by multiple devices.
  • When one step includes multiple processes, those processes can likewise be executed by one device or shared and executed by multiple devices.
  • The technology according to the present disclosure can have the following configurations. (1) A self-location estimation method comprising: performing self-location estimation of a moving body using a particle filter based on sensor data from a first distance measuring sensor that senses the front of the moving body and a second distance measuring sensor that senses the side of the moving body; setting a weighting factor related to the reliability of the sensor data from the first distance measuring sensor and the second distance measuring sensor based on the variance in the moving direction of the moving body and the variance in an orthogonal direction perpendicular to the moving direction in the presence distribution of the estimated self-position of the moving body; and setting the sampling periods of the first distance measuring sensor and the second distance measuring sensor in accordance with the set weighting factor.
  • (2) The self-location estimation method according to (1), further comprising performing the self-location estimation of the moving body based on the sensor data to which the weighting factor has been applied. (3) The self-location estimation method according to (2), wherein, if either the variance in the travel direction or the variance in the orthogonal direction is greater than a predetermined threshold, at least the weighting factor for the sensor data from the first distance measuring sensor is maintained and the sampling period of the first distance measuring sensor is not changed. (4) The self-location estimation method according to (3), wherein, if both the travel-direction variance and the orthogonal-direction variance are greater than the predetermined threshold, the weighting factor of the sensor data from the first distance measuring sensor is set low and the sampling period of the first distance measuring sensor is set long.
  • The moving body is a vehicle capable of autonomous driving.
  • The self-location estimation method according to any one of (1) to (13), wherein the moving body is a drone capable of autonomous flight.
  • An information processing device comprising: a self-location estimation unit that performs self-location estimation of a moving body using a particle filter based on sensor data from a first distance measuring sensor that senses the front of the moving body and a second distance measuring sensor that senses the side of the moving body; a weighting factor setting unit that sets a weighting factor related to the reliability of the sensor data from the first distance measuring sensor and the second distance measuring sensor based on the variance in the moving direction of the moving body and the variance in an orthogonal direction perpendicular to the moving direction in the presence distribution of the estimated self-position of the moving body; and a sampling period setting unit that sets the sampling periods of the first distance measuring sensor and the second distance measuring sensor in accordance with the set weighting factor.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Mathematical Physics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present disclosure relates to a self-position estimation method, an information processing device, and a program that make it possible to improve the accuracy of self-position estimation. The present disclosure comprises: performing self-position estimation of a moving body using a particle filter based on sensor data from a first distance measuring sensor that senses distance in the forward direction of the moving body and a second distance measuring sensor that senses distance in the lateral direction of the moving body; setting a weighting coefficient related to the reliability of the sensor data from the first distance measuring sensor and the second distance measuring sensor based on the variance in the direction of movement of the moving body and the variance in an orthogonal direction perpendicular to the direction of movement in the presence distribution of the estimated self-position of the moving body; and setting a sampling period of the first distance measuring sensor and the second distance measuring sensor in accordance with the set weighting coefficient. The present disclosure is applicable to a moving body that moves autonomously.
PCT/JP2024/015672 2023-05-09 2024-04-22 Self-position estimation method, information processing device, and program Pending WO2024232249A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2023077275 2023-05-09
JP2023-077275 2023-05-09

Publications (1)

Publication Number Publication Date
WO2024232249A1 (fr) 2024-11-14

Family

ID=93430048

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2024/015672 Pending WO2024232249A1 (fr) 2023-05-09 Self-position estimation method, information processing device, and program

Country Status (1)

Country Link
WO (1) WO2024232249A1 (fr)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06332531A (ja) * 1993-05-19 1994-12-02 Shinko Electric Co Ltd Unmanned trackless vehicle
JP2002063682A (ja) * 2000-08-21 2002-02-28 Nec Corp Travel position detection device
WO2013002067A1 (fr) * 2011-06-29 2013-01-03 Hitachi Industrial Equipment Systems Co Ltd Mobile robot and self position and attitude estimation system mounted on a moving body
JP2015170127A (ja) * 2014-03-06 2015-09-28 Toyota Motor Corp Autonomous mobile robot and control method thereof
JP2016091202A (ja) * 2014-10-31 2016-05-23 Toyota Central R&D Labs Inc Self-position estimation device and moving body equipped with self-position estimation device
JP2021018638A (ja) * 2019-07-22 2021-02-15 Daihen Corp Self-position estimation device

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN120143824A (zh) * 2025-03-10 2025-06-13 常州莘之星智能科技有限公司 Full-coverage path planning and obstacle avoidance method for a wall-climbing robot based on LiDAR

Similar Documents

Publication Publication Date Title
US20240318957A1 (en) Sensor perturbation
EP3665501B1 (fr) Vehicle sensor calibration and localization
US10983199B2 Vehicle sensor calibration and localization
US11279045B2 Robot pose estimation method and apparatus and robot using the same
CN110244772B (zh) Leader-following system and leader-following control method for mobile robots
CN111532257B (zh) Method and system for compensating vehicle calibration errors
CN109975792B (zh) Method for correcting motion distortion in multi-line LiDAR point clouds based on multi-sensor fusion
CN118963388B (zh) Autonomous UAV flight control system requiring no external navigation
CN110554376A (zh) Radar odometry for vehicles
CN107167148A (zh) Simultaneous localization and mapping method and device
JP2019505423A (ja) Steering control method and system for autonomous vehicles using a proportional-integral-derivative (PID) controller
CN111238469B (zh) Relative navigation method for UAV formations based on inertial/data-link measurements
US20230391350A1 Systems and methods for hybrid open-loop and closed-loop path planning
CN115082562B (zh) Extrinsic parameter calibration method, apparatus, device, server, and vehicle-mounted computing device
US12215976B2 Estimation device, estimation method, program product for estimation
CN112455446A (zh) Method, apparatus, electronic device, and storage medium for vehicle control
WO2024232249A1 (fr) Self-position estimation method, information processing device, and program
EP3276306B1 (fr) Navigating an unmanned aerial vehicle
Kilic et al. Evaluation of the benefits of zero velocity update in decentralized extended kalman filter-based cooperative localization algorithms for GNSS-denied multi-robot systems
JP2022537361A (ja) Relative position tracking using motion sensors with drift correction
CN113358123B (zh) Trusted motion unit
Rustagi et al. Model based drift free localization for multirotors
Mkrtchyan et al. Vision-Based Autopilot Implementation Using a Quad-Rotor Helicopter
US20230343227A1 (en) System and method for controlling aerial vehicle
US20230266451A1 (en) Online lidar-to-ground alignment

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 24803362

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2025519370

Country of ref document: JP

Kind code of ref document: A

WWE WIPO information: entry into national phase

Ref document number: 2025519370

Country of ref document: JP