WO2021079794A1 - 情報処理装置、情報処理方法、プログラムおよび飛行体 - Google Patents
情報処理装置、情報処理方法、プログラムおよび飛行体 (Information processing device, information processing method, program, and flying object)
- Publication number
- WO2021079794A1 PCT/JP2020/038704 JP2020038704W
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- real-time observation
- map
- observation results
- dimensional
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
- G05D1/106—Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/194—Segmentation; Edge detection involving foreground-background segmentation
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64C—AEROPLANES; HELICOPTERS
- B64C13/00—Control systems or transmitting systems for actuating flying-control surfaces, lift-increasing flaps, air brakes, or spoilers
- B64C13/02—Initiating means
- B64C13/16—Initiating means actuated automatically, e.g. responsive to gust detectors
- B64C13/18—Initiating means actuated automatically, e.g. responsive to gust detectors using automatic pilot
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/64—Three-dimensional objects
- G06V20/647—Three-dimensional objects by matching two-dimensional images to three-dimensional objects
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft
- G08G5/20—Arrangements for acquiring, generating, sharing or displaying traffic information
- G08G5/21—Arrangements for acquiring, generating, sharing or displaying traffic information located onboard the aircraft
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft
- G08G5/20—Arrangements for acquiring, generating, sharing or displaying traffic information
- G08G5/26—Transmission of traffic-related information between aircraft and ground stations
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft
- G08G5/30—Flight plan management
- G08G5/34—Flight plan management for flight plan modification
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft
- G08G5/50—Navigation or guidance aids
- G08G5/53—Navigation or guidance aids for cruising
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft
- G08G5/50—Navigation or guidance aids
- G08G5/55—Navigation or guidance aids for a single aircraft
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft
- G08G5/70—Arrangements for monitoring traffic-related situations or conditions
- G08G5/74—Arrangements for monitoring traffic-related situations or conditions for monitoring terrain
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft
- G08G5/80—Anti-collision systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/10—UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30181—Earth observation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30241—Trajectory
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft
- G08G5/50—Navigation or guidance aids
- G08G5/57—Navigation or guidance aids for unmanned aircraft
Definitions
- This technology relates to an information processing device, an information processing method, a program, and a flying object, and more particularly to an information processing device that enables high-speed autonomous flight of a flying object.
- When a drone, which is a type of flying object, flies autonomously, it repeatedly draws a flight path to the destination according to a global action plan and flies along that path. Since route calculation takes time, a long route must be calculated at once in order to fly at high speed, which means a route has to be drawn even through unobserved areas. For example, when a route is drawn on the assumption that the unobserved area is empty, problems such as collisions occur if an obstacle suddenly appears in an area that could not be observed until the last moment.
- For example, Patent Document 1 describes a technique in which an integrated map is created by superimposing a pre-stored environmental information map and information on observed obstacles, and a robot is moved and controlled along a predetermined route while avoiding obstacles on the integrated map. Further, Patent Document 2 describes a technique for estimating the self-position of a vehicle by matching a registered image included in map data with an observation image taken from the vehicle.
- The purpose of this technology is to enable high-speed autonomous flight of a flying object.
- The concept of this technology is an information processing apparatus including: a generation unit that generates a three-dimensional real-time observation result based on self-position estimation information and three-dimensional ranging information; an acquisition unit that acquires a prior map corresponding to the three-dimensional real-time observation result; an alignment unit that aligns the three-dimensional real-time observation result with the prior map; and an extension unit that, after the alignment, extends the three-dimensional real-time observation result based on the prior map.
- In this technology, the generation unit generates the three-dimensional real-time observation result based on self-position estimation information and three-dimensional ranging information.
- For example, the three-dimensional real-time observation result may be a three-dimensional occupancy grid map.
- The acquisition unit acquires a prior map corresponding to the three-dimensional real-time observation result.
- The alignment unit aligns the three-dimensional real-time observation result with the prior map.
- After the alignment, the extension unit extends the three-dimensional real-time observation result based on the prior map.
- For example, an environment structure recognition unit that performs plane detection on the three-dimensional real-time observation result may further be provided, and the extension unit may extend the detected plane based on the information of the prior map by using the result of the plane detection.
- For example, the environment structure recognition unit may further perform semantic segmentation on the three-dimensional real-time observation result, and the extension unit may extend the plane only where the semantics are continuous, by using the result of the semantic segmentation.
- In this technology, the three-dimensional real-time observation result is aligned with the prior map and then extended based on the prior map. Therefore, by using the extended three-dimensional real-time observation result, the state of the unobserved area can be grasped in advance. For example, a flying object such as a drone can accurately calculate a somewhat long flight path at once in a global action plan, which enables high-speed autonomous flight.
- Another concept of this technology is a flying object including: a generation unit that generates a three-dimensional real-time observation result based on self-position estimation information and three-dimensional ranging information; an acquisition unit that acquires a prior map corresponding to the three-dimensional real-time observation result; an alignment unit that aligns the three-dimensional real-time observation result with the prior map; an extension unit that, after the alignment, extends the three-dimensional real-time observation result based on the prior map; and an action planning unit that sets a flight path based on the extended three-dimensional real-time observation result.
- In this technology, the generation unit generates the three-dimensional real-time observation result based on self-position estimation information and three-dimensional ranging information.
- The acquisition unit acquires a prior map corresponding to the three-dimensional real-time observation result.
- For example, the acquisition unit may acquire the prior map by communication from another flying object.
- In this case, for example, the prior map may be a map based on the three-dimensional real-time observation result generated by the other flying object.
- Further, in this case, for example, the prior map may be a map obtained by cutting the three-dimensional real-time observation result at a certain height and converting it into a bird's-eye view. Further, in this case, for example, the prior map may be a map obtained by reducing the resolution of the three-dimensional real-time observation result to a level at which communication is feasible.
- The alignment unit aligns the three-dimensional real-time observation result with the prior map.
- After the alignment, the extension unit extends the three-dimensional real-time observation result based on the prior map. Then, the action planning unit sets the flight path based on the extended three-dimensional real-time observation result.
- For example, an environment structure recognition unit that performs plane detection on the three-dimensional real-time observation result may further be provided, and the extension unit may extend the detected plane based on the information of the prior map by using the result of the plane detection.
- For example, the environment structure recognition unit may further perform semantic segmentation on the three-dimensional real-time observation result, and the extension unit may extend the plane only where the semantics are continuous, by using the result of the semantic segmentation.
- In this technology, the three-dimensional real-time observation result is aligned with the prior map, the three-dimensional real-time observation result is then extended based on the prior map, and the flight path is set based on this extended three-dimensional real-time observation result. Therefore, in a flying object such as a drone, a somewhat long flight path can be calculated accurately at once in a global action plan, and the flying object can fly autonomously at high speed.
- FIG. 1 schematically shows the autonomous flight operation of the drone 10.
- In the observation region 20, the drone 10 generates a three-dimensional real-time observation result, for example a three-dimensional occupancy grid map, based on self-position estimation information and three-dimensional ranging information. Further, in the unobserved region 30, the drone 10 extends the three-dimensional real-time observation result based on a prior map.
- The unobserved region 30 includes, for example, areas that cannot be observed because of obstacles and areas outside the measurement range of the sensors.
- The prior map is a simple map that describes rough information about the environment in which the drone 10 flies.
- For example, this prior map is a two-dimensional or three-dimensional map that shows the positions and sizes of walls, buildings, and the like. More specifically, it corresponds to a two-dimensional or three-dimensional map, a topographic map, a floor plan of a building, or the like stored on a server in the cloud.
- This prior map may be held by the drone 10 in its own storage. In order for the drone 10 to fly at high speed, it needs to hold a prior map covering a wide range. If the prior map is simple, its data size is small, so the drone 10 can hold a prior map covering a relatively wide range. This prior map may show only the approximate positions and sizes of obstacles.
- Alternatively, the prior map may always be stored on a server in the cloud, and the drone 10 may download the prior map of the required range from the server each time and use it. If the prior map is simple, its data size is small and it can be downloaded in a short time.
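- As an illustration of this download-on-demand approach, the following Python sketch fetches a prior-map tile for a square window around the drone's current position. The endpoint URL, query parameters, and tile format are hypothetical placeholders introduced here for illustration; they are not defined by this disclosure.

```python
import json
import urllib.parse
import urllib.request

# Hypothetical map-tile service; the URL and parameter names are illustrative only.
PRE_MAP_ENDPOINT = "https://example.com/pre_map"

def fetch_pre_map(lat: float, lon: float, half_extent_m: float = 200.0) -> dict:
    """Download a simple 2D prior map covering a square window around (lat, lon)."""
    query = urllib.parse.urlencode({
        "lat": lat,
        "lon": lon,
        "half_extent": half_extent_m,  # metres from the window centre to each edge
        "format": "simple2d",          # request the small, simplified map variant
    })
    with urllib.request.urlopen(f"{PRE_MAP_ENDPOINT}?{query}", timeout=5.0) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

A small, simplified tile like this keeps the payload compact, which is what makes per-flight, per-area downloading practical.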
- When extending the three-dimensional real-time observation result based on the prior map, the drone 10 first aligns the three-dimensional real-time observation result with the prior map.
- In this case, the three-dimensional real-time observation result is adjusted to the dimensionality of the prior map. For example, when the prior map is two-dimensional, the portion of the three-dimensional real-time observation result within a certain height range around the drone's own altitude is collapsed into two dimensions.
- The alignment with the map is performed using a well-known alignment method such as ICP (Iterative Closest Point) or NDT (Normal Distributions Transform).
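- For reference, the following is a minimal point-to-point ICP sketch in two dimensions using only NumPy. It is an illustrative simplification of the well-known algorithm, not the specific implementation of this embodiment; in practice a library implementation of ICP or NDT would normally be used.

```python
import numpy as np

def icp_2d(source: np.ndarray, target: np.ndarray, iters: int = 30):
    """Rigidly align 'source' (N,2) points to 'target' (M,2) points.

    Returns rotation R (2x2) and translation t (2,) with source @ R.T + t ~ target.
    """
    R, t = np.eye(2), np.zeros(2)
    src = source.copy()
    for _ in range(iters):
        # Nearest-neighbour correspondences (brute force, for clarity).
        dists = np.linalg.norm(src[:, None, :] - target[None, :, :], axis=2)
        matched = target[dists.argmin(axis=1)]
        # Best rigid transform for these correspondences (Kabsch / SVD).
        mu_s, mu_t = src.mean(axis=0), matched.mean(axis=0)
        H = (src - mu_s).T @ (matched - mu_t)
        U, _, Vt = np.linalg.svd(H)
        R_step = Vt.T @ U.T
        if np.linalg.det(R_step) < 0:   # guard against a reflection solution
            Vt[-1, :] *= -1
            R_step = Vt.T @ U.T
        t_step = mu_t - R_step @ mu_s
        src = src @ R_step.T + t_step
        R, t = R_step @ R, R_step @ t + t_step   # accumulate the overall transform
    return R, t
```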
- After the alignment, the drone 10 extends the three-dimensional real-time observation result based on the prior map.
- The method of this extension will be described.
- In this case, a plane is detected from the three-dimensional real-time observation result, and if a space corresponding to that plane is found on the prior map, the plane is extended.
- In this case, semantic segmentation is further performed on the three-dimensional real-time observation result, and its result is used so that the plane is extended only when the semantics are continuous.
- That is, when a space corresponding to the plane detected from the three-dimensional real-time observation result is found on the prior map, and the semantics (wall, road, ground, building, and so on) are continuous at the connection point between the three-dimensional real-time observation result and the prior map for that plane, the plane detected from the three-dimensional real-time observation result is extended based on the prior map.
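- The following Python sketch illustrates this decision rule on a simplified 2D grid. The grid encoding, label values, and function signature are assumptions made for illustration and are not defined by this disclosure.

```python
import numpy as np

FREE, OCCUPIED, UNKNOWN = 0, 1, -1   # illustrative cell states of the real-time grid

def extend_plane(observation: np.ndarray, prior_map: np.ndarray, semantics: np.ndarray,
                 plane_cells: list, plane_label: int, direction: tuple) -> np.ndarray:
    """Grow a detected plane into UNKNOWN cells of the observation.

    A cell is filled only if (a) the aligned prior map marks the corresponding cell
    as OCCUPIED, i.e. a space for the plane exists there, and (b) the semantic label
    at the connection cell matches the plane's label, i.e. the semantics are continuous.
    """
    out = observation.copy()
    dy, dx = direction
    for y, x in plane_cells:
        if semantics[y, x] != plane_label:      # (b) semantic continuity at the boundary
            continue
        ny, nx = y + dy, x + dx
        while (0 <= ny < out.shape[0] and 0 <= nx < out.shape[1]
               and out[ny, nx] == UNKNOWN
               and prior_map[ny, nx] == OCCUPIED):   # (a) the prior map supports the plane
            out[ny, nx] = OCCUPIED               # extend the wall/plane into the unobserved area
            ny, nx = ny + dy, nx + dx
    return out
```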
- FIG. 2 schematically shows an outline of alignment and expansion.
- FIG. 2A shows the three-dimensional real-time observation results observed by the drone 10.
- In the illustrated example, the ground and a wall are present in the three-dimensional real-time observation result.
- FIG. 2B shows a state in which the three-dimensional real-time observation result observed by the drone 10 has been aligned so as to match the prior map (two-dimensional in the illustrated example).
- This alignment is performed using well-known alignment techniques such as ICP and NDT, as described above.
- By this alignment, displacements of the walls, roads, and the like in the three-dimensional real-time observation result are corrected so as to match the prior map.
- FIG. 2C shows a state in which the three-dimensional real-time observation result observed by the drone 10 has been extended based on the prior map (two-dimensional in the illustrated example).
- In this case, the wall portion of the three-dimensional real-time observation result is detected as a plane, and since a space corresponding to this plane exists on the prior map, the wall portion of the three-dimensional real-time observation result is extended toward the prior-map side. In this way, the three-dimensional real-time observation result is extended.
- The drone 10 makes a global action plan based on the extended three-dimensional real-time observation result and sets a flight path 40 to the destination. Then, as a local action plan, the drone 10 creates the control information necessary for flying along the flight path 40.
- This control information includes information such as the speed and acceleration of the drone 10, as well as corrected route information based on obstacle determination.
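- As a rough illustration only, the control information described above could be carried in a structure such as the following; the field names are assumptions introduced here and are not defined by this disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class LocalControlInfo:
    """Per-cycle output of the local action plan (illustrative field names)."""
    target_speed_mps: float        # commanded speed along the flight path
    target_accel_mps2: float       # commanded acceleration
    obstacle_detected: bool = False                            # result of obstacle determination
    corrected_waypoints: list = field(default_factory=list)    # detour route, if any
```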
- FIG. 3 shows a configuration example of the drone 10.
- The drone 10 has a drone-mounted PC 100, a drone control unit 200, a sensor unit 300, and an external storage 400.
- The sensor unit 300 includes a stereo camera, LiDAR (Light Detection and Ranging), and the like.
- The external storage 400 stores the prior map.
- This prior map is a simple two-dimensional or three-dimensional map, a topographic map, a floor plan of a building, or the like, corresponding to the somewhat wide range in which the drone 10 flies.
- The prior map may be stored in the external storage 400 from the beginning, or the prior map of the required range may be acquired from a server in the cloud and stored in the external storage 400.
- The drone-mounted PC 100 has a self-position estimation unit 101, a three-dimensional ranging unit 102, a real-time observation result management unit 103, an environment structure recognition unit 104, a prior map acquisition unit 105, an alignment unit 106, an extension unit 107, a global action planning unit 108, and a local action planning unit 109.
- The self-position estimation unit 101 estimates the self-position based on the sensor output of the sensor unit 300. In this case, for example, the position relative to the starting position is estimated.
- The three-dimensional ranging unit 102 acquires depth information of the surrounding environment based on the sensor output of the sensor unit 300.
- The real-time observation result management unit 103 creates a three-dimensional real-time observation result (for example, a three-dimensional occupancy grid map) based on the self-position estimated by the self-position estimation unit 101 and the depth information of the surrounding environment obtained by the three-dimensional ranging unit 102. In this case, the three-dimensional real-time observation result is generated by accumulating the depth information of the surrounding environment together with the self-position.
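- A minimal sketch of this accumulation step is shown below: each depth point, expressed in the sensor frame, is transformed by the estimated self-pose into the world frame and marked as occupied in a voxel grid. The voxel resolution and the pose convention (a 4x4 homogeneous matrix) are assumptions made for illustration.

```python
import numpy as np

class OccupancyGrid3D:
    """Very small 3D occupancy grid keyed by integer voxel indices."""

    def __init__(self, resolution_m: float = 0.2):
        self.resolution = resolution_m
        self.occupied = set()   # set of (ix, iy, iz) voxel indices

    def integrate(self, pose_world_from_sensor: np.ndarray, points_sensor: np.ndarray) -> None:
        """Accumulate one depth measurement.

        pose_world_from_sensor: 4x4 homogeneous transform from the self-position estimate.
        points_sensor: (N, 3) depth points in the sensor frame.
        """
        homog = np.hstack([points_sensor, np.ones((len(points_sensor), 1))])
        points_world = (pose_world_from_sensor @ homog.T).T[:, :3]
        for idx in np.floor(points_world / self.resolution).astype(int):
            self.occupied.add(tuple(idx))
```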
- The environment structure recognition unit 104 recognizes the environmental structure based on the three-dimensional real-time observation result generated by the real-time observation result management unit 103. Specifically, it performs plane detection and semantic segmentation on the three-dimensional real-time observation result.
- The prior map acquisition unit 105 acquires, from the external storage 400, a prior map corresponding to the three-dimensional real-time observation result generated by the real-time observation result management unit 103.
- Because the three-dimensional real-time observation result is extended based on the prior map, the range of the prior map needs to be a wide range that includes the range of the three-dimensional real-time observation result.
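- This coverage requirement amounts to a bounding-box query grown by a margin, as in the following sketch; the margin value and the point-cloud interface are assumptions made for illustration.

```python
import numpy as np

def prior_map_window(observation_points: np.ndarray, margin_m: float = 50.0):
    """Return the (min_xy, max_xy) window the prior map must cover.

    The window is the axis-aligned bounding box of the observed points, projected
    to the ground plane and grown by a margin, so the extension step always has
    prior-map data beyond the observed boundary.
    """
    xy = observation_points[:, :2]
    return xy.min(axis=0) - margin_m, xy.max(axis=0) + margin_m
```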
- The alignment unit 106 refers to the results of the plane detection and semantic segmentation obtained by the environment structure recognition unit 104, and corrects the position of the three-dimensional real-time observation result so that it matches the prior map, using a well-known alignment method such as ICP or NDT (see FIG. 2(b)).
- The extension unit 107 extends the three-dimensional real-time observation result based on the prior map, using the results of the plane detection and semantic segmentation obtained by the environment structure recognition unit 104 (see FIG. 2(c)). In this case, if a space corresponding to a plane detected from the three-dimensional real-time observation result is found on the prior map, the plane is extended. Moreover, the plane is extended only when the semantics are continuous at the connection point between the three-dimensional real-time observation result and the prior map for that plane.
- The global action planning unit 108 makes a global action plan based on the extended three-dimensional real-time observation result obtained by the extension unit 107, and sets the flight path to the destination.
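- Any standard grid planner can serve as the global action plan once the extended observation result is available. The A* sketch below, run on a 2D slice of the grid, is one possible simplified choice shown for illustration; the disclosure does not specify a particular planning algorithm.

```python
import heapq

def astar(grid, start, goal):
    """A* on a 2D occupancy grid (0 = free, 1 = occupied); returns a list of cells."""
    h = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])       # Manhattan heuristic
    open_set = [(h(start, goal), 0, start, None)]
    came_from, g_best = {}, {start: 0}
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:                                   # already expanded
            continue
        came_from[cur] = parent
        if cur == goal:                                        # reconstruct the path
            path = [cur]
            while came_from[path[-1]] is not None:
                path.append(came_from[path[-1]])
            return path[::-1]
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dy, cur[1] + dx)
            if (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])
                    and grid[nxt[0]][nxt[1]] == 0
                    and g + 1 < g_best.get(nxt, float("inf"))):
                g_best[nxt] = g + 1
                heapq.heappush(open_set, (g + 1 + h(nxt, goal), g + 1, nxt, cur))
    return []                                                  # no route found
```

Because the extended observation result already fills in plausible structure for the unobserved area, a comparatively long path planned this way is less likely to be invalidated by obstacles that only become visible later.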
- The local action planning unit 109 creates the control information necessary for flying along the flight path set in the global action plan.
- The drone control unit 200 receives the control information obtained by the local action planning unit 109 of the drone-mounted PC 100, controls the motors so that the drone 10 flies along the set flight path, and drives the propellers.
- The flowchart of FIG. 4 shows an example of the processing procedure for redrawing the flight path.
- In step ST1, the drone-mounted PC 100 starts processing when the flight route redrawing management unit (not shown in FIG. 3) instructs route redrawing.
- For example, the flight route redrawing management unit instructs redrawing when the flight route has become unreasonable, for example when an unexpected large obstacle appears on the already set route.
- Alternatively, the flight route redrawing management unit instructs redrawing at fixed time intervals or after every fixed flight distance.
- In step ST2, the drone-mounted PC 100 newly generates a three-dimensional real-time observation result in the real-time observation result management unit 103 and updates the three-dimensional real-time observation result.
- In step ST3, the drone-mounted PC 100 acquires a two-dimensional or three-dimensional prior map corresponding to the updated real-time observation result from the external storage 400 by the prior map acquisition unit 105.
- In step ST4, the drone-mounted PC 100 recognizes the environmental structure from the three-dimensional real-time observation result in the environment structure recognition unit 104. Specifically, plane detection and semantic segmentation are performed on the three-dimensional real-time observation result.
- In step ST5, the drone-mounted PC 100 refers to the results of the plane detection and semantic segmentation in the alignment unit 106, corrects the position of the three-dimensional real-time observation result using a well-known alignment method such as ICP or NDT, and aligns it with the prior map.
- In step ST6, the drone-mounted PC 100 extends the three-dimensional real-time observation result based on the prior map, using the results of the plane detection and semantic segmentation, in the extension unit 107. In this case, if a space corresponding to a plane detected from the three-dimensional real-time observation result is found on the prior map, the plane is extended. Moreover, the plane is extended only when the semantics are continuous at the connection point between the three-dimensional real-time observation result and the prior map for that plane.
- In step ST7, the drone-mounted PC 100 makes a global action plan based on the extended three-dimensional real-time observation result in the global action planning unit 108, and sets a flight path to the destination. After that, the drone-mounted PC 100 ends the series of processes in step ST8.
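- Putting steps ST1 to ST8 together, the redraw procedure can be summarised as in the following Python sketch. Every attribute and method name here stands in for the corresponding unit in FIG. 3 and is an assumed placeholder, not an interface defined by this disclosure.

```python
def redraw_flight_path(drone_pc, destination):
    """One pass of the flight-path redraw procedure (steps ST2 to ST7), as a sketch."""
    # ST2: update the 3D real-time observation result from the latest sensing.
    observation = drone_pc.realtime_observation.update()
    # ST3: get the 2D/3D prior map covering (and exceeding) the observed range.
    prior_map = drone_pc.prior_map_acquisition.fetch(observation.bounds())
    # ST4: recognise the environment structure (plane detection + semantic segmentation).
    planes, semantics = drone_pc.environment_structure.recognize(observation)
    # ST5: align the observation to the prior map (e.g. with ICP or NDT).
    aligned = drone_pc.alignment.align(observation, prior_map, planes, semantics)
    # ST6: extend detected planes into unobserved areas where the prior map and semantics agree.
    extended = drone_pc.extension.extend(aligned, prior_map, planes, semantics)
    # ST7: run the global action plan on the extended observation result.
    return drone_pc.global_planner.plan(extended, destination)
```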
- As described above, the three-dimensional real-time observation result is aligned with the prior map and then extended based on the prior map, and a global action plan is made based on this extended three-dimensional real-time observation result to set the flight path. Therefore, the state of the unobserved area can be grasped in advance from the extended three-dimensional real-time observation result; in a flying object such as a drone, a somewhat long flight path can be calculated accurately at once in the global action plan, and high-speed autonomous flight of the drone 10 becomes possible.
- In the embodiment described above, an example has been shown in which the drone 10 acquires the prior map from the external storage 400.
- Alternatively, the drone 10 may acquire the prior map by communication from another drone 10A.
- FIG. 5 schematically shows the state in that case.
- The drone 10A is configured in the same manner as the drone 10.
- The drone 10A converts its three-dimensional real-time observation result into a simple map format and sends the resulting prior map to the drone 10.
- For example, this prior map is a map obtained by cutting the three-dimensional real-time observation result at a constant height and converting it into a bird's-eye view.
- Further, this prior map is a map obtained by reducing the resolution of the three-dimensional real-time observation result to a level at which the communication is feasible.
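- One possible, purely illustrative way to produce such a transmitted map from a 3D occupancy grid is sketched below: occupied voxels around a fixed cutting height are flattened into a 2D bird's-eye image, which is then block-downsampled to shrink it to a transmittable size. The grid layout (Z, Y, X) and the parameter values are assumptions.

```python
import numpy as np

def birds_eye_pre_map(occupancy: np.ndarray, z_index: int, z_band: int = 2,
                      downsample: int = 4) -> np.ndarray:
    """Convert a 3D occupancy grid (Z, Y, X) of 0/1 voxels into a coarse 2D map.

    Voxels within +/- z_band layers of the cutting height are projected down
    (logical OR), and the resulting image is block-downsampled so that it is
    small enough to transmit to another drone.
    """
    lo, hi = max(z_index - z_band, 0), min(z_index + z_band + 1, occupancy.shape[0])
    top_view = occupancy[lo:hi].any(axis=0).astype(np.uint8)   # bird's-eye slice
    h, w = top_view.shape
    h, w = h - h % downsample, w - w % downsample               # crop to a block multiple
    blocks = top_view[:h, :w].reshape(h // downsample, downsample,
                                      w // downsample, downsample)
    return blocks.any(axis=(1, 3)).astype(np.uint8)             # lower-resolution map
```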
- In the illustrated example there is one other drone 10A, but the number of other drones 10A that transmit the prior map to the drone 10 is not limited to one and may be two or more. As the number of other drones 10A increases, the range of the prior map transmitted to the drone 10 widens.
- The present technology can also have the following configurations.
- (1) An information processing apparatus including: a generation unit that generates a three-dimensional real-time observation result based on self-position estimation information and three-dimensional ranging information; an acquisition unit that acquires a prior map corresponding to the three-dimensional real-time observation result; an alignment unit that aligns the three-dimensional real-time observation result with the prior map; and an extension unit that, after the alignment, extends the three-dimensional real-time observation result based on the prior map.
- (2) The information processing apparatus according to (1), further including an environment structure recognition unit that performs plane detection on the three-dimensional real-time observation result, in which the extension unit extends a plane based on the information of the prior map by using a result of the plane detection.
- (3) The information processing apparatus according to (2), in which the environment structure recognition unit further performs semantic segmentation on the three-dimensional real-time observation result, and the extension unit extends the plane when the semantics are continuous, by using a result of the semantic segmentation.
- (4) The information processing apparatus according to any one of (1) to (3), in which the three-dimensional real-time observation result is a three-dimensional occupancy grid map.
- (5) An information processing method including: a procedure of generating a three-dimensional real-time observation result based on self-position estimation information and three-dimensional ranging information; a procedure of acquiring a prior map corresponding to the three-dimensional real-time observation result; a procedure of aligning the three-dimensional real-time observation result with the prior map; and a procedure of extending, after the alignment, the three-dimensional real-time observation result based on the prior map.
- (6) A program that causes a computer to function as: generation means for generating a three-dimensional real-time observation result based on self-position estimation information and three-dimensional ranging information; acquisition means for acquiring a prior map corresponding to the three-dimensional real-time observation result; alignment means for aligning the three-dimensional real-time observation result with the prior map; and extension means for extending, after the alignment, the three-dimensional real-time observation result based on the prior map.
- (7) A flying object including: a generation unit that generates a three-dimensional real-time observation result based on self-position estimation information and three-dimensional ranging information; an acquisition unit that acquires a prior map corresponding to the three-dimensional real-time observation result; an alignment unit that aligns the three-dimensional real-time observation result with the prior map; an extension unit that, after the alignment, extends the three-dimensional real-time observation result based on the prior map; and an action planning unit that sets a flight path based on the extended three-dimensional real-time observation result.
- (8) The flying object according to (7), in which the acquisition unit acquires the prior map by communication from another flying object.
- (9) The flying object according to (8), in which the prior map is a map based on the three-dimensional real-time observation result generated by the other flying object.
- (10) The flying object according to (9), in which the prior map is a map obtained by cutting the three-dimensional real-time observation result at a constant height and converting it into a bird's-eye view.
- (11) The flying object according to (9), in which the prior map is a map obtained by reducing the resolution of the three-dimensional real-time observation result to a level at which the communication is possible.
- (12) The flying object according to any one of (7) to (11), further including an environment structure recognition unit that performs plane detection on the three-dimensional real-time observation result, in which the extension unit extends a plane based on the information of the prior map by using a result of the plane detection.
- (13) The flying object according to (12), in which the environment structure recognition unit further performs semantic segmentation on the three-dimensional real-time observation result, and the extension unit extends the plane when the semantics are continuous, by using a result of the semantic segmentation.
- 100: Drone-mounted PC, 101: Self-position estimation unit, 102: Three-dimensional ranging unit, 103: Real-time observation result management unit, 104: Environment structure recognition unit, 105: Prior map acquisition unit, 106: Alignment unit, 107: Extension unit, 108: Global action planning unit, 109: Local action planning unit, 200: Drone control unit, 300: Sensor unit, 400: External storage
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Navigation (AREA)
- Traffic Control Systems (AREA)
- Instructional Devices (AREA)
Abstract
Description
The concept of the present technology resides in an information processing apparatus including: a generation unit that generates a three-dimensional real-time observation result based on self-position estimation information and three-dimensional ranging information; an acquisition unit that acquires a prior map corresponding to the three-dimensional real-time observation result; an alignment unit that aligns the three-dimensional real-time observation result with the prior map; and an extension unit that, after the alignment, extends the three-dimensional real-time observation result based on the prior map.
Another concept of the present technology resides in a flying object including: a generation unit that generates a three-dimensional real-time observation result based on self-position estimation information and three-dimensional ranging information; an acquisition unit that acquires a prior map corresponding to the three-dimensional real-time observation result; an alignment unit that aligns the three-dimensional real-time observation result with the prior map; an extension unit that, after the alignment, extends the three-dimensional real-time observation result based on the prior map; and an action planning unit that sets a flight path based on the extended three-dimensional real-time observation result.
1. Embodiment
2. Modifications
FIG. 1 schematically shows the autonomous flight operation of the drone 10. In the observation region 20, the drone 10 generates a three-dimensional real-time observation result, for example a three-dimensional occupancy grid map, based on self-position estimation information and three-dimensional ranging information. In the unobserved region 30, the drone 10 extends the three-dimensional real-time observation result based on a prior map. The unobserved region 30 includes, for example, areas that cannot be observed because of obstacles and areas outside the measurement range of the sensors.
FIG. 3 shows a configuration example of the drone 10. The drone 10 has a drone-mounted PC 100, a drone control unit 200, a sensor unit 300, and an external storage 400.
In the embodiment described above, an example in which the flying object is a drone has been shown. Although a detailed description is omitted, the present technology can be applied in the same way to other flying objects.
(1) An information processing apparatus including: a generation unit that generates a three-dimensional real-time observation result based on self-position estimation information and three-dimensional ranging information; an acquisition unit that acquires a prior map corresponding to the three-dimensional real-time observation result; an alignment unit that aligns the three-dimensional real-time observation result with the prior map; and an extension unit that, after the alignment, extends the three-dimensional real-time observation result based on the prior map.
(2) The information processing apparatus according to (1), further including an environment structure recognition unit that performs plane detection on the three-dimensional real-time observation result, in which the extension unit extends a plane based on the information of the prior map by using a result of the plane detection.
(3) The information processing apparatus according to (2), in which the environment structure recognition unit further performs semantic segmentation on the three-dimensional real-time observation result, and the extension unit extends the plane when the semantics are continuous, by using a result of the semantic segmentation.
(4) The information processing apparatus according to any one of (1) to (3), in which the three-dimensional real-time observation result is a three-dimensional occupancy grid map.
(5) An information processing method including: a procedure of generating a three-dimensional real-time observation result based on self-position estimation information and three-dimensional ranging information; a procedure of acquiring a prior map corresponding to the three-dimensional real-time observation result; a procedure of aligning the three-dimensional real-time observation result with the prior map; and a procedure of extending, after the alignment, the three-dimensional real-time observation result based on the prior map.
(6) A program that causes a computer to function as: generation means for generating a three-dimensional real-time observation result based on self-position estimation information and three-dimensional ranging information; acquisition means for acquiring a prior map corresponding to the three-dimensional real-time observation result; alignment means for aligning the three-dimensional real-time observation result with the prior map; and extension means for extending, after the alignment, the three-dimensional real-time observation result based on the prior map.
(7) A flying object including: a generation unit that generates a three-dimensional real-time observation result based on self-position estimation information and three-dimensional ranging information; an acquisition unit that acquires a prior map corresponding to the three-dimensional real-time observation result; an alignment unit that aligns the three-dimensional real-time observation result with the prior map; an extension unit that, after the alignment, extends the three-dimensional real-time observation result based on the prior map; and an action planning unit that sets a flight path based on the extended three-dimensional real-time observation result.
(8) The flying object according to (7), in which the acquisition unit acquires the prior map by communication from another flying object.
(9) The flying object according to (8), in which the prior map is a map based on the three-dimensional real-time observation result generated by the other flying object.
(10) The flying object according to (9), in which the prior map is a map obtained by cutting the three-dimensional real-time observation result at a constant height and converting it into a bird's-eye view.
(11) The flying object according to (9), in which the prior map is a map obtained by reducing the resolution of the three-dimensional real-time observation result to a level at which the communication is possible.
(12) The flying object according to any one of (7) to (11), further including an environment structure recognition unit that performs plane detection on the three-dimensional real-time observation result, in which the extension unit extends a plane based on the information of the prior map by using a result of the plane detection.
(13) The flying object according to (12), in which the environment structure recognition unit further performs semantic segmentation on the three-dimensional real-time observation result, and the extension unit extends the plane when the semantics are continuous, by using a result of the semantic segmentation.
20: Observation region
30: Unobserved region
100: Drone-mounted PC
101: Self-position estimation unit
102: Three-dimensional ranging unit
103: Real-time observation result management unit
104: Environment structure recognition unit
105: Prior map acquisition unit
106: Alignment unit
107: Extension unit
108: Global action planning unit
109: Local action planning unit
200: Drone control unit
300: Sensor unit
400: External storage
Claims (13)
- 1. An information processing apparatus comprising: a generation unit that generates a three-dimensional real-time observation result based on self-position estimation information and three-dimensional ranging information; an acquisition unit that acquires a prior map corresponding to the three-dimensional real-time observation result; an alignment unit that aligns the three-dimensional real-time observation result with the prior map; and an extension unit that, after the alignment, extends the three-dimensional real-time observation result based on the prior map.
- 2. The information processing apparatus according to claim 1, further comprising an environment structure recognition unit that performs plane detection on the three-dimensional real-time observation result, wherein the extension unit extends a plane based on the information of the prior map by using a result of the plane detection.
- 3. The information processing apparatus according to claim 2, wherein the environment structure recognition unit further performs semantic segmentation on the three-dimensional real-time observation result, and the extension unit extends the plane when the semantics are continuous, by using a result of the semantic segmentation.
- 4. The information processing apparatus according to claim 1, wherein the three-dimensional real-time observation result is a three-dimensional occupancy grid map.
- 5. An information processing method comprising: a procedure of generating a three-dimensional real-time observation result based on self-position estimation information and three-dimensional ranging information; a procedure of acquiring a prior map corresponding to the three-dimensional real-time observation result; a procedure of aligning the three-dimensional real-time observation result with the prior map; and a procedure of extending, after the alignment, the three-dimensional real-time observation result based on the prior map.
- 6. A program that causes a computer to function as: generation means for generating a three-dimensional real-time observation result based on self-position estimation information and three-dimensional ranging information; acquisition means for acquiring a prior map corresponding to the three-dimensional real-time observation result; alignment means for aligning the three-dimensional real-time observation result with the prior map; and extension means for extending, after the alignment, the three-dimensional real-time observation result based on the prior map.
- 7. A flying object comprising: a generation unit that generates a three-dimensional real-time observation result based on self-position estimation information and three-dimensional ranging information; an acquisition unit that acquires a prior map corresponding to the three-dimensional real-time observation result; an alignment unit that aligns the three-dimensional real-time observation result with the prior map; an extension unit that, after the alignment, extends the three-dimensional real-time observation result based on the prior map; and an action planning unit that sets a flight path based on the extended three-dimensional real-time observation result.
- 8. The flying object according to claim 7, wherein the acquisition unit acquires the prior map by communication from another flying object.
- 9. The flying object according to claim 8, wherein the prior map is a map based on the three-dimensional real-time observation result generated by the other flying object.
- 10. The flying object according to claim 9, wherein the prior map is a map obtained by cutting the three-dimensional real-time observation result at a constant height and converting it into a bird's-eye view.
- 11. The flying object according to claim 9, wherein the prior map is a map obtained by reducing the resolution of the three-dimensional real-time observation result to a level at which the communication is possible.
- 12. The flying object according to claim 7, further comprising an environment structure recognition unit that performs plane detection on the three-dimensional real-time observation result, wherein the extension unit extends a plane based on the information of the prior map by using a result of the plane detection.
- 13. The flying object according to claim 12, wherein the environment structure recognition unit further performs semantic segmentation on the three-dimensional real-time observation result, and the extension unit extends the plane when the semantics are continuous, by using a result of the semantic segmentation.
Priority Applications (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US17/767,347 US11854210B2 (en) | 2019-10-25 | 2020-10-14 | Information processing apparatus, information processing method, program, and flight object |
| JP2021554313A JP7468543B2 (ja) | 2019-10-25 | 2020-10-14 | 情報処理装置、情報処理方法、プログラムおよび飛行体 |
| EP20878165.8A EP4011764A4 (en) | 2019-10-25 | 2020-10-14 | INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, PROGRAM AND FLYWHEEL |
| CN202080072994.1A CN114556255B (zh) | 2019-10-25 | 2020-10-14 | 信息处理装置、信息处理方法、程序和飞行体 |
| US18/386,915 US20240070873A1 (en) | 2019-10-25 | 2023-11-03 | Information processing apparatus, information processing method, program, and flight object |
| JP2024052955A JP7722503B2 (ja) | 2019-10-25 | 2024-03-28 | 情報処理方法、情報処理装置およびプログラム |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2019193922 | 2019-10-25 | ||
| JP2019-193922 | 2019-10-25 |
Related Child Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/767,347 A-371-Of-International US11854210B2 (en) | 2019-10-25 | 2020-10-14 | Information processing apparatus, information processing method, program, and flight object |
| US18/386,915 Continuation US20240070873A1 (en) | 2019-10-25 | 2023-11-03 | Information processing apparatus, information processing method, program, and flight object |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2021079794A1 true WO2021079794A1 (ja) | 2021-04-29 |
Family
ID=75620457
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2020/038704 Ceased WO2021079794A1 (ja) | 2019-10-25 | 2020-10-14 | 情報処理装置、情報処理方法、プログラムおよび飛行体 |
Country Status (5)
| Country | Link |
|---|---|
| US (2) | US11854210B2 (ja) |
| EP (1) | EP4011764A4 (ja) |
| JP (2) | JP7468543B2 (ja) |
| CN (1) | CN114556255B (ja) |
| WO (1) | WO2021079794A1 (ja) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2023062747A1 (ja) * | 2021-10-13 | 2023-04-20 | 株式会社Acsl | 無人航空機を用いて点検のために風力発電装置のブレードを撮像するためのシステム、方法、プログラム及びプログラムを記憶した記憶媒体 |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR102524995B1 (ko) * | 2023-02-02 | 2023-04-25 | 국방과학연구소 | 전자 장치의 지도 생성 방법 |
| US20250046197A1 (en) * | 2023-08-04 | 2025-02-06 | Microavia International Limited | Generating a flight path when the uav is offline based on terrain data |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH07332980A (ja) * | 1994-06-02 | 1995-12-22 | Tech Res & Dev Inst Of Japan Def Agency | 地形地図作成方法および装置 |
| JP2005092820A (ja) * | 2003-09-19 | 2005-04-07 | Sony Corp | 環境認識装置及び方法、経路計画装置及び方法、並びにロボット装置 |
| JP2007249632A (ja) | 2006-03-16 | 2007-09-27 | Fujitsu Ltd | 障害物のある環境下で自律移動する移動ロボットおよび移動ロボットの制御方法。 |
| JP2019045892A (ja) | 2017-08-29 | 2019-03-22 | ソニー株式会社 | 情報処理装置、情報処理方法、プログラム、及び、移動体 |
Family Cites Families (20)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP3598601B2 (ja) * | 1995-08-23 | 2004-12-08 | 神鋼電機株式会社 | 無人車の運行管理システム |
| JP6150531B2 (ja) | 2013-01-21 | 2017-06-21 | 三菱重工業株式会社 | 地形情報取得装置、地形情報取得システム、地形情報取得方法及びプログラム |
| US9779508B2 (en) * | 2014-03-26 | 2017-10-03 | Microsoft Technology Licensing, Llc | Real-time three-dimensional reconstruction of a scene from a single camera |
| US11370422B2 (en) * | 2015-02-12 | 2022-06-28 | Honda Research Institute Europe Gmbh | Method and system in a vehicle for improving prediction results of an advantageous driver assistant system |
| US10008123B2 (en) * | 2015-10-20 | 2018-06-26 | Skycatch, Inc. | Generating a mission plan for capturing aerial images with an unmanned aerial vehicle |
| US11461912B2 (en) * | 2016-01-05 | 2022-10-04 | California Institute Of Technology | Gaussian mixture models for temporal depth fusion |
| US10803634B2 (en) * | 2016-07-19 | 2020-10-13 | Image Recognition Technology, Llc | Reconstruction of three dimensional model of an object compensating for object orientation changes between surface or slice scans |
| WO2018073879A1 (ja) * | 2016-10-17 | 2018-04-26 | エスゼット ディージェイアイ テクノロジー カンパニー リミテッド | 飛行経路生成方法、飛行経路生成システム、飛行体、プログラム、及び記録媒体 |
| WO2018144929A1 (en) * | 2017-02-02 | 2018-08-09 | Infatics, Inc. (DBA DroneDeploy) | System and methods for improved aerial mapping with aerial vehicles |
| CN107145578B (zh) * | 2017-05-08 | 2020-04-10 | 深圳地平线机器人科技有限公司 | 地图构建方法、装置、设备和系统 |
| US10599161B2 (en) * | 2017-08-08 | 2020-03-24 | Skydio, Inc. | Image space motion planning of an autonomous vehicle |
| US11393114B1 (en) * | 2017-11-08 | 2022-07-19 | AI Incorporated | Method and system for collaborative construction of a map |
| CN108124489B (zh) * | 2017-12-27 | 2023-05-12 | 达闼机器人股份有限公司 | 信息处理方法、装置、云处理设备以及计算机程序产品 |
| US11614746B2 (en) | 2018-01-05 | 2023-03-28 | Irobot Corporation | Mobile cleaning robot teaming and persistent mapping |
| US11500099B2 (en) | 2018-03-14 | 2022-11-15 | Uatc, Llc | Three-dimensional object detection |
| US10957100B2 (en) * | 2018-04-06 | 2021-03-23 | Korea University Research And Business Foundation | Method and apparatus for generating 3D map of indoor space |
| US11094112B2 (en) * | 2018-09-06 | 2021-08-17 | Foresight Ai Inc. | Intelligent capturing of a dynamic physical environment |
| CN109163718A (zh) * | 2018-09-11 | 2019-01-08 | 江苏航空职业技术学院 | 一种面向建筑群的无人机自主导航方法 |
| US10937325B2 (en) * | 2018-12-27 | 2021-03-02 | Intel Corporation | Collision avoidance system, depth imaging system, vehicle, obstacle map generator, and methods thereof |
| US11600022B2 (en) * | 2020-08-28 | 2023-03-07 | Unity Technologies Sf | Motion capture calibration using drones |
-
2020
- 2020-10-14 US US17/767,347 patent/US11854210B2/en active Active
- 2020-10-14 EP EP20878165.8A patent/EP4011764A4/en not_active Withdrawn
- 2020-10-14 JP JP2021554313A patent/JP7468543B2/ja active Active
- 2020-10-14 WO PCT/JP2020/038704 patent/WO2021079794A1/ja not_active Ceased
- 2020-10-14 CN CN202080072994.1A patent/CN114556255B/zh active Active
-
2023
- 2023-11-03 US US18/386,915 patent/US20240070873A1/en active Pending
-
2024
- 2024-03-28 JP JP2024052955A patent/JP7722503B2/ja active Active
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH07332980A (ja) * | 1994-06-02 | 1995-12-22 | Tech Res & Dev Inst Of Japan Def Agency | 地形地図作成方法および装置 |
| JP2005092820A (ja) * | 2003-09-19 | 2005-04-07 | Sony Corp | 環境認識装置及び方法、経路計画装置及び方法、並びにロボット装置 |
| JP2007249632A (ja) | 2006-03-16 | 2007-09-27 | Fujitsu Ltd | 障害物のある環境下で自律移動する移動ロボットおよび移動ロボットの制御方法。 |
| JP2019045892A (ja) | 2017-08-29 | 2019-03-22 | ソニー株式会社 | 情報処理装置、情報処理方法、プログラム、及び、移動体 |
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP4011764A4 |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2023062747A1 (ja) * | 2021-10-13 | 2023-04-20 | 株式会社Acsl | 無人航空機を用いて点検のために風力発電装置のブレードを撮像するためのシステム、方法、プログラム及びプログラムを記憶した記憶媒体 |
Also Published As
| Publication number | Publication date |
|---|---|
| JP7722503B2 (ja) | 2025-08-13 |
| CN114556255A (zh) | 2022-05-27 |
| EP4011764A1 (en) | 2022-06-15 |
| JPWO2021079794A1 (ja) | 2021-04-29 |
| US20220392079A1 (en) | 2022-12-08 |
| EP4011764A4 (en) | 2022-09-28 |
| JP7468543B2 (ja) | 2024-04-16 |
| JP2024094326A (ja) | 2024-07-09 |
| CN114556255B (zh) | 2025-07-25 |
| US11854210B2 (en) | 2023-12-26 |
| US20240070873A1 (en) | 2024-02-29 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7722503B2 (ja) | 情報処理方法、情報処理装置およびプログラム | |
| US12111178B2 (en) | Distributed device mapping | |
| CN108827306B (zh) | 一种基于多传感器融合的无人机slam导航方法及系统 | |
| JP6827627B2 (ja) | ビークル環境マップを生成および更新するための方法およびシステム | |
| CN106441275A (zh) | 一种机器人规划路径的更新方法及装置 | |
| JP2020079997A (ja) | 情報処理装置、情報処理方法、及びプログラム | |
| JP2014139538A (ja) | 地形情報取得装置、地形情報取得システム、地形情報取得方法及びプログラム | |
| CN106548486A (zh) | 一种基于稀疏视觉特征地图的无人车位置跟踪方法 | |
| WO2016157428A1 (ja) | 計測装置、計測方法、及び、プログラム | |
| CN105844692A (zh) | 基于双目立体视觉的三维重建装置、方法、系统及无人机 | |
| MX2024001880A (es) | Vehiculo aereo no tripulado (uav) y metodo para operar el uav. | |
| JP2021157204A (ja) | 移動体および移動体の制御方法 | |
| CN109032162A (zh) | 一种基于激光雷达的无人机避障系统及控制方法 | |
| CN118424254A (zh) | 一种基于无人机的未知封闭空间的自主化建图检测方法 | |
| CN109163718A (zh) | 一种面向建筑群的无人机自主导航方法 | |
| Asadi et al. | An integrated aerial and ground vehicle (UAV-UGV) system for automated data collection for indoor construction sites | |
| Ngo et al. | Uav platforms for autonomous navigation in gps-denied environments for search and rescue missions | |
| WO2023219058A1 (ja) | 情報処理方法、情報処理装置及び情報処理システム | |
| JP2019053561A (ja) | 制御装置、および制御方法、プログラム、並びに移動体 | |
| JP7707926B2 (ja) | 撮像制御装置および撮像制御方法 | |
| WO2021049227A1 (ja) | 情報処理システム、情報処理装置及び情報処理プログラム | |
| CN119805479A (zh) | 基于激光雷达和uwb测距的矿洞巡检无人机及定位系统 | |
| JP7351609B2 (ja) | 経路探索装置及びプログラム | |
| Deng et al. | SLAM: Depth image information for mapping and inertial navigation system for localization | |
| CN118392195A (zh) | 一种基于改进3dvfh算法的无人机路径数据规划方法 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20878165 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2021554313 Country of ref document: JP Kind code of ref document: A |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 20878165.8 Country of ref document: EP |
|
| ENP | Entry into the national phase |
Ref document number: 2020878165 Country of ref document: EP Effective date: 20220311 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| WWG | Wipo information: grant in national office |
Ref document number: 202080072994.1 Country of ref document: CN |