
WO2021081995A1 - Data processing method and device, data storage device, and mobile control system - Google Patents

Data processing method and device, data storage device, and mobile control system

Info

Publication number
WO2021081995A1
Authority
WO
WIPO (PCT)
Prior art keywords
coordinate system
data processing
target
processing device
graphic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/CN2019/115112
Other languages
English (en)
Chinese (zh)
Inventor
邸健
耿畅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd filed Critical SZ DJI Technology Co Ltd
Priority to CN201980029545.6A priority Critical patent/CN112136091A/zh
Priority to PCT/CN2019/115112 priority patent/WO2021081995A1/fr
Publication of WO2021081995A1 publication Critical patent/WO2021081995A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/10 Simultaneous control of position or course in three dimensions
    • G05D 1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S 19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S 19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S 19/42 Determining position

Definitions

  • This application relates to the field of mobile control technology, and in particular to a data processing method, data storage device, data processing device, mobile control system, and computer-readable storage medium.
  • The restricted area of a drone can be limited by setting an electronic fence, and the coordinates of the boundary points of the restricted area are stored in the drone.
  • To use these boundary point coordinates, a series of operations such as coordinate transformations and equation calculations often needs to be performed on the UAV itself, which requires a large amount of computation and occupies too many of the UAV's processing resources.
  • the embodiments of the present application provide a data processing method, data storage device, data processing device, mobile control system, and computer-readable storage medium.
  • The data processing method of an embodiment of the present application includes: a data storage device determines a target graphic according to the shape of a target area; obtains characterization parameters of the target graphic in a first coordinate system; and generates target data corresponding to the target area according to the characterization parameters.
  • The target graphic is determined from the target area, the characterization parameters of the target graphic in the first coordinate system are obtained, and the target data is generated according to the characterization parameters, so that the information of the target area is preprocessed by the data processing method. Data processing equipment such as a drone therefore does not need to perform processing operations such as calculating the characterization parameters or deriving the target data from them, which saves the processing resources of the data processing equipment.
  • A data processing method of an embodiment of the present application includes: a data processing device acquires target data corresponding to a target area, where the target data is generated by a data storage device according to the characterization parameters of the target graphic corresponding to the target area in a first coordinate system; determines the positional relationship between the data processing device and the target area according to the position information of the current position of the data processing device in the first coordinate system; and controls the movement of the data processing device according to the positional relationship.
  • The data storage device of the embodiment of the present application includes a processor and a memory, the memory is used to store program instructions or data, and the processor is used to read the program instructions to perform the following operations: determine the target graphic according to the shape of the target area; obtain the characterization parameters of the target graphic in the first coordinate system; and generate target data corresponding to the target area according to the characterization parameters.
  • The data processing device of the embodiment of the present application includes a processor and a memory, the memory is used to store program instructions or data, and the processor is used to read the program instructions to perform the following operations: obtain target data corresponding to a target area, the target data being generated by the data storage device according to the characterization parameters of the target graphic corresponding to the target area in the first coordinate system; determine the positional relationship between the data processing device and the target area according to the position information of the current position of the data processing device in the first coordinate system; and control the movement of the data processing device according to the positional relationship.
  • the mobile control system of the embodiment of the present application includes a data storage device and a data processing device;
  • The data storage device includes a processor and a memory, the memory is used to store program instructions or data, and the processor is used to read the program instructions to perform the following operations: determine the target graphic according to the shape of the target area; obtain the characterization parameters of the target graphic in the first coordinate system; and generate target data corresponding to the target area according to the characterization parameters.
  • The data processing device includes a processor and a memory, the memory is used to store program instructions or data, and the processor is used to read the program instructions to perform the following operations: obtain target data corresponding to the target area, the target data being generated by the data storage device according to the characterization parameters of the target graphic corresponding to the target area in the first coordinate system; determine the positional relationship between the data processing device and the target area according to the position information of the current position of the data processing device in the first coordinate system; and control the movement of the data processing device according to the positional relationship.
  • The non-volatile computer-readable storage medium of the embodiment of the present application contains computer-executable instructions. When the computer-executable instructions are executed by one or more processors, the processor executes the operations of: determining the target graphic according to the shape of the target area, obtaining the characterization parameters of the target graphic in the first coordinate system, and generating target data corresponding to the target area according to the characterization parameters; or the processor executes the operations of: obtaining the target data corresponding to the target area, the target data being generated by the data storage device according to the characterization parameters of the target graphic corresponding to the target area in the first coordinate system, determining the positional relationship between the data processing device and the target area according to the position information of the current position of the data processing device in the first coordinate system, and controlling the movement of the data processing device according to the positional relationship.
  • FIG. 1 is a schematic diagram of a scenario in which a data processing method according to an embodiment of the present application is executed
  • FIG. 2 is a schematic diagram of the structure of a mobile control system according to an embodiment of the present application.
  • FIG. 3, FIG. 5, FIG. 6, FIG. 9 to FIG. 13, FIG. 15, FIG. 18, FIG. 20, and FIG. 22 to FIG. 24 are schematic flowcharts of the data processing method of the embodiments of the present application;
  • FIG. 4, FIG. 7, FIG. 8, FIG. 14, FIG. 16, FIG. 17, FIG. 19, and FIG. 21 are schematic diagrams of the principle of implementing the data processing method of the embodiments of the present application;
  • FIG. 25 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present application.
  • the data storage device 10 of the embodiment of the present application may be a terminal, a server, or a remote control, and the terminal may be a terminal such as a mobile phone, a watch, or a head-mounted display device, which is not limited herein.
  • the data storage device 10 may be in communication connection with a data processing device 20, and the data processing device 20 may be a mobile platform, such as a flying device, an unmanned aerial vehicle, an unmanned vehicle, an unmanned ship, a robot, and the like.
  • the data storage device 10 may send data to the data processing device 20, and the data storage device 10 may also receive data sent by the data processing device 20.
  • For example, the data processing device 20 may be the microcontroller unit (MCU) where the flight control system (FC) on the flight device is located, or may be an application processor (AP) on the flight device.
  • FIG. 2 is only an example in which the data storage device 10 and the data processing device 20 are located on different devices.
  • the data storage device 10 and the data processing device 20 may also be located on the same device.
  • In other examples, the data storage device 10 may also be an application processor (AP) on a flight device, and the data processing device 20 may be the flight control processor on the flight device, that is, the MCU where the flight control system (FC) is located.
  • the data storage device 10 and the data processing device 20 are both application processors on the flight device.
  • the data storage device 10 and the data processing device 20 are both flight control processors on the flight device.
  • This embodiment takes the case where the data storage device 10 is a server and the data processing device 20 is a flying device as an example for description. It can be understood that the specific forms of the data storage device 10 and the data processing device 20 may also be other forms, which is not limited here.
  • the target area 200 may be used to characterize an area where the data processing device 20 is prohibited from entering.
  • For example, the target area 200 may be a restricted-flying area, and the data processing device 20 cannot enter the target area 200, so as to avoid affecting other activities in the target area 200.
  • the target area 200 may be an obstacle area, and the data processing device 20 cannot enter the obstacle area to ensure the safety of the data processing device 20 when it moves.
  • The following takes the case where the target area 200 is a restricted-flying area as an illustrative example.
  • the data processing method of the embodiment of the present application includes the steps:
  • 011 Determine the target graphic according to the shape of the target area 200;
  • 012 Obtain the characterization parameters of the target graphic in the first coordinate system; and
  • 013 Generate target data corresponding to the target area 200 according to the characterization parameters.
  • the data storage device 10 of the embodiment of the present application includes a processor 11 and a memory 12.
  • The memory 12 is used to store program instructions or data, and the processor 11 is used to read the program instructions to perform the data processing method of the embodiments of the present application.
  • For example, the processor 11 can be used to perform step 011, step 012, and step 013; that is, the processor 11 can be used to determine the target graphic according to the shape of the target area 200, obtain the characterization parameters of the target graphic in the first coordinate system, and generate target data corresponding to the target area 200 according to the characterization parameters.
  • The target graphic is determined from the target area 200, the characterization parameters of the target graphic in the first coordinate system are obtained, and the target data is generated according to the characterization parameters, so that the information of the target area 200 is preprocessed by the data processing method. After the data processing device 20 obtains the target data, it can obtain the information of the target area 200 without performing processing operations such as calculating the characterization parameters or deriving the target data from them, which saves processing resources on the data processing device 20.
  • the target area 200 has a certain shape.
  • For example, the planar shape of the target area 200 may be roughly candy-shaped, as shown in FIG. 2; when the target area 200 is a building, the planar shape of the target area 200 may be roughly rectangular; when the target area 200 is a mountain, the planar shape of the target area 200 may be roughly elliptical, and so on.
  • a target graphic corresponding to the target area 200 can be determined.
  • the target graphic can completely overlap the boundary of the target area 200, and the target graphic can also be larger than the target area 200 and cover the target area 200.
  • the target graphic may be one or more of a polygon, a circle, or an ellipse.
  • the shape of the target graphic may be shown in FIG. 4, and the target graphic 300 is in the shape of a candy.
  • the first coordinate system may be any coordinate system that can be used to characterize the target graphic 300.
  • For example, the first coordinate system may be a NED (North-East-Down) coordinate system or a body coordinate system defined by the target graphic 300 itself.
  • The characterization parameters of the target graphic 300 may include the coordinates of the boundary points of the target graphic 300 in the first coordinate system, the connection relationship between the boundary points, and the like, which is not limited here. In the example shown in FIG. 4, the characterization parameters of the target graphic 300 in the first coordinate system X1-O1-Y1 include a1, b1, c1, d1, e1, f1, g1, h1, i1, j1, k1, and l1.
  • The storage order of the coordinates of the 12 boundary points a1, b1, c1, d1, e1, f1, g1, h1, i1, j1, k1, and l1 can represent the connection relationship of the 12 boundary points. For example, according to the storage order, it can be determined that a1 and b1 are connected, b1 and c1 are connected, c1 and d1 are connected, d1 and e1 are connected, e1 and f1 are connected, f1 and g1 are connected, g1 and h1 are connected, h1 and i1 are connected, i1 and j1 are connected, j1 and k1 are connected, and k1 and l1 are connected.
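  • As a minimal illustrative sketch in Python (not part of the original disclosure), the connection relationship implied by the storage order can be derived as follows; treating the last stored boundary point as connected back to the first, to close the polygon, is an assumption made here:

```python
def edges_from_storage_order(points, close_loop=True):
    """points: boundary point coordinates (x, y) in storage order, e.g. a1..l1."""
    # Consecutive points in the storage order are connected: a1-b1, b1-c1, ...
    edges = [(points[i], points[i + 1]) for i in range(len(points) - 1)]
    if close_loop and len(points) > 2:
        edges.append((points[-1], points[0]))  # assumed closing edge l1-a1
    return edges
```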
  • In step 013, the characterization parameters obtained in step 012 can be processed to obtain the target data.
  • the target data may include characterization parameters, and the target data may also include parameters other than the characterization parameters, which is not limited here.
  • the data processing method further includes step 014: sending target data to the data processing device 20.
  • the processor 11 of the data storage device 10 may also be used to implement step 014, that is, the processor 11 may be used to send target data to the data processing device 20.
  • After the data processing device 20 receives the target data, the target graphic 300 can be restored from the target data, and the positional relationship between the data processing device 20 and the target area 200 corresponding to the target graphic 300 can also be determined.
  • the data processing device 20 itself does not need to perform steps 011, 012, and 013 to obtain target data, so as to save the processing resources of the data processing device 20 itself.
  • In some embodiments, before step 012, the data processing method further includes the steps:
  • 015 Convert the target graphic 300 from the third coordinate system to the second coordinate system based on the graphic origin in the target graphic 300; and
  • 016 Convert the target graphic 300 from the second coordinate system to the first coordinate system.
  • In some embodiments, the processor 11 can also be used to implement step 015 and step 016; that is, the processor 11 can be used to convert the target graphic 300 from the third coordinate system to the second coordinate system based on the graphic origin in the target graphic 300, and to convert the target graphic 300 from the second coordinate system to the first coordinate system.
  • The initial coordinate system of the target graphic 300 may not be the first coordinate system, and the initial characterization parameters of the target graphic 300 may not be its characterization parameters in the first coordinate system. Therefore, the target graphic 300 can first be converted to the first coordinate system, so as to subsequently obtain the characterization parameters of the target graphic 300 in the first coordinate system.
  • the second coordinate system and the third coordinate system can be any coordinate system, which is not limited here.
  • This embodiment takes the case where the third coordinate system X3-O3-Y3 is the GPS coordinate system, the second coordinate system X2-O2-Y2 is the NED coordinate system, and the first coordinate system is the body coordinate system as an example.
  • the target graphic 300 is first converted from the GPS coordinate system to the NED coordinate system.
  • The boundary of the target area 200 can be represented by GPS coordinates, such as the GPS coordinates of the boundary of an airport runway, so the target graphic 300 can also be represented in GPS coordinates first; for example, in the GPS coordinate system X3-O3-Y3 shown in FIG. 7, the target graphic 300 is characterized by coordinates a3, b3, c3, d3, e3, f3, g3, h3, i3, j3, k3, and l3.
  • To perform the conversion, the graphic origin can be selected from the target graphic 300 first, and the graphic origin is used as the conversion origin to perform the coordinate system conversion.
  • any boundary point of the target graphic 300 can be selected as the graphic origin for conversion, so that after conversion, the selected boundary point on the target graphic 300 will pass through the coordinate origin of the NED coordinate system.
  • the coordinate a3 is selected as the origin of the graphic, and the target graphic 300 is converted from the GPS coordinate system X3-O3-Y3 to the NED coordinate system X2-O2-Y2 with the coordinate a3 as the graphic origin.
  • the target graphic 300 can be characterized by coordinates a2, b2, c2, d2, e2, f2, g2, h2, i2, j2, k2, and l2. Specifically, how to convert the target graphic 300 from the GPS coordinate system to the NED coordinate system based on the graphic origin can be obtained according to the conversion relationship between the GPS coordinate system and the NED coordinate system itself, which will not be repeated here.
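  • The document leaves the GPS-to-NED conversion to the known relationship between the two coordinate systems. As a minimal sketch only (a flat-earth approximation valid over small areas, with an assumed WGS-84 radius; not the patent's formulation), a GPS point can be mapped to north/east offsets relative to the graphic origin a3 so that a3 itself lands on the NED origin:

```python
import math

EARTH_RADIUS_M = 6378137.0  # assumed WGS-84 equatorial radius

def gps_to_ned(lat_deg, lon_deg, origin_lat_deg, origin_lon_deg):
    """Flat-earth approximation: north/east offsets (metres) from the graphic origin."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    lat0, lon0 = math.radians(origin_lat_deg), math.radians(origin_lon_deg)
    north = (lat - lat0) * EARTH_RADIUS_M
    east = (lon - lon0) * EARTH_RADIUS_M * math.cos(lat0)
    return north, east  # the graphic origin maps to (0, 0), as described above
```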
  • the target graphic 300 is converted from the NED coordinate system to the body coordinate system.
  • The body coordinate system can be a coordinate system determined in any manner; for example, it can be a Cartesian coordinate system, and it is the body coordinate system of the target graphic 300 itself.
  • The coordinate axes of the body coordinate system can be determined according to the target graphic 300, so that the target graphic 300 is located in one quadrant of the body coordinate system. This facilitates the subsequent acquisition of the characterization parameters of the target graphic 300 in the body coordinate system and the determination of the relative relationship between the target graphic 300 and another position, reducing the amount of subsequent calculation on the data. For example, in the example shown in FIG. 4, the target graphic 300 is located in the first quadrant of the body coordinate system X1-O1-Y1. In other examples, other body coordinate systems can also be determined so that the target graphic 300 is located in the second, third, or fourth quadrant of that body coordinate system, which is not limited here.
  • the conversion method of the target graphic 300 from the NED coordinate system to the body coordinate system may be represented by a conversion parameter.
  • For example, the conversion parameter may indicate the angle θ by which the target graphic 300 is rotated counterclockwise from the NED coordinate system to the body coordinate system. In the examples shown in FIG. 4 and FIG. 8, the target graphic 300 is converted from the NED coordinate system X2-O2-Y2 to the body coordinate system X1-O1-Y1, and the coordinates a1, b1, c1, d1, e1, f1, g1, h1, i1, j1, k1, and l1 in the body coordinate system X1-O1-Y1 can be used to represent the target graphic 300.
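  • A minimal sketch of applying the conversion parameter θ (my own convention, since the document does not state whether the rotation acts on the axes or on the points, nor fix its sign): each NED point of the target graphic is rotated counterclockwise by θ to obtain its body-frame coordinates.

```python
import math

def ned_to_body(north, east, theta_rad):
    """Rotate a point given in the NED frame counterclockwise by theta (radians)."""
    x_body = north * math.cos(theta_rad) - east * math.sin(theta_rad)
    y_body = north * math.sin(theta_rad) + east * math.cos(theta_rad)
    return x_body, y_body
```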
  • In some embodiments, step 013 includes step 0131: generating the target data according to the coordinates of the graphic origin in the GPS coordinate system, the conversion parameter for converting the target graphic 300 from the NED coordinate system to the body coordinate system, and the characterization parameters in the body coordinate system.
  • In some embodiments, the processor 11 can be used to implement step 0131, that is, the processor 11 can be used to generate the target data according to the coordinates of the graphic origin in the GPS coordinate system, the conversion parameter for converting the target graphic 300 from the NED coordinate system to the body coordinate system, and the characterization parameters in the body coordinate system.
  • In this way, the target data is generated according to the coordinates of the graphic origin in the GPS coordinate system, the conversion parameter for converting the target graphic 300 from the NED coordinate system to the body coordinate system, and the characterization parameters, so that the data processing device 20 can convert the target graphic 300 into any one of the body coordinate system, the NED coordinate system, or the GPS coordinate system according to the target data, which facilitates the subsequent determination of the positional relationship between the data processing device 20 and the target area 200.
  • a piece of target data obtained by implementing step 0131 can be ⁇ the coordinates of the origin of the graphic in the GPS coordinate system, conversion parameters, and characterization parameters ⁇ .
  • For example, combining the examples in FIG. 4, FIG. 7, and FIG. 8, such a piece of target data can be {a3, θ, characterization parameters}.
  • the characterization parameters vary according to the shape of different target graphics 300.
  • For example, when the target graphic 300 is a circle, the characterization parameters may include the coordinates of the center of the circle in the body coordinate system and the radius of the circle; when the target graphic 300 is a polygon, the characterization parameters can be the normal vector of each side of the polygon and the coordinates of each intersection point (vertex) of the polygon.
  • In some embodiments, when the target graphic 300 is a polygon, step 012 includes step 0121: obtaining the normal vector of each side of the polygon and the coordinates of each intersection point of the polygon in the body coordinate system. Accordingly, the processor 11 can also be used to implement step 0121, that is, the processor 11 can be used to obtain the normal vector of each side of the polygon and the coordinates of each intersection point of the polygon in the body coordinate system.
  • For example, the target data may be {a3, θ, each side of the polygon}, where each side of the polygon is represented by one of its endpoints and the normal vector of that side.
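  • A minimal sketch of computing such a per-side representation (endpoint plus unit normal) from vertices stored in order; assuming counterclockwise vertex order, (dy, -dx) points outward, but that orientation convention is mine, not the document's:

```python
import math

def polygon_side_normals(vertices):
    """vertices: (x, y) points in counterclockwise order. Returns [(endpoint, unit_normal), ...]."""
    sides = []
    n = len(vertices)
    for i in range(n):
        (x1, y1), (x2, y2) = vertices[i], vertices[(i + 1) % n]
        dx, dy = x2 - x1, y2 - y1
        length = math.hypot(dx, dy) or 1.0      # guard against degenerate sides
        normal = (dy / length, -dx / length)    # outward normal for CCW ordering
        sides.append(((x1, y1), normal))
    return sides
```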
  • In some embodiments, before step 012, the data processing method further includes step 017: converting the target graphic 300 from the second coordinate system to the first coordinate system based on the graphic origin in the target graphic 300. In this embodiment, the second coordinate system is the GPS coordinate system and the first coordinate system is the NED coordinate system.
  • In some embodiments, the processor 11 may be used to implement step 017, that is, the processor 11 may be used to convert the target graphic 300 from the GPS coordinate system to the NED coordinate system based on the graphic origin in the target graphic 300.
  • The initial coordinate system of the target graphic 300 may not be the NED coordinate system, and the initial characterization parameters of the target graphic 300 may not be its characterization parameters in the NED coordinate system. Therefore, the target graphic 300 can first be converted to the NED coordinate system, so as to subsequently obtain the characterization parameters of the target graphic 300 in the NED coordinate system.
  • The first coordinate system and the second coordinate system can be any coordinate systems, which is not limited here. This embodiment takes the case where the first coordinate system X2-O2-Y2 is the NED coordinate system and the second coordinate system X3-O3-Y3 is the GPS coordinate system as an example for illustration.
  • The boundary of the target area 200 can be represented by GPS coordinates, such as the GPS coordinates of the boundary of an airport runway, so the target graphic 300 can also be represented in GPS coordinates first; for example, in the GPS coordinate system X3-O3-Y3 shown in FIG. 7, the target graphic 300 is characterized by coordinates a3, b3, c3, d3, e3, f3, g3, h3, i3, j3, k3, and l3.
  • the graphic origin can be selected first, and the graphic origin is used as the conversion origin for conversion.
  • any boundary point of the target graphic 300 can be selected as the graphic origin for conversion, so that after conversion, the selected boundary point on the target graphic 300 will pass through the coordinate origin of the NED coordinate system.
  • For example, the coordinate a3 is selected as the graphic origin, and the target graphic 300 is converted from the GPS coordinate system X3-O3-Y3 to the NED coordinate system X2-O2-Y2 with the coordinate a3 as the graphic origin. After conversion, the target graphic 300 can be characterized by coordinates a2, b2, c2, d2, e2, f2, g2, h2, i2, j2, k2, and l2. How to convert the target graphic 300 from the GPS coordinate system to the NED coordinate system based on the graphic origin can be obtained according to the conversion relationship between the GPS coordinate system and the NED coordinate system itself, which will not be repeated here.
  • In some embodiments, step 013 includes step 0132: generating target data corresponding to the target area 200 according to the characterization parameters and the coordinates of the graphic origin in the GPS coordinate system.
  • Accordingly, the processor 11 can be used to implement step 0132, that is, the processor 11 can be used to generate target data corresponding to the target area 200 according to the characterization parameters and the coordinates of the graphic origin in the GPS coordinate system.
  • In step 0132, the target data is generated according to the coordinates of the graphic origin in the GPS coordinate system and the characterization parameters, so that the data processing device 20 can convert the target graphic 300 into either the NED coordinate system or the GPS coordinate system according to the target data.
  • For example, a piece of target data obtained by implementing step 0132 may be {coordinates of the graphic origin in the GPS coordinate system, characterization parameters}; combining the examples in FIG. 7 and FIG. 8, such a piece of target data may be {a3, characterization parameters}.
  • the specific form of the characterizing parameter can refer to the description of step 0131 and step 0121 above, which will not be repeated here.
  • The data processing method of an embodiment of the present application, performed by the data processing device 20, includes the steps:
  • 021 Obtain target data corresponding to the target area 200, the target data being generated by the data storage device 10 according to the characterization parameters of the target graphic 300 corresponding to the target area 200 in the first coordinate system;
  • 022 Determine the positional relationship between the data processing device 20 and the target area 200 according to the position information of the current position of the data processing device 20 in the first coordinate system; and
  • 023 Control the movement of the data processing device 20 according to the positional relationship.
  • the data processing device 20 of the embodiment of the present application includes a processor 21 and a memory 22.
  • The memory 22 is used to store program instructions or data, and the processor 21 is used to read the program instructions to perform the data processing method of the embodiments of the present application.
  • For example, the processor 21 can be used to perform step 021, step 022, and step 023; that is, the processor 21 can be used to: obtain the target data corresponding to the target area 200, the target data being generated by the data storage device 10 according to the characterization parameters of the target graphic 300 corresponding to the target area 200 in the first coordinate system; determine the positional relationship between the data processing device 20 and the target area 200 according to the position information of the current position of the data processing device 20 in the first coordinate system; and control the movement of the data processing device 20 according to the positional relationship.
  • Through step 021, step 022, and step 023, after the data processing device 20 obtains the target data, it can obtain the characterization parameters of the target graphic 300 corresponding to the target area 200 in the first coordinate system without performing the steps of converting the target graphic 300 and calculating the characterization parameters, and it can further control the movement of the data processing device 20 according to the positional relationship between the data processing device 20 and the target area 200. This reduces the data processing burden on the data processing device 20 and saves its processing resources.
  • In step 021, the target data corresponding to the target area 200 is obtained.
  • the target data may be sent by the data storage device 10 to the data processing device 20 immediately, or may be stored in the memory 22 of the data processing device 20 in advance.
  • the target area 200 may include a restricted-flying area or an obstacle area.
  • the target data can be generated by any of the steps shown in FIG. 1 to FIG. 12, and will not be repeated here.
  • In step 022, the position information of the current position of the data processing device 20 in the first coordinate system can be obtained first. Since the position of the target graphic 300 in the first coordinate system can be obtained from the target data, the relationship between the current position and the target graphic 300 can be determined with both expressed in the same first coordinate system, and the positional relationship between the data processing device 20 and the target area 200 can then be further determined.
  • For example, in the first coordinate system X1-O1-Y1, the target graphic 300 is characterized by the coordinates a1, b1, c1, d1, e1, f1, g1, h1, i1, j1, k1, and l1, and the position information of the data processing device 20 is characterized by the coordinate P1; by judging the relationship between the coordinate P1 and the target graphic 300, the positional relationship between the data processing device 20 and the target area 200 can be determined.
  • In step 023, the movement of the data processing device 20 is controlled according to the positional relationship, so that the strategy for controlling the movement of the data processing device 20 is adapted to the positional relationship.
  • In some embodiments, the target data further includes the coordinates of the graphic origin of the target graphic 300 in the third coordinate system and the conversion parameter for converting the target graphic 300 from the second coordinate system to the first coordinate system, and before step 022 the data processing method further includes the steps:
  • 024 Convert the coordinates of the current position of the data processing device 20 in the third coordinate system to the second coordinate system according to the coordinates of the graphic origin in the third coordinate system; and
  • 025 Convert the coordinates of the current position of the data processing device 20 in the second coordinate system to the first coordinate system according to the conversion parameter.
  • Accordingly, when the target data includes the coordinates of the graphic origin of the target graphic 300 in the third coordinate system and the conversion parameter for converting the target graphic 300 from the second coordinate system to the first coordinate system, the processor 21 may also be used to perform step 024 and step 025; that is, the processor 21 may be used to convert the coordinates of the current position of the data processing device 20 in the third coordinate system to the second coordinate system according to the coordinates of the graphic origin in the third coordinate system, and to convert the coordinates of the current position of the data processing device 20 in the second coordinate system to the first coordinate system according to the conversion parameter.
  • When the current position of the data processing device 20 is acquired, it may not initially be represented in the first coordinate system. Therefore, the current position can be converted to the first coordinate system so that both the current position and the target graphic 300 are expressed in the first coordinate system, which facilitates the determination of the positional relationship between the data processing device 20 and the target area 200.
  • For example, the target data may be in the form of {coordinates of the graphic origin of the target graphic 300 in the third coordinate system, conversion parameter for converting the target graphic 300 from the second coordinate system to the first coordinate system, characterization parameters of the target graphic 300 in the first coordinate system} (such as the target data obtained by implementing step 0131 above).
  • the first coordinate system, the second coordinate system, and the third coordinate system may be arbitrary coordinate systems, and there is no limitation here. Please refer to Figure 14, Figure 16, and Figure 17.
  • This embodiment takes the case where the third coordinate system X3-O3-Y3 is the GPS coordinate system, the second coordinate system X2-O2-Y2 is the NED coordinate system, and the first coordinate system X1-O1-Y1 is the body coordinate system of the target graphic 300 as an example for illustrative description.
  • In step 024, the current position of the data processing device 20 is first converted from the third coordinate system to the second coordinate system.
  • the coordinates of the current position in the third coordinate system can be obtained first, for example, the GPS coordinates of the current position can be obtained first.
  • the GPS coordinates can be represented by the coordinate P3 in the third coordinate system X3-O3-Y3 shown in FIG. 16 .
  • During the conversion, the coordinates of the graphic origin of the target graphic 300 in the third coordinate system, carried in the target data, can be used as the coordinate origin, so that the target graphic 300 and the current position are converted with the same coordinate origin and the relationship between the target graphic 300 and the current position does not change after the conversion.
  • For example, the current position is represented by the coordinate P3 in the third coordinate system X3-O3-Y3; after conversion to the second coordinate system X2-O2-Y2, the current position can be represented by the coordinate P2.
  • In step 025, the coordinates of the current position of the data processing device 20 in the second coordinate system are converted to the first coordinate system according to the conversion parameter.
  • The conversion parameter can be obtained from the target data; it is the same conversion parameter used for converting the target graphic 300 from the second coordinate system to the first coordinate system, so the target graphic 300 and the current position are converted with the same conversion parameter and the relationship between the target graphic 300 and the current position does not change after the conversion.
  • For example, the current position can be represented by the coordinate P2 in the second coordinate system X2-O2-Y2; after conversion to the first coordinate system X1-O1-Y1, the current position can be represented by the coordinate P1, and the coordinate P1 is the position information of the current position of the data processing device 20 in the first coordinate system X1-O1-Y1. Since the target data already includes the characterization parameters of the target graphic 300 in the first coordinate system X1-O1-Y1, the graphical relationship between the current position represented by the coordinate P1 and the target graphic 300 can be obtained in the first coordinate system X1-O1-Y1, and this graphical relationship can be used to determine the positional relationship between the data processing device 20 and the target area 200.
  • Through step 024 and step 025, the current position is converted from the GPS coordinate system to the NED coordinate system and then from the NED coordinate system to the body coordinate system. The coordinate axes of the body coordinate system can be chosen appropriately so that the graphical relationship between the current position and the target graphic 300 is easy to calculate, which saves the processing resources of the data processing device 20.
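  • A minimal, self-contained sketch of this two-step conversion of the current position (step 024 followed by step 025), reusing the flat-earth GPS-to-NED approximation and the counterclockwise rotation convention assumed in the sketches above; the helper name and conventions are illustrative only:

```python
import math

EARTH_RADIUS_M = 6378137.0  # assumed WGS-84 equatorial radius

def current_position_to_body(lat_deg, lon_deg, origin_lat_deg, origin_lon_deg, theta_rad):
    """P3 (GPS) -> P2 (NED about the graphic origin) -> P1 (body frame)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    lat0, lon0 = math.radians(origin_lat_deg), math.radians(origin_lon_deg)
    # Step 024: GPS -> NED using the graphic origin carried in the target data.
    north = (lat - lat0) * EARTH_RADIUS_M
    east = (lon - lon0) * EARTH_RADIUS_M * math.cos(lat0)
    # Step 025: NED -> body frame using the same conversion parameter theta.
    x1 = north * math.cos(theta_rad) - east * math.sin(theta_rad)
    y1 = north * math.sin(theta_rad) + east * math.cos(theta_rad)
    return x1, y1  # position information P1 in the body coordinate system
```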
  • In some embodiments, the target data further includes the coordinates of the graphic origin of the target graphic 300 in the second coordinate system, and before step 022 the data processing method further includes step 026: converting the coordinates of the current position of the data processing device 20 in the second coordinate system to the first coordinate system according to the coordinates of the graphic origin in the second coordinate system.
  • Accordingly, when the target data includes the coordinates of the graphic origin of the target graphic 300 in the second coordinate system, the processor 21 can also be used to implement step 026, that is, the processor 21 can be used to convert the coordinates of the current position of the data processing device 20 in the second coordinate system to the first coordinate system according to the coordinates of the graphic origin in the second coordinate system.
  • For example, the target data may be in the form of {coordinates of the graphic origin of the target graphic 300 in the second coordinate system, characterization parameters of the target graphic 300 in the first coordinate system} (for example, the target data obtained by implementing step 0132 above).
  • the first coordinate system and the second coordinate system can be arbitrary coordinate systems, and there is no limitation here. Please refer to Figure 16 and Figure 19.
  • This embodiment takes the second coordinate system X3-O3-Y3 as the GPS coordinate system and the first coordinate system X2-O2-Y2 as the NED coordinate system as an example. In the first coordinate system X2-O2-Y2, the target graphic 300 can be characterized by coordinates a2, b2, c2, d2, e2, f2, g2, h2, i2, j2, k2, and l2.
  • In step 026, the current position of the data processing device 20 is converted from the second coordinate system to the first coordinate system.
  • the coordinates of the current position in the second coordinate system can be acquired first, for example, the GPS coordinates of the current position can be acquired first.
  • the GPS coordinates can be represented by the coordinate P3 in the second coordinate system X3-O3-Y3 shown in FIG. 16 .
  • During the conversion, the coordinates of the graphic origin of the target graphic 300 in the second coordinate system, carried in the target data, can be used as the coordinate origin, so that the target graphic 300 and the current position are converted with the same coordinate origin and the relationship between the target graphic 300 and the current position does not change after the conversion.
  • For example, the current position is represented by the coordinate P3 in the second coordinate system X3-O3-Y3; after conversion to the first coordinate system X2-O2-Y2, the current position can be represented by the coordinate P2, and the coordinate P2 is the position information of the current position of the data processing device 20 in the first coordinate system X2-O2-Y2.
  • Since the target data already includes the characterization parameters of the target graphic 300 in the first coordinate system X2-O2-Y2, the graphical relationship between the current position represented by the coordinate P2 and the target graphic 300 can be obtained in the first coordinate system X2-O2-Y2, and this graphical relationship can be used to determine the positional relationship between the data processing device 20 and the target area 200.
  • step 022 includes step 0221: determining whether the data processing device 20 falls within the target area 200 according to the position information of the current position of the data processing device 20 in the first coordinate system.
  • the processor 21 may be used to implement step 0221, that is, the processor 21 may be used to determine the data processing device according to the position information of the current position of the data processing device 20 in the first coordinate system Whether 20 falls within the target area 200.
  • According to the position information of the current position of the data processing device 20 in the first coordinate system, it can be determined whether the current position falls within the target graphic 300 in the first coordinate system, so as to determine whether the data processing device 20 falls within the target area 200 and thereby facilitate the subsequent selection of the movement strategy of the data processing device 20.
  • If the position information indicates that the current position is within the target graphic 300, it can be determined that the data processing device 20 falls within the target area 200; if the position information indicates that the current position is outside the target graphic 300, it can be determined that the data processing device 20 does not fall within the target area 200.
  • For example, the target graphic 300 is represented by the graphic enclosed by the coordinates a1, b1, c1, d1, e1, f1, g1, h1, i1, j1, k1, and l1, and the position information is represented by the coordinate P1, so it is only necessary to determine whether the coordinate P1 falls within the range of the target graphic 300.
  • The coordinate axes of the first coordinate system can be appropriately determined so that the target graphic 300 is located in one quadrant of the first coordinate system. As shown in FIG. 21, the target graphic 300 falls in the first quadrant of the first coordinate system X1-O1-Y1; when determining the relationship between the coordinate P1 and the target graphic 300, it can first be determined whether at least one of the coordinates of P1 on the X1 axis or the Y1 axis is negative. If so, it can be readily judged that the coordinate P1 does not fall within the range of the target graphic 300, which reduces the amount of calculation.
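  • A minimal sketch of step 0221 (my own implementation, not the patent's): the first-quadrant shortcut described above is applied first, then a standard ray-casting point-in-polygon test is run on the body-frame boundary points:

```python
def falls_within_target(p, vertices):
    """p: (x, y) in the body frame; vertices: boundary points of the target graphic in order."""
    x, y = p
    if x < 0 or y < 0:       # the graphic lies in the first quadrant, so a negative
        return False         # coordinate already rules out falling within it
    inside = False
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        if (y1 > y) != (y2 > y):                       # edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```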
  • In some embodiments, the data processing method further includes the steps:
  • 027 If the data processing device 20 does not fall within the target area 200, determine, according to the position information of the current position of the data processing device 20 in the first coordinate system, the boundary of the target area 200 that is closest to the data processing device 20; and
  • 028 Determine the closest distance between the data processing device 20 and the target area 200 according to the distance between the position information and the closest boundary.
  • Accordingly, the processor 21 can also be used to implement step 027 and step 028; that is, if the data processing device 20 does not fall within the target area 200, the processor 21 can determine, according to the position information of the current position of the data processing device 20 in the first coordinate system, the boundary of the target area 200 that is closest to the data processing device 20, and determine the closest distance between the data processing device 20 and the target area 200 according to the distance between the position information and that closest boundary.
  • In this way, the closest distance between the data processing device 20 and the target area 200 can be obtained without calculating the distances between the data processing device 20 and all of the boundaries, which reduces the amount of calculation and saves the processing resources of the data processing device 20.
  • For example, the boundary of the target graphic 300 closest to the coordinate P1 is d1e1, which means that the boundary of the target area 200 corresponding to d1e1 is closest to the data processing device 20, and the closest distance can be obtained by calculating only the distance between the data processing device 20 and that corresponding boundary.
  • multiple regions can be divided in the first coordinate system X1-O1-Y1, such as the regions R1, R2, R3, R4, R5, R6, R7, R8, and R9 as shown in FIG. 21.
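  • The patent avoids checking every boundary by pre-dividing the first coordinate system into regions such as R1 to R9; the brute-force sketch below (mine, for illustration only) just shows the underlying point-to-segment distance and closest-boundary computation:

```python
import math

def point_segment_distance(p, a, b):
    """Distance from point p to the segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))                          # clamp onto the segment
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def closest_boundary(p, vertices):
    """Return (closest_edge, closest_distance) for a point outside the polygon."""
    n = len(vertices)
    edges = [(vertices[i], vertices[(i + 1) % n]) for i in range(n)]
    return min(((e, point_segment_distance(p, *e)) for e in edges), key=lambda t: t[1])
```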
  • In some embodiments, when the first coordinate system is the body coordinate system, after step 022, the data processing method further includes step 029: converting the positional relationship between the data processing device 20 and the target area 200 to the NED coordinate system according to that positional relationship and the conversion parameter, the conversion parameter being the parameter for converting the target graphic 300 from the NED coordinate system to the body coordinate system.
  • Accordingly, when the first coordinate system is the body coordinate system, after implementing step 022, the processor 21 can also be used to implement step 029: converting the positional relationship between the data processing device 20 and the target area 200 to the NED coordinate system according to that positional relationship and the conversion parameter, the conversion parameter being the parameter for converting the target graphic 300 from the NED coordinate system to the body coordinate system.
  • In the body coordinate system, it can be judged whether the data processing device 20 falls within the target area 200; however, when it is necessary to determine the current movement direction of the data processing device 20 and the relative orientation of the data processing device 20 and the target area 200, this needs to be done in the NED coordinate system to ensure that the movement of the data processing device 20 can be controlled correctly.
  • In step 029, the positional relationship between the data processing device 20 and the target area 200 can be obtained from the relationship between the position information of the current position of the data processing device 20 in the first coordinate system and the target graphic 300.
  • the conversion parameters can be obtained directly from the target data.
  • step 023 includes step 0231: when the positional relationship is different, different control strategies are used to control the movement of the data processing device 20.
  • the processor 21 may be used to implement step 0231, that is, the processor 21 may be used to control the movement of the data processing device 20 using different control strategies when the positional relationship is different.
  • different control strategies are adopted to control the movement of the data processing device 20, so that the relationship between the data processing device 20 and the target area 200 can be better maintained.
  • the ways in which different positional relationships correspond to different control strategies include one or more of the following:
  • When the data processing device 20 is located within the target graphic 300, the data processing device 20 is controlled to land, so that it does not continue to move within the target area 200. When the data processing device 20 is located outside the target graphic 300 and the closest distance between the data processing device 20 and the boundary of the target area 200 is less than a distance threshold, the data processing device 20 may be controlled to move away from the target graphic 300, so as to increase the distance between the data processing device 20 and the target area 200. When the data processing device 20 is located outside the target graphic 300 and the closest distance between the data processing device 20 and the boundary of the target area 200 is less than the distance threshold, the data processing device 20 may instead be controlled to move along the direction of the boundary, so that it does not get any closer to the target area 200. When the data processing device 20 is located outside the target graphic 300 and the closest distance between the data processing device 20 and the boundary is less than the distance threshold, the data processing device 20 may also be controlled to change its movement route, re-planning the route to avoid the target area 200.
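  • A minimal sketch of selecting among the control strategies listed above; the threshold value and which alternative applies are configuration choices, since the document only states that one or more of these strategies may be used:

```python
def choose_control_action(inside_target, closest_distance, distance_threshold,
                          strategy="move_away"):
    if inside_target:
        return "land"                      # device is within the target area
    if closest_distance < distance_threshold:
        if strategy == "follow_boundary":
            return "move_along_boundary"   # stop getting closer to the area
        if strategy == "replan":
            return "replan_route"          # re-plan a route that avoids the area
        return "move_away_from_target"     # default: increase the distance
    return "continue"                      # far enough away, keep moving normally
```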
  • the mobile control system 100 includes the data storage device 10 and the data processing device 20 described in any of the foregoing embodiments.
  • This application also discloses a non-volatile computer-readable storage medium 400 containing computer-executable instructions 401.
  • When the computer-executable instructions 401 are executed by the processor 500, the processor 500 executes the data processing method of any embodiment of this application.
  • For example, when the processor executes the computer-executable instructions, the processor executes the steps:
  • 011 Determine the target graphic according to the shape of the target area 200;
  • 012 Obtain the characterization parameters of the target graphic in the first coordinate system; and
  • 013 Generate target data corresponding to the target area 200 according to the characterization parameters.
  • For another example, when the processor executes the computer-executable instructions, the processor executes the steps:
  • 021 Obtain target data corresponding to the target area 200, the target data being generated by the data storage device 10 according to the characterization parameters of the target graphic corresponding to the target area 200 in the first coordinate system;
  • 022 Determine the positional relationship between the data processing device 20 and the target area 200 according to the position information of the current position of the data processing device 20 in the first coordinate system; and
  • 023 Control the movement of the data processing device 20 according to the positional relationship.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Navigation (AREA)
  • Processing Or Creating Images (AREA)

Abstract

Data processing method, data storage device, data processing device, mobile control system, and computer-readable storage medium. The data processing method comprises the steps of: determining a target graphic according to the shape of a target area; obtaining characterization parameters of the target graphic in a first coordinate system; and generating target data corresponding to the target area according to the characterization parameters.
PCT/CN2019/115112 2019-11-01 2019-11-01 Data processing method and device, data storage device, and mobile control system Ceased WO2021081995A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201980029545.6A CN112136091A (zh) 2019-11-01 2019-11-01 数据处理方法及设备、数据存储设备、移动控制系统
PCT/CN2019/115112 WO2021081995A1 (fr) 2019-11-01 2019-11-01 Procédé et dispositif de traitement de données, dispositif de stockage de données et système de commande de mouvement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/115112 WO2021081995A1 (fr) 2019-11-01 2019-11-01 Procédé et dispositif de traitement de données, dispositif de stockage de données et système de commande de mouvement

Publications (1)

Publication Number Publication Date
WO2021081995A1 true WO2021081995A1 (fr) 2021-05-06

Family

ID=73850164

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/115112 Ceased WO2021081995A1 (fr) 2019-11-01 2019-11-01 Procédé et dispositif de traitement de données, dispositif de stockage de données et système de commande de mouvement

Country Status (2)

Country Link
CN (1) CN112136091A (fr)
WO (1) WO2021081995A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114114349A (zh) * 2021-11-18 2022-03-01 杭州海康威视数字技术股份有限公司 数据处理方法、装置、电子设备及计算机程序产品

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8965679B2 (en) * 2012-06-11 2015-02-24 Honeywell International Inc. Systems and methods for unmanned aircraft system collision avoidance
CN106249753A (zh) * 2016-09-05 2016-12-21 广州极飞科技有限公司 对无人机进行控制的方法、控制装置及无人机
CN106461396A (zh) * 2014-04-17 2017-02-22 深圳市大疆创新科技有限公司 适于限飞区域的飞行控制
CN107407938A (zh) * 2015-03-31 2017-11-28 深圳市大疆创新科技有限公司 针对限飞区域的开放平台
CN109814455A (zh) * 2019-01-31 2019-05-28 拓攻(南京)机器人有限公司 一种无人机的禁飞控制方法、装置、设备以及存储介质

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018006216A1 (fr) * 2016-07-04 2018-01-11 SZ DJI Technology Co., Ltd. Support de fonctionnement aérien et gestion en temps réel
KR102859653B1 (ko) * 2018-03-29 2025-09-16 소니그룹주식회사 신호 처리 장치 및 신호 처리 방법, 컴퓨터 판독가능 매체, 그리고 이동체

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8965679B2 (en) * 2012-06-11 2015-02-24 Honeywell International Inc. Systems and methods for unmanned aircraft system collision avoidance
CN106461396A (zh) * 2014-04-17 2017-02-22 深圳市大疆创新科技有限公司 适于限飞区域的飞行控制
CN107407938A (zh) * 2015-03-31 2017-11-28 深圳市大疆创新科技有限公司 针对限飞区域的开放平台
CN106249753A (zh) * 2016-09-05 2016-12-21 广州极飞科技有限公司 对无人机进行控制的方法、控制装置及无人机
CN109814455A (zh) * 2019-01-31 2019-05-28 拓攻(南京)机器人有限公司 一种无人机的禁飞控制方法、装置、设备以及存储介质

Also Published As

Publication number Publication date
CN112136091A (zh) 2020-12-25

Similar Documents

Publication Publication Date Title
US10466058B2 (en) Navigation for vehicles
US11789467B2 (en) Method, apparatus, terminal, and storage medium for elevation surrounding flight control
CN116830057A (zh) 无人机(uav)集群控制
WO2017211029A1 (fr) Procédé et dispositif de planification de trajectoire de vol pour véhicule aérien sans pilote
WO2018120350A1 (fr) Procédé et dispositif de positionnement de véhicule aérien sans pilote
WO2021081960A1 (fr) Procédé, dispositif et système de planification d'itinéraire et support de stockage
US11221635B2 (en) Aerial vehicle heading control method and apparatus and electronic device
CN105892487B (zh) 一种无人机8字形航迹控制方法
WO2019173981A1 (fr) Procédé et dispositif de commande d'aéronef sans pilote, aéronef sans pilote, système et support d'enregistrement
WO2020155425A1 (fr) Procédé de commande d'exclusion aérienne, appareil et dispositif pour véhicule aérien sans pilote, et support d'informations
WO2020220195A1 (fr) Procédé de commande de véhicule aérien sans pilote, dispositif et système de pulvérisation et véhicule aérien sans pilote et support d'informations
WO2018120351A1 (fr) Procédé et dispositif de positionnement de véhicule aérien sans pilote
WO2022247498A1 (fr) Surveillance de véhicule aérien sans pilote
WO2021087750A1 (fr) Procédé et dispositif de planification d'itinéraire de véhicule aérien sans pilote
CN107735737A (zh) 一种航点编辑方法、装置、设备及飞行器
WO2020000127A1 (fr) Procédé de commande de suivi de chemin de navigation, dispositif, robot mobile et système
CN111930143B (zh) 一种无人机飞行路径生成方法、装置、无人机及存储介质
WO2022226720A1 (fr) Procédé de planification de trajet, dispositif de planification de trajet et support
CN107450586B (zh) 航路的调整方法和系统以及无人机系统
WO2023273415A1 (fr) Procédé et appareil de positionnement basés sur un véhicule aérien sans pilote, support de stockage, dispositif électronique et produit
CN108268048A (zh) 无人机训飞操控方法和无人机训飞操控装置
CN119759083A (zh) 一种基于大型语言模型的无人机自主目标搜寻系统及方法
WO2020237422A1 (fr) Procédé d'arpentage aérien, aéronef et support d'informations
WO2021081995A1 (fr) Procédé et dispositif de traitement de données, dispositif de stockage de données et système de commande de mouvement
CN112149467B (zh) 飞机集群执行任务的方法和长机

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19950711

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19950711

Country of ref document: EP

Kind code of ref document: A1