WO2015107623A1 - Management system and position specifying method - Google Patents
- Publication number
- WO2015107623A1 (PCT/JP2014/050499)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- vehicle
- information
- specifying
- feature
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/14—Traffic control systems for road vehicles indicating individual free spaces in parking areas
- G08G1/145—Traffic control systems for road vehicles indicating individual free spaces in parking areas where the indication depends on the parking areas
- G08G1/146—Traffic control systems for road vehicles indicating individual free spaces in parking areas where the indication depends on the parking areas where the parking area is a limited parking space, e.g. parking garage, restricted space
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
- G01C21/1656—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/14—Traffic control systems for road vehicles indicating individual free spaces in parking areas
- G08G1/141—Traffic control systems for road vehicles indicating individual free spaces in parking areas with means giving the indication of available parking spaces
- G08G1/142—Traffic control systems for road vehicles indicating individual free spaces in parking areas with means giving the indication of available parking spaces external to the vehicles
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/04—Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
Definitions
- the present invention relates to a management system and a location specifying method.
- For specifying vehicle positions, GPS positioning based on the reception of radio waves from GPS (Global Positioning System) satellites, and odometry positioning using acceleration sensors, gyro sensors, wheel rotation speed, wheel angle, and the like, have generally been used.
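The odometry positioning mentioned above amounts to dead reckoning: integrating speed and turn-rate samples into a position estimate. The following sketch is purely illustrative (not part of the disclosed embodiment); the sampling interval and sensor model are assumptions:

```python
import math

def dead_reckon(samples, dt=0.1, x=0.0, y=0.0, heading=0.0):
    """Integrate (speed, yaw_rate) samples into a 2-D position estimate.

    samples: iterable of (speed_m_s, yaw_rate_rad_s) tuples.
    Returns the final (x, y, heading). Any constant sensor offset is
    integrated too, so the estimate drifts over time - which is why
    low-offset (and expensive) sensors are needed for high accuracy.
    """
    for speed, yaw_rate in samples:
        heading += yaw_rate * dt           # integrate gyro / wheel angle
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
    return x, y, heading

# Drive straight along x for 10 s at 1 m/s: ends about 10 m east.
x, y, h = dead_reckon([(1.0, 0.0)] * 100)
```

A constant bias added to `yaw_rate` would curve the estimated path away from the true one, illustrating the offset-accumulation problem the passage describes.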
- In indoor spaces such as warehouses, multistory parking lots, and ship holds, the upper part of the space is closed or substantially closed, so radio waves from GPS satellites cannot be received. Consequently, GPS positioning cannot be performed for a vehicle parked in such a space (hereinafter also referred to as "in a warehouse, etc.").
- With odometry positioning, the vehicle position indoors can be specified, but specifying the position with high accuracy requires sensors with a small offset. Such sensors are very expensive, and it is not practical to adopt a configuration in which one is mounted on each vehicle.
- Furthermore, for each vehicle, the result of odometry positioning using the wheel rotation speed, wheel angle, and the like cannot be acquired as information from the vehicle's ECU (Electronic Control Unit).
- Moreover, the ceiling height in a warehouse, etc. is generally limited, so it has been difficult to install an appropriate support column from which the entire floor can be viewed from above.
- Thus, no simple configuration has been available for specifying the position of an object such as a vehicle.
- As a result, facility construction was unavoidable.
- The invention according to claim 1 is a management system comprising: an unmanned air vehicle capable of flying indoors; a position-related information acquiring unit that is mounted on the unmanned air vehicle and acquires information related to the position of a predetermined object present in the indoor space during the flight of the unmanned air vehicle; and a position specifying unit that specifies the position of the object in the room based on the information acquired by the position-related information acquiring unit.
- The invention according to claim 7 is a management system comprising: an unmanned traveling body capable of traveling on an indoor floor surface; a position-related information acquiring unit that is mounted on the unmanned traveling body and acquires, during traveling of the unmanned traveling body, information related to the position of a predetermined object present in the indoor space; and a position specifying unit that specifies the position of the object indoors based on the information acquired by the position-related information acquiring unit.
- The invention also provides a position specifying method used in a management system comprising: an unmanned air vehicle capable of flying indoors; a position-related information acquiring unit that is mounted on the unmanned air vehicle and acquires information related to the position of a predetermined object present in the indoor space during the flight of the unmanned air vehicle; and a position specifying unit that specifies the position of the object in the room. The method includes an acquisition step in which the position-related information acquiring unit acquires the information related to the position of the object during the flight of the unmanned air vehicle, and a position specifying step in which the position specifying unit specifies the position of the object in the room based on the information acquired in the acquisition step.
- FIG. 1 is a block diagram illustrating a configuration of a management system 500 according to an embodiment.
- This management system 500 is a system that manages the vehicles CR1, CR2, CR3, … parked indoors in the building BLD1.
- To each vehicle CRj, a feature information sheet QRj is attached, in which feature information indicating features unique to the vehicle CRj is encoded as a QR code (registered trademark). The content of the feature information is determined in advance from the viewpoint of vehicle management inside the building BLD1.
- the management system 500 includes an unmanned air vehicle 100, an air vehicle mounting device 200, a relay device 300, and a processing control device 400.
- the unmanned air vehicle 100, the air vehicle mounting device 200, and the relay device 300 are arranged in the building BLD1.
- The processing control device 400 is arranged in a building BLD2 different from the building BLD1.
- the unmanned air vehicle 100 includes a plurality of propellers.
- The unmanned air vehicle 100 can be remotely controlled by the processing control device 400 with respect to its flight speed, flight path, and the like.
- Inside the building BLD1, a flying base BS equipped with charging facilities for the unmanned air vehicle 100 is installed, and the unmanned air vehicle 100 can be charged at the flying base BS.
- the above-mentioned flying object mounting apparatus 200 is mounted on the unmanned flying object 100.
- This flying object mounting device 200 can wirelessly communicate with the relay device 300. The details of the configuration of the flying object mounting apparatus 200 will be described later.
- the relay device 300 is configured such that the main body is arranged at the flying base BS.
- the relay device 300 includes an antenna 300A 1 for performing wireless communication with the flying object mounting device 200 and an antenna 300A 2 for performing wireless communication with the processing control device 400.
- The antenna 300A 2 is installed outside the building BLD1.
- When the relay device 300 receives, with the antenna 300A 1 , a radio signal transmitted from the flying object mounting device 200, it performs amplification processing and the like as appropriate and then transmits the signal from the antenna 300A 2 to the processing control device 400. Conversely, when the relay device 300 receives, with the antenna 300A 2 , a radio signal transmitted from the processing control device 400, it performs amplification processing and the like as appropriate and then transmits the signal from the antenna 300A 1 to the flying object mounting device 200.
- the processing control apparatus 400 includes an antenna 400A for performing wireless communication with the relay apparatus 300.
- the antenna 400A is installed outside the building BLD2.
- the flying object mounting apparatus 200 includes an antenna 210, a wireless transmission / reception unit 220, and a rotor control unit 230, as shown in FIG.
- the flying object mounting apparatus 200 includes a position related information acquisition unit 240, a displacement information acquisition unit 250, an ambient environment information acquisition unit 260, and a feature information acquisition unit 270.
- the wireless transmission / reception unit 220 uses the antenna 210 to transmit / receive information to / from the relay device 300 (and thus the processing control device 400). Then, when receiving the information transmitted from the relay device 300 via the antenna 210, the radio transmission / reception unit 220 transmits the information to any of the elements 230 to 270 according to the content of the information. Further, when receiving the information sent from any of the elements 230 to 270, the wireless transmission / reception unit 220 transmits the information to the relay device 300 via the antenna 210.
- The rotor control unit 230 receives the flight control information sent from the processing control device 400 via the relay device 300, and controls the rotation of the plurality of propellers provided on the unmanned air vehicle 100 according to that flight control information. The unmanned air vehicle 100 can therefore be made to fly at the flight speed and along the flight path specified by the flight plan generated by the processing control device 400.
- the position-related information acquisition unit 240 includes an imaging device such as an optical camera.
- the imaging device captures an image below the unmanned air vehicle 100 while the unmanned air vehicle 100 is flying. This imaging result is sent to the processing control device 400 via the wireless transmission / reception unit 220 and the relay device 300.
- the displacement information acquisition unit 250 includes a so-called internal sensor such as a three-dimensional acceleration sensor or a gyro sensor.
- the detection results by these internal sensors are sent as displacement information to the processing control device 400 via the wireless transmission / reception unit 220 and the relay device 300.
- the ambient environment information acquisition unit 260 includes a so-called external sensor such as a laser range finder (hereinafter referred to as “LRF”).
- The LRF detects the distance from the current position of the unmanned air vehicle 100 to flight obstacles (e.g., walls, columns, beams, ceiling, floor) in all directions.
- The detection results from these external sensors are sent as ambient environment information to the processing control device 400 via the wireless transmission / reception unit 220 and the relay device 300.
- the feature information acquisition unit 270 shares the above-described imaging device with the position-related information acquisition unit 240 and includes a QR code decoding processing unit.
- the imaging device captures an image below the unmanned air vehicle 100 while the unmanned air vehicle 100 is flying.
- When the decoding processing unit detects a QR code in the imaging result from the imaging device, it acquires the above-described feature information for each vehicle CRj by decoding the QR code.
- the feature information acquired in this way is sent to the processing control device 400 via the wireless transmission / reception unit 220 and the relay device 300.
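The source does not specify the payload format of the feature information encoded in each QR code. As a purely hypothetical sketch, assuming a semicolon-separated `key=value` payload, decoding the payload into a per-vehicle feature record could look like:

```python
def parse_feature_info(payload):
    """Parse a decoded QR payload into a feature-information dict.

    Assumes a hypothetical 'key=value;key=value' payload such as
    'id=CR1;model=sedan;color=white' - this format is an illustrative
    assumption, not taken from the source. Raises ValueError on a
    malformed field.
    """
    info = {}
    for field in payload.split(";"):
        if not field:
            continue
        key, sep, value = field.partition("=")
        if not sep:
            raise ValueError(f"malformed field: {field!r}")
        info[key.strip()] = value.strip()
    return info

feature = parse_feature_info("id=CR1;model=sedan;color=white")
```

In the actual system the raw QR detection and decoding would be done by the decoding processing unit on the imaging result; only the resulting feature information is transmitted onward.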
- the flying object mounting apparatus 200 further includes an illumination unit (not shown) that illuminates the imaging range.
- the processing control apparatus 400 includes a wireless transmission / reception unit 420, a storage unit 430, and a flight control unit 440 in addition to the antenna 400A described above. Further, the processing control apparatus 400 includes a flying object position detection unit 451, a map generation unit 455, a vehicle position specification unit 461, and a vehicle feature specification unit 465. Furthermore, the processing control apparatus 400 includes a display unit 470 and an input unit 480.
- the wireless transmission / reception unit 420 transmits / receives information to / from the relay device 300 (and thus the flying object mounting device 200) using the antenna 400A. Then, when receiving the information transmitted from the relay device 300 via the antenna 400A, the wireless transmission / reception unit 420 appropriately transmits the information to the elements 451 to 465 according to the content of the information. In addition, when receiving the information sent from the flight control unit 440, the wireless transmission / reception unit 420 transmits the information to the relay device 300 via the antenna 400A.
- Information transmitted from the elements 240 to 270 of the flying object mounting device 200 passes sequentially through the wireless transmission / reception unit 220 and the antenna 210, the relay device 300, and the antenna 400A and wireless transmission / reception unit 420 of the processing control device 400, and is then appropriately transmitted to the elements 451 to 465 of the processing control device 400.
- Likewise, information sent from the flight control unit 440 of the processing control device 400 passes sequentially through the wireless transmission / reception unit 420 and the antenna 400A, the relay device 300, and the antenna 210 and wireless transmission / reception unit 220 of the flying object mounting device 200, and is transmitted to the rotor control unit 230 of the flying object mounting device 200.
- In the following description, mention of these sequentially interposed elements is omitted.
- The storage unit 430 stores various information used by the processing control device 400. The information stored in the storage unit 430 includes the map information generated by the map generation unit 455 and a vehicle information table in which the vehicle positions specified by the vehicle position specifying unit 461 are associated with the feature information specified by the vehicle feature specifying unit 465. Any of the elements 440 to 465 can access the storage unit 430.
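The vehicle information table can be pictured as a keyed store that associates each specified vehicle position with feature information attached later. The class below is only an illustrative sketch of that association (the field names and position representation are assumptions, not from the source):

```python
class VehicleInfoTable:
    """Illustrative stand-in for the vehicle information table in
    storage unit 430: the vehicle position part is filled first by
    position specification, and feature information is attached to a
    registered position afterwards."""

    def __init__(self):
        self._rows = {}   # position (x, y) -> feature dict or None

    def register_position(self, position):
        # Vehicle position specification: record a position, no features yet.
        self._rows.setdefault(position, None)

    def attach_features(self, position, features):
        # Vehicle feature specification: attach features to a known position.
        if position not in self._rows:
            raise KeyError(f"unknown vehicle position: {position}")
        self._rows[position] = features

    def features_at(self, position):
        return self._rows[position]

table = VehicleInfoTable()
table.register_position((3.0, 7.5))
table.attach_features((3.0, 7.5), {"id": "CR1"})
```

The two-phase fill mirrors the order of the processes described later: vehicle position specification first, then vehicle feature specification.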
- the flight control unit 440 receives a flight control request sent from any of the map generation unit 455, the vehicle position specifying unit 461, and the vehicle feature specifying unit 465. Then, the flight control unit 440 is based on the map information in the storage unit 430 and the current position of the unmanned air vehicle 100 sent from the flying object position detection unit 451 (hereinafter referred to as “flying object current position”). Flight control information for realizing the flight of the flight speed and flight path (including the flight altitude) included in the flight plan specified in the flight control request is sequentially generated. The flight control information generated in this way is transmitted to the rotor control unit 230 of the flying object mounting device 200.
- The above-mentioned flying object position detection unit 451 receives the detection results from the internal sensors sent from the displacement information acquisition unit 250 of the flying object mounting device 200 and the detection results from the external sensor sent from the ambient environment information acquisition unit 260. The flying object position detection unit 451 then detects the flying object current position in the map represented by the map information, based on these detection results and the map information in the storage unit 430. The flying object current position thus detected is sequentially sent to the flight control unit 440, the map generation unit 455, the vehicle position specifying unit 461, and the vehicle feature specifying unit 465.
- More specifically, the flying object position detection unit 451 calculates a provisional flying object current position using the movement distance and the like obtained from the detection results (that is, the displacement information) of the internal sensors. It then corrects this provisional position based on the detection results (that is, the ambient environment information) of the external sensor and the map information in the storage unit 430. The flying object position detection unit 451 can therefore accurately detect the flying object current position in the map represented by the map information while preventing the offset in the internal sensors' detection results from accumulating.
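The correction described above - a provisional dead-reckoned position pulled back toward an external, map-based fix - behaves like a complementary filter. The one-dimensional sketch below is purely illustrative (the blend weight and the timing of fixes are arbitrary assumptions, not from the source):

```python
def corrected_positions(dr_deltas, external_fixes, alpha=0.8):
    """Fuse dead-reckoning increments with external position fixes.

    dr_deltas: per-step displacement from the internal sensors.
    external_fixes: per-step absolute position from matching the
    external sensor (e.g. LRF scan) against the stored map; None when
    no fix is available. alpha weights the fix; higher trusts it more.
    Returns the list of corrected positions.
    """
    x = 0.0
    out = []
    for delta, fix in zip(dr_deltas, external_fixes):
        x += delta                              # provisional: drifts with offset
        if fix is not None:
            x = alpha * fix + (1 - alpha) * x   # pull back toward the fix
        out.append(x)
    return out

# A constant +0.01 per-step offset drifts without correction;
# periodic map-based fixes at the true position (0.0) cancel it.
drift_only = corrected_positions([0.01] * 50, [None] * 50)
with_fixes = corrected_positions(
    [0.01] * 50, [0.0 if i % 10 == 9 else None for i in range(50)])
```

With fixes every 10 steps the error stays bounded instead of growing linearly, which is the point of combining the internal and external sensors.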
- the map generation unit 455 generates at least one of an indoor 2D map and a 3D map of the building BLD1. Generation of such a map is started when a map generation start command sent from the input unit 480 is received.
- When generating a map, the map generation unit 455 receives the detection results from the external sensor sent from the ambient environment information acquisition unit 260. The map generation unit 455 then generates map information around the unmanned air vehicle 100 based on these detection results and the flying object current position sent from the flying object position detection unit 451.
- When an area of the indoor space remains unmapped, the map generation unit 455 generates a flight plan for generating a map of the unfinished area and sends a flight control request specifying the generated flight plan to the flight control unit 440.
- the unmanned air vehicle 100 flies according to the flight plan.
- the map generation unit 455 receives the detection result by the external sensor mounted on the unmanned air vehicle 100 that is flying according to the flight plan for generating the map. Then, the map generation unit 455 generates a map around the unmanned air vehicle 100 based on the detection result by the external sensor and the current position of the air vehicle sent from the air vehicle position detection unit 451.
- the map generation unit 455 updates the map information in the storage unit 430 by adding the map information of the new area.
- When the map generation unit 455 receives a map display command sent from the input unit 480, it refers to the map information in the storage unit 430 and generates display data for displaying the indoor map of the building BLD1. The map generation unit 455 then sends the generated display data to the display unit 470. As a result, an indoor map of the building BLD1 is displayed on the display unit 470.
- In this way, in the present embodiment, map generation proceeds while the flying object position detection unit 451 detects the position of the unmanned air vehicle 100 using the map generated up to each point in time. That is, in this embodiment, the indoor map of the building BLD1 is generated with high accuracy by using a so-called SLAM (Simultaneous Localization And Mapping) technique.
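SLAM interleaves two steps: estimate the vehicle's pose against the map built so far, then extend the map from that pose. The deliberately tiny one-dimensional sketch below shows only that interleaving on a corridor occupancy grid; real SLAM uses probabilistic scan matching and pose correction, neither of which is modeled here, and all details are illustrative assumptions:

```python
def slam_step(grid, pose, delta, range_cells):
    """One simplified SLAM iteration on a 1-D corridor occupancy grid.

    grid: dict cell index -> 'free' | 'occupied' (the map so far).
    pose: current cell index; delta: odometry move since last step.
    range_cells: distance (in cells) to the nearest obstacle ahead,
    as reported by the range sensor.
    Predicts the new pose from odometry, marks traversed cells free,
    and marks the sensed obstacle cell occupied. Returns the new pose.
    """
    pose += delta                                   # predict pose (localize)
    for c in range(pose, pose + range_cells):       # cells before obstacle
        grid[c] = "free"                            # ...are free space (map)
    grid[pose + range_cells] = "occupied"           # obstacle cell
    return pose

grid, pose = {}, 0
pose = slam_step(grid, pose, 0, 3)   # from the base: a wall 3 cells ahead
pose = slam_step(grid, pose, 1, 2)   # move 1 cell; wall now 2 cells ahead
```

Note the two observations agree on the wall's cell; in full SLAM that consistency is what allows the pose estimate to be corrected against the map.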
- the specification of the vehicle position is started when a vehicle position specification start command sent from the input unit 480 is received.
- When specifying the vehicle positions, the vehicle position specifying unit 461 refers to the map information in the storage unit 430 and generates a flight plan, starting from the current position of the unmanned air vehicle 100, for specifying the vehicle positions. A flight control request designating the flight plan is then sent to the flight control unit 440. As a result, the unmanned air vehicle 100 flies according to the flight plan.
- the vehicle position specifying unit 461 transmits a lower imaging designation to the position related information acquisition unit 240 of the flying object mounting device 200.
- the imaging result below the unmanned aerial vehicle 100 performing the flight according to the flight plan for specifying the vehicle position described above is transmitted from the position related information acquisition unit 240 to the vehicle position specifying unit 461.
- The vehicle position specifying unit 461 analyzes the imaging results and specifies the indoor position in the building BLD1 of each vehicle CRj parked inside the building BLD1. The vehicle position specifying unit 461 then registers the newly specified vehicle positions in the vehicle position portion of the vehicle information table in the storage unit 430.
- When the vehicle position specifying unit 461 receives a vehicle position display command sent from the input unit 480, it refers to the map information in the storage unit 430 and the vehicle positions in the vehicle information table, and generates display data for displaying the vehicle positions inside the building BLD1. The vehicle position specifying unit 461 then sends the generated display data to the display unit 470. As a result, the positions of the vehicles parked inside the building BLD1 are displayed on the display unit 470.
- the specification of the feature information is started when a vehicle feature specification start command sent from the input unit 480 is received.
- When specifying the feature information, the vehicle feature specifying unit 465 refers to the map information in the storage unit 430, generates a flight plan for specifying the feature information, and sends a flight control request specifying the generated flight plan to the flight control unit 440. As a result, the unmanned air vehicle 100 flies according to the flight plan.
- the vehicle feature specifying unit 465 transmits a lower imaging designation and a QR code decoding designation to the feature information acquisition unit 270 of the flying object mounting device 200.
- As a result, the feature information for each vehicle CRj, which is the QR code decoding result obtained during the flight according to the flight plan for specifying the feature information, is sent from the feature information acquisition unit 270 to the vehicle feature specifying unit 465.
- The vehicle feature specifying unit 465 associates the newly received feature information with a vehicle position based on the flying object current position at that time. The vehicle feature specifying unit 465 then registers the feature information, in association with the vehicle position, in the feature information portion of the vehicle information table in the storage unit 430.
- When the vehicle feature specifying unit 465 receives a vehicle feature display command sent from the input unit 480, it refers to the vehicle position specified by the command and to the vehicle information table, and generates display data for displaying the feature information of the vehicle parked at that position. The vehicle feature specifying unit 465 then sends the generated display data to the display unit 470. As a result, the feature information of the specified vehicle parked indoors in the building BLD1 is displayed on the display unit 470.
- the display unit 470 includes a display device such as a liquid crystal panel, an organic EL (Electro Luminescence) panel, and a PDP (Plasma Display Panel).
- the display unit 470 receives display data sent from the map generation unit 455, the vehicle position specifying unit 461, or the vehicle feature specifying unit 465, the display unit 470 displays an image corresponding to the display data.
- the input unit 480 includes a stroke device such as a keyboard and a pointing device such as a mouse.
- When a map generation start command or a map display command is input, the input unit 480 sends the input result to the map generation unit 455.
- When a vehicle position specification start command or a vehicle position display command is input, the input unit 480 sends the input result to the vehicle position specifying unit 461.
- When a vehicle feature specification start command or a vehicle feature display command is input, the input unit 480 sends the input result to the vehicle feature specifying unit 465.
- Next, map generation processing by the map generation unit 455, vehicle position specification processing by the vehicle position specifying unit 461, and vehicle feature specification processing by the vehicle feature specifying unit 465 will be described.
- It is assumed that no indoor map information of the building BLD1 is stored in the storage unit 430, and that no vehicle position information or feature information is registered in the vehicle information table in the storage unit 430. It is also assumed that the unmanned air vehicle 100 is anchored at the flying base BS of the building BLD1 and that the flying object mounting device 200 has already started operation.
- Map generation processing by the map generation unit 455, vehicle position specification processing by the vehicle position specifying unit 461, and vehicle feature specification processing by the vehicle feature specifying unit 465 are executed sequentially according to the user's command inputs to the input unit 480.
- When the map generation unit 455 receives the map generation start command sent from the input unit 480, it starts the map generation process.
- In this map generation process, as shown in FIG. 4, first, in step S11, the map generation unit 455 collects the detection results from the external sensor sent from the ambient environment information acquisition unit 260 while the unmanned air vehicle 100 is located at the initial position (the position of the flying base BS of the building BLD1). A map is then generated for the peripheral area (hereinafter simply referred to as the "peripheral area") in which the external sensor's detection results can ensure a predetermined accuracy, and the map generation unit 455 stores the generated map information in the storage unit 430.
- In step S12, in order to generate a map of the ungenerated region, the map generation unit 455 generates an initial flight plan for flying to one outer edge position of the region corresponding to the map information in the storage unit 430. The map generation unit 455 then sends a flight control request specifying the generated initial flight plan to the flight control unit 440. As a result, the unmanned air vehicle 100 performs a flight according to the initial flight plan.
- In step S13, the map generation unit 455 generates a three-dimensional map of the area surrounding the in-flight position of the unmanned air vehicle 100, based on the in-flight detection results from the external sensor mounted on the unmanned air vehicle 100 and the flying object current position sequentially sent from the flying object position detection unit 451, and updates the map information in the storage unit 430 accordingly.
- During this flight as well, the flying object position detection unit 451 accurately detects the flying object current position as described above and sequentially sends it to the map generation unit 455.
- In step S14, the map generation unit 455 determines whether or not the map of the entire indoor area of the building BLD1 has been completed. In making this determination, the map generation unit 455 considers whether the only remaining unmapped positions are ones the unmanned air vehicle 100 cannot reach because of obstacles.
- If the result of the determination in step S14 is negative (step S14: N), the process proceeds to step S15.
- In step S15, in order to generate a map of the ungenerated area, the map generation unit 455 generates a next flight plan for flying to one outer edge position of the area corresponding to the map information in the storage unit 430 at that time. The map generation unit 455 then sends a flight control request specifying the generated next flight plan to the flight control unit 440. As a result, the unmanned air vehicle 100 performs a flight according to the next flight plan.
- When the process of step S15 is completed, the process returns to step S13. Thereafter, the processes in steps S13 to S15 are repeated until the result of the determination in step S14 becomes affirmative.
- If the result of the determination in step S14 is affirmative, then in step S16 the map generation unit 455 generates a flight plan (hereinafter referred to as the "return flight plan") for the unmanned air vehicle 100 to return to the flying base BS. Subsequently, the map generation unit 455 sends a flight control request designating the generated return flight plan to the flight control unit 440. As a result, the unmanned air vehicle 100 flies according to the return flight plan and returns to the flying base BS. The map generation process then ends.
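The flow of steps S11 through S16 is essentially frontier exploration: repeatedly fly to an outer edge of the already-mapped region, map the surroundings there, and stop when no reachable unmapped area remains. The sketch below shows that control flow over an abstract set of areas; the area names and connectivity are illustrative assumptions, not from the source:

```python
def generate_map(reachable, neighbors, start):
    """Frontier-style exploration loop mirroring steps S11-S16.

    reachable: set of areas the unmanned air vehicle can fly to.
    neighbors: dict area -> list of adjacent areas (flight connectivity).
    start: the area around the flying base BS.
    Returns the set of mapped areas once no reachable unmapped
    neighbor of the mapped region remains (the step-S14 check).
    """
    mapped = {start}                      # S11: map the peripheral area of BS
    while True:
        frontier = [a for m in mapped for a in neighbors.get(m, [])
                    if a in reachable and a not in mapped]
        if not frontier:                  # S14: entire reachable area is done
            return mapped                 # S16: return to the flying base
        mapped.add(frontier[0])           # S12/S15: fly to an outer edge;
                                          # S13: map its surroundings

areas = {"BS", "A", "B", "C"}             # hypothetical floor regions
adj = {"BS": ["A"], "A": ["BS", "B"], "B": ["A", "C"], "C": ["B"]}
mapped = generate_map(areas, adj, "BS")
```

An area absent from `reachable` (e.g. blocked by an obstacle) is simply never added, matching the determination in step S14 that only unreachable positions may remain unmapped.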
- When the vehicle position specifying unit 461 receives the vehicle position specification start command sent from the input unit 480, it starts the vehicle position specification process.
- In this process, the vehicle position specifying unit 461 first refers to the map information in the storage unit 430 and generates a flight plan for specifying the vehicle positions. A flight control request designating the generated flight plan is then sent to the flight control unit 440. Furthermore, the vehicle position specifying unit 461 transmits a lower imaging designation to the position related information acquisition unit 240 of the flying object mounting device 200. As a result, the imaging results below the unmanned air vehicle 100 flying according to this flight plan are transmitted from the position related information acquisition unit 240 to the vehicle position specifying unit 461.
- step S22 In step S22, the vehicle position specifying unit 461 collects the imaging results below the in-flight unmanned air vehicle 100, sent from the position related information acquisition unit 240, in association with the current flying object position sequentially sent from the flying object position detection unit 451. Then, when the collection of images for specifying the position of each vehicle parked indoors in the building BLD1 is completed, the process proceeds to step S23.
- step S23 In step S23, the vehicle position specifying unit 461 analyzes the images collected in step S22 and specifies the position of each vehicle parked indoors in the building BLD1. Then, it registers the newly specified vehicle positions in the vehicle position portion of the vehicle information table in the storage unit 430. Thereafter, the vehicle position specifying process ends.
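Steps S21 to S23 can be sketched as collecting position-tagged images and then analyzing each one. The `detect_vehicle` stub and the dictionary "image" stand in for real image analysis, which the patent does not specify.

```python
# Sketch of steps S21-S23: below-images are collected together with the
# flying object position at capture time, then analyzed for vehicles.
# `detect_vehicle` is an illustrative stub, not the patent's algorithm.

def detect_vehicle(image):
    """Stub: report whether a vehicle appears in the (simulated) image."""
    return image.get("vehicle_visible", False)

def specify_vehicle_positions(samples):
    """samples: iterable of (flying_object_position, image) pairs (step S22)."""
    vehicle_positions = []
    for position, image in samples:      # step S23: analyze collected images
        if detect_vehicle(image):
            vehicle_positions.append(position)
    return vehicle_positions             # to be registered in the vehicle info table

samples = [((1.0, 2.0), {"vehicle_visible": True}),
           ((3.0, 4.0), {"vehicle_visible": False}),
           ((5.0, 6.0), {"vehicle_visible": True})]
print(specify_vehicle_positions(samples))  # [(1.0, 2.0), (5.0, 6.0)]
```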
- the vehicle position specifying unit 461 After receiving the vehicle position display command sent from the input unit 480 once the vehicle position specifying process described above has completed, the vehicle position specifying unit 461 refers to the map information and to the vehicle positions in the vehicle information table in the storage unit 430, and generates display data for displaying the vehicle positions inside the building BLD1. Then, it sends the generated display data to the display unit 470.
- the position of the parked vehicle inside the building BLD1 is displayed on the display unit 470.
- An example of the display image displayed on the display unit 470 in this way is shown in FIG.
- the vehicle feature specifying unit 465 When it receives the vehicle feature specifying start command sent from the input unit 480, the vehicle feature specifying unit 465 starts the vehicle feature specifying process.
- the vehicle feature specifying unit 465 first generates a flight plan from the current position of the flying object to the first vehicle whose feature information is to be specified (hereinafter referred to as “first vehicle”). Then, the vehicle feature specifying unit 465 sends a flight control request designating the generated flight plan to the first vehicle to the flight control unit 440. Further, the vehicle feature specifying unit 465 transmits a lower imaging designation and a QR code decoding designation to the feature information acquisition unit 270.
- the unmanned air vehicle 100 flies in accordance with the flight plan and reaches the vicinity of the first vehicle. Thereafter, the feature information acquisition unit 270 analyzes the QR code image included in the lower imaging result and decodes the feature information. Then, the feature information acquisition unit 270 sends the decoded feature information to the vehicle feature specifying unit 465.
- the vehicle feature specifying unit 465 generates the flight plan so that, upon reaching the target vehicle, the unmanned air vehicle 100 is at an altitude at which the QR code on the feature information sheet affixed to the first vehicle can be imaged.
- step S32 In step S32, the vehicle feature specifying unit 465 collects the feature information sent from the feature information acquisition unit 270 in association with the current flying object position sequentially sent from the flying object position detection unit 451. Then, the vehicle feature specifying unit 465 registers the newly collected feature information in the feature information portion of the vehicle information table in the storage unit 430, in association with the vehicle position corresponding to the current flying object position at the time the feature information was received.
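The registration in step S32 amounts to keying decoded feature information to a registered vehicle position. Nearest-neighbour matching against the flying object position at receipt time is an assumption used here for illustration; the helper names are not from the patent.

```python
import math

# Sketch of step S32's registration: decoded feature information is stored
# against the registered vehicle position nearest the flying object at the
# time of receipt. Nearest-neighbour matching is an illustrative assumption.

def nearest_vehicle_position(current_position, registered_positions):
    return min(registered_positions,
               key=lambda p: math.dist(p, current_position))

def register_feature(table, current_position, feature_info):
    key = nearest_vehicle_position(current_position, table)
    table[key] = feature_info

# Vehicle info table: position -> feature information (None until specified).
table = {(0.0, 0.0): None, (10.0, 0.0): None}
register_feature(table, (9.2, 0.5), "white sedan, plate 12-34")
print(table[(10.0, 0.0)])  # white sedan, plate 12-34
```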
- step S33 In step S33, the vehicle feature specifying unit 465 determines whether feature information has been specified for all of the vehicles parked indoors in the building BLD1. In making this determination, the vehicle feature specifying unit 465 checks whether feature information has been registered for every vehicle position registered in the vehicle position portion of the vehicle information table in the storage unit 430.
- step S34 In step S34, a flight plan from the current position of the flying object to the next vehicle whose feature information is to be specified (hereinafter referred to as “next vehicle”) is generated in the same manner as in step S31 described above. Then, the vehicle feature specifying unit 465 sends a flight control request designating the generated flight plan to the next vehicle to the flight control unit 440. Further, the vehicle feature specifying unit 465 transmits a lower imaging designation and a QR code decoding designation to the feature information acquisition unit 270.
- step S34 When the process of step S34 ends, the process returns to step S32. Thereafter, the processes in steps S32 to S34 are repeated until the result of the determination in step S33 becomes affirmative.
- step S33 When the result of the determination in step S33 becomes affirmative, the process proceeds to step S35.
- the vehicle feature specifying unit 465 generates a return flight plan.
- the vehicle feature specifying unit 465 sends a flight control request designating the generated return flight plan to the flight control unit 440.
- the unmanned air vehicle 100 flies according to the return flight plan and returns to the flight base BS. Then, the vehicle feature specifying process ends.
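The overall loop of steps S31 through S35 can be sketched as visiting each registered vehicle position until every position has feature information. The flight to each vehicle and the QR decoding are simulated by a simple lookup function, which is an assumption for illustration only.

```python
# Skeleton of the vehicle-feature-specifying loop (steps S31-S35).
# `decode_at` stands in for flying to a vehicle and decoding the QR code
# on its feature information sheet; it is an illustrative assumption.

def specify_features(vehicle_positions, decode_at):
    feature_table = {p: None for p in vehicle_positions}  # vehicle info table
    for position in vehicle_positions:   # S31/S34: fly to the next vehicle
        feature_table[position] = decode_at(position)     # S32: register
        # S33: the loop continues until every position has feature information
    return feature_table                 # S35: generate the return flight plan

codes = {(0, 0): "sedan A", (5, 0): "van B"}
result = specify_features([(0, 0), (5, 0)], codes.get)
print(result[(5, 0)])  # van B
```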
- the vehicle feature specifying unit 465 After receiving a vehicle feature display command sent from the input unit 480 once the vehicle feature specifying process described above has completed, the vehicle feature specifying unit 465 refers to the vehicle position designated by the vehicle feature display command and to the vehicle information table, and generates display data for displaying the feature information of the vehicle parked at that vehicle position. Then, the vehicle feature specifying unit 465 sends the generated display data to the display unit 470. As a result, the feature information of the designated vehicle parked indoors in the building BLD1 is displayed on the display unit 470.
- a vehicle feature display command is input by designating one vehicle display position in the display image on the display unit 470 shown in FIG. 6 described above with a pointing device.
- the feature information of the designated vehicle is displayed so as to be superimposed on the display image on the display unit 470 shown in FIG.
- the position-related information acquisition unit 240 mounted on the unmanned air vehicle 100, which can fly indoors, acquires information related to the positions of the vehicles existing indoors in the building BLD1 during the flight of the unmanned air vehicle 100. Then, based on the information acquired by the position related information acquisition unit 240, the vehicle position specifying unit 461 specifies the positions of the vehicles indoors in the building BLD1.
- the map generation unit 455 generates an indoor map of the building BLD1 based on the information acquired by the displacement information acquisition unit 250 and the surrounding environment information acquisition unit 260 mounted on the unmanned air vehicle 100. Simultaneously with the generation of the map, the flying object position detection unit 451 detects the position of the unmanned air vehicle 100 in the map being generated by the map generation unit 455. That is, an indoor map of the building BLD1 is generated using a so-called SLAM technique. For this reason, the indoor map of the building BLD1 can be generated with high accuracy.
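The SLAM idea referred to above can be illustrated in one dimension: integrating displacement measurements alone accumulates drift, and observations of surrounding features pull the estimate back. The numbers and the simple averaging rule below are illustrative assumptions, not the patent's method.

```python
# Minimal 1-D illustration of combining displacement information (dead
# reckoning) with surrounding-environment observations, as in SLAM.
# The averaging correction and all numbers are assumptions for illustration.

def localize(odometry_steps, observations, landmark_position):
    """odometry_steps: measured displacements (possibly biased);
    observations: {step_index: measured range to a known landmark}."""
    pose = 0.0
    for i, step in enumerate(odometry_steps):
        pose += step                                  # predict from displacement
        if i in observations:
            observed_pose = landmark_position - observations[i]
            pose = 0.5 * pose + 0.5 * observed_pose   # simple correction
    return pose

# Each odometry step over-reads by 0.1; a landmark at x=10 is ranged at step 3.
steps = [1.1, 1.1, 1.1, 1.1]     # true pose after 4 steps: 4.0
obs = {3: 6.0}                   # true range to the landmark at the true pose
print(round(localize(steps, obs, 10.0), 2))  # 4.2 — closer to 4.0 than raw 4.4
```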
- the position of the unmanned air vehicle 100 in the flight for specifying the vehicle position described above is detected by the flying object position detection unit 451 in the generated map, based on the information acquired by the displacement information acquisition unit 250 and the surrounding environment information acquisition unit 260. Therefore, the position of the unmanned air vehicle 100 in flight for specifying the vehicle position can be detected accurately while avoiding the loss of accuracy caused by the accumulation of offsets in the sensors included in the displacement information acquisition unit 250. As a result, the position of each vehicle can be specified accurately.
- the feature information acquisition unit 270 mounted on the unmanned air vehicle 100 acquires the feature information of each vehicle. Then, the vehicle feature specifying unit 465 specifies the feature information of each vehicle by associating the feature information acquired by the feature information acquisition unit 270 with the position of the vehicle specified by the vehicle position specifying unit 461. For this reason, in addition to the position of each vehicle, the feature information of each vehicle can be specified.
- a feature information sheet on which the feature information is encoded as a QR code is affixed to each vehicle. For this reason, the feature information sheet can be made compact, and the feature information of each vehicle can be acquired easily.
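As an illustration of carrying feature information in a coded sheet, the code's payload could be a compact structured record such as JSON. The payload layout below is an assumption; actual QR encoding and camera-based decoding would use a QR library, for which the base64 round-trip merely stands in.

```python
import base64
import json

# Sketch of a feature-information payload round-trip. The JSON field names
# are hypothetical; base64 stands in for real QR encoding/decoding.

def encode_feature_payload(feature_info):
    raw = json.dumps(feature_info, sort_keys=True).encode("utf-8")
    return base64.b64encode(raw).decode("ascii")

def decode_feature_payload(payload):
    return json.loads(base64.b64decode(payload).decode("utf-8"))

info = {"model": "sedan", "plate": "12-34", "owner_id": 42}
payload = encode_feature_payload(info)
print(decode_feature_payload(payload) == info)  # True
```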
- the position related information acquisition unit 240 includes an imaging device, and the vehicle position specifying unit 461 analyzes the imaging result of the imaging device and specifies the position of the vehicle indoors in the building BLD1. For this reason, the position of the vehicle inside the building BLD1 can be specified with a simple configuration.
- the feature information sheet displays the feature information of the vehicle to which it is affixed as a QR code, but another code format may be used.
- an IC tag may be used instead of the feature information sheet, and the feature information may be acquired by non-contact communication with the IC tag.
- the feature information acquisition unit 270 includes the QR code decoding processing unit.
- the vehicle feature specifying unit may include the QR code decoding processing unit.
- the object whose position and feature information are specified is a vehicle, but objects other than vehicles may also be targets for specifying position and feature information.
- a spectator holding a ticket with an IC tag storing the designated seating position as feature information may be targeted for position specification, and it may be confirmed whether the seating position of the spectator holding the ticket is the correct seating position.
- the position-related information acquisition unit includes the imaging device.
- instead of the imaging device, a laser radar, a thermal sensor, or the like may be included.
- the map generation unit generates a map using both the displacement information and the surrounding environment information.
- the map generation unit may create a map using only one of the displacement information and the surrounding environment information.
- the processing control apparatus is a single apparatus, but the function of the processing control apparatus may be achieved by a plurality of apparatuses that can communicate with each other.
- the plurality of devices can be, for example, a server device that has excellent computing ability for image analysis or the like, and a personal computer that can communicate with the server device.
- the position related information acquisition unit, the displacement information acquisition unit, the surrounding environment information acquisition unit, and the feature information acquisition unit are mounted on the unmanned air vehicle, but these acquisition units may instead be mounted on an unmanned traveling body that can travel indoors.
- the relay device and the processing control device are separate devices, but may be a single device.
- the relay device and the processing control device are connected wirelessly, but may be connected by wire.
- the relay device and the processing control device are arranged separately in different buildings, but may be arranged in the same building.
- the display unit may display information on the current position of the unmanned flying object detected by the flying object position detection unit and the positions of the flight base, the relay device, and the like, alone or in combination with the position specifying results.
- one unmanned air vehicle 100 and one air base BS are provided.
- a plurality of unmanned air vehicles 100 may be provided, or a plurality of flight bases BS may be provided.
- a plurality of unmanned air vehicles 100 may be provided, and a plurality of flight bases BS may be provided.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
- The present invention relates to a position related information acquisition unit of a flying object mounting device (200) mounted on an unmanned air vehicle (100) capable of flying indoors, which acquires information related to the positions of vehicles (CR1, CR2, CR3, ...) inside a building (BLD1) during a flight of the unmanned air vehicle (100). When a vehicle position specifying unit provided in a processing control device (400) receives, by wireless transmission, the information acquired by the position related information acquisition unit, the vehicle position specifying unit specifies the positions of the vehicles (CR1, CR2, CR3, ...) inside the building (BLD1). Consequently, even while the building (BLD1) is in service, the positions of the vehicles (CR1, CR2, CR3, ...) inside the building (BLD1) can be specified easily without requiring the installation of new equipment.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2014/050499 WO2015107623A1 (fr) | 2014-01-15 | 2014-01-15 | Système de gestion et procédé de spécification de positions |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2014/050499 WO2015107623A1 (fr) | 2014-01-15 | 2014-01-15 | Système de gestion et procédé de spécification de positions |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2015107623A1 true WO2015107623A1 (fr) | 2015-07-23 |
Family
ID=53542547
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2014/050499 Ceased WO2015107623A1 (fr) | 2014-01-15 | 2014-01-15 | Système de gestion et procédé de spécification de positions |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2015107623A1 (fr) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2016015628A (ja) * | 2014-07-02 | 2016-01-28 | 三菱重工業株式会社 | 構造物の屋内監視システム及び方法 |
| JP2017059955A (ja) * | 2015-09-15 | 2017-03-23 | ツネイシホールディングス株式会社 | 撮像システム及びコンピュータプログラム |
| WO2018131165A1 (fr) * | 2017-01-16 | 2018-07-19 | 富士通株式会社 | Programme de traitement d'informations, procédé de traitement d'informations et dispositif de traitement d'informations |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2004338889A (ja) * | 2003-05-16 | 2004-12-02 | Hitachi Ltd | 映像認識装置 |
| JP2005083984A (ja) * | 2003-09-10 | 2005-03-31 | Neomax Co Ltd | 物品位置確認システム |
| JP2010055444A (ja) * | 2008-08-29 | 2010-03-11 | Hitachi Industrial Equipment Systems Co Ltd | ロボットシステム |
| JP2013086912A (ja) * | 2011-10-17 | 2013-05-13 | Fujitsu Advanced Engineering Ltd | 物品管理システム、物品管理方法及び物品管理プログラム |
-
2014
- 2014-01-15 WO PCT/JP2014/050499 patent/WO2015107623A1/fr not_active Ceased
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2004338889A (ja) * | 2003-05-16 | 2004-12-02 | Hitachi Ltd | 映像認識装置 |
| JP2005083984A (ja) * | 2003-09-10 | 2005-03-31 | Neomax Co Ltd | 物品位置確認システム |
| JP2010055444A (ja) * | 2008-08-29 | 2010-03-11 | Hitachi Industrial Equipment Systems Co Ltd | ロボットシステム |
| JP2013086912A (ja) * | 2011-10-17 | 2013-05-13 | Fujitsu Advanced Engineering Ltd | 物品管理システム、物品管理方法及び物品管理プログラム |
Cited By (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2016015628A (ja) * | 2014-07-02 | 2016-01-28 | 三菱重工業株式会社 | 構造物の屋内監視システム及び方法 |
| JP2017059955A (ja) * | 2015-09-15 | 2017-03-23 | ツネイシホールディングス株式会社 | 撮像システム及びコンピュータプログラム |
| WO2018131165A1 (fr) * | 2017-01-16 | 2018-07-19 | 富士通株式会社 | Programme de traitement d'informations, procédé de traitement d'informations et dispositif de traitement d'informations |
| JPWO2018131165A1 (ja) * | 2017-01-16 | 2019-11-07 | 富士通株式会社 | 情報処理プログラム、情報処理方法および情報処理装置 |
| US10885357B2 (en) | 2017-01-16 | 2021-01-05 | Fujitsu Limited | Recording medium recording information processing program, information processing method, and information processing apparatus |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP2015131713A (ja) | 管理システム、飛行制御方法、飛行制御プログラム及び記録媒体 | |
| US11604479B2 (en) | Methods and system for vision-based landing | |
| US10599149B2 (en) | Salient feature based vehicle positioning | |
| US10802509B2 (en) | Selective processing of sensor data | |
| US11725940B2 (en) | Unmanned aerial vehicle control point selection system | |
| EP3729227B1 (fr) | Localisation basée sur l'image pour véhicules aériens sans pilote et systèmes et procédés associés | |
| US20150237481A1 (en) | Navigation method and device | |
| CN111256701A (zh) | 一种设备定位方法和系统 | |
| CN114675671A (zh) | 多传感器环境地图构建 | |
| US20190003840A1 (en) | Map registration point collection with mobile drone | |
| CN111026107B (zh) | 用于确定可移动物体的位置的方法和系统 | |
| JP2020170213A (ja) | ドローン作業支援システム及びドローン作業支援方法 | |
| WO2021166845A1 (fr) | Dispositif de traitement d'informations, procédé de traitement d'informations et programme | |
| KR102105105B1 (ko) | 실내 측위 방법 및 이를 수행하는 장치들 | |
| WO2015107623A1 (fr) | Système de gestion et procédé de spécification de positions | |
| CN110892353A (zh) | 控制方法、控制装置、无人飞行器的控制终端 | |
| US20210247773A1 (en) | Estimation system, estimation apparatus, estimation method, and computer program | |
| JP7698364B2 (ja) | 情報処理システム及び移動体、情報処理方法、プログラム | |
| WO2021064982A1 (fr) | Dispositif et procédé de traitement d'informations | |
| CN113227818A (zh) | 使用对齐参照系在物理空间中进行物体跟踪的系统 | |
| US20250035460A1 (en) | Display control device and display control method | |
| CN111752293B (zh) | 用于对能够自主移动的机器进行导引的方法和电子设备 | |
| JP7467190B2 (ja) | 位置推定装置、位置推定システム及び位置推定方法 | |
| JP2024146338A (ja) | 情報処理方法、情報処理装置、コンピュータプログラム、及び情報処理システム | |
| JP2024097153A (ja) | 建築物又は土木構造物の管理支援システム、及び、移動通信端末 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14879213 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| NENP | Non-entry into the national phase |
Ref country code: JP |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 14879213 Country of ref document: EP Kind code of ref document: A1 |