
WO2025169696A1 - Flying body detecting device, flying body detecting system, flying body detecting method, and program storage medium - Google Patents

Flying body detecting device, flying body detecting system, flying body detecting method, and program storage medium

Info

Publication number
WO2025169696A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
detection
telephoto
flying object
wide
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2025/001506
Other languages
French (fr)
Japanese (ja)
Inventor
尚司 谷内田 (Shoji Yachida)
恭太 比嘉 (Kyota Higa)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Corp
Original Assignee
NEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corp filed Critical NEC Corp
Publication of WO2025169696A1


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/45 Cameras or camera modules for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H04N 23/57 Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N 23/60 Control of cameras or camera modules
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • This disclosure relates to an airborne object detection device, an airborne object detection system, an airborne object detection method, and a program storage medium for detecting airborne objects.
  • No-fly zones (for example, the airspace above and around airports and other important facilities) are areas where the flight of small unmanned aircraft, also called unmanned aerial vehicles (UAVs), is restricted.
  • Patent Document 1 JP 2007-116666 A discloses technology for efficiently monitoring surveillance areas spanning several kilometers or more, and accurately capturing and photographing moving objects within the surveillance area. That is, with the technology shown in Patent Document 1, the entire surveillance area is photographed with a wide-angle camera. When a moving object is detected in an image captured by the wide-angle camera, the attitude of the telephoto camera is controlled to point the optical axis of the telephoto camera in the direction of the moving object. In other words, the telephoto camera photographs the moving object while tracking it.
  • the main purpose of this disclosure is to provide technology that prevents the detection accuracy of an aerial vehicle in a detection area from decreasing due to the distance from the imaging device that captures the detection area.
  • The flying object detection device of the present disclosure, as one aspect thereof, includes: an acquisition unit that acquires a telephoto image, which is an image captured by a telephoto type imaging device that captures a distant part of the detection area far from the installation location, among multiple types of imaging devices with different angles of view installed at a common installation location to capture the detection area, and a wide-angle image, which is an image captured by a wide-angle type imaging device among the multiple types of imaging devices; a detection unit that detects a target flying object in the distant part of the detection area from the telephoto image and detects a target flying object in the part of the detection area other than the distant part from the wide-angle image; a generation unit that generates an image reflecting the detection result by superimposing, on the wide-angle image, information indicating the position of the detected flying object in the image; and an output unit that outputs the image reflecting the detection result.
  • One aspect of the flying object detection system of the present disclosure includes: a telephoto type imaging device that captures an image of a distant part of the detection area far from the installation location, among a plurality of types of imaging devices with different angles of view installed at a common installation location for capturing the detection area; a wide-angle type imaging device among the plurality of types of imaging devices, which captures images of the detection area including a portion not captured by the telephoto type imaging device; and the above-mentioned flying object detection device, which uses the images captured by the telephoto type imaging device and the wide-angle type imaging device.
  • The flying object detection method of the present disclosure includes, by computer: acquiring a telephoto image, which is an image captured by a telephoto type imaging device that captures a distant portion of the detection area far from the installation location, among a plurality of types of imaging devices with different angles of view installed at a common installation location for imaging the detection area, and a wide-angle image, which is an image captured by a wide-angle type imaging device among the plurality of types of imaging devices; detecting a target flying object in the distant part of the detection area from the telephoto image, and detecting a target flying object in the part of the detection area other than the distant part from the wide-angle image; generating an image reflecting the detection results by superimposing, on the wide-angle image, information representing the position of the detected flying object in the image; and outputting the image reflecting the detection results.
  • The program storage medium of the present disclosure stores a computer program for causing a computer to execute: a process of acquiring a telephoto image, which is an image captured by a telephoto type imaging device that captures a distant portion of the detection area far from the installation location, among multiple types of imaging devices with different angles of view installed at a common installation location to capture the detection area, and a wide-angle image, which is an image captured by a wide-angle type imaging device among the multiple types of imaging devices; a process of detecting a target flying object in the distant part of the detection area from the telephoto image, and detecting a target flying object in the part of the detection area other than the distant part from the wide-angle image; a process of generating an image reflecting the detection result by superimposing, on the wide-angle image, information representing the position of the detected flying object in the image; and a process of outputting the image reflecting the detection result.
  • FIG. 1 is a diagram illustrating the configuration of an embodiment of the flying object detection device according to the present disclosure.
  • FIG. 2 is a diagram illustrating the imaging devices that constitute the flying object detection system according to the present disclosure.
  • FIG. 3 is a diagram illustrating the capture ranges of the imaging devices viewed from the zenith.
  • FIG. 4 is a diagram illustrating an example of a detection result reflection image.
  • FIG. 5 is a diagram illustrating another example of the detection result reflection image.
  • FIG. 6 is a diagram illustrating yet another example of the detection result reflection image.
  • FIG. 7 is a flowchart illustrating an example of an operation related to detecting a flying object in the flying object detection device.
  • FIG. 8 is a diagram illustrating a modified example of the generation unit in the flying object detection device.
  • FIG. 9 is a diagram illustrating another embodiment.
  • FIG. 10 is a flowchart illustrating an example of an operation of the other embodiment.
  • The flying object detection system 1 of the first embodiment is a system that detects a target flying object in a detection area using images captured by imaging devices, and includes multiple types of imaging devices 2 and 3 and a flying object detection device (hereinafter also simply referred to as a detection device) 5, as shown in Fig. 1.
  • the detection device 5 is a computer device that detects flying objects in the detection area by image analysis of the images captured by the imaging devices 2 and 3.
  • the flying object detection system 1 is applied to a surveillance system.
  • a surveillance system to which the flying object detection system 1 is applied is, for example, a system that monitors the airspace above and surrounding areas of important facilities such as airports and nuclear power plants, and the surveillance area includes no-fly zones where the flight of unmanned aerial vehicles is prohibited by law. Because the flying object detection system 1 is applied to such a surveillance system, the detection area of the flying object detection system 1 is the surveillance area of the surveillance system to which it is applied.
  • the target air vehicles to be detected by the air vehicle detection system 1 are determined in advance by a system designer, taking into consideration, for example, the type of facility being monitored by the applicable monitoring system and its surrounding environment.
  • Flying objects set as detection targets include unmanned aerial vehicles (UAVs) such as unmanned airplanes, unmanned rotorcraft, and unmanned airships, which can be flown by remote control or automatic piloting, as well as balloons, hang gliders, and paragliders, which are flown by humans using specific aviation equipment.
  • imaging devices with different angles of view are used as the imaging devices that make up the flying object detection system 1.
  • One type of imaging device used in the flying object detection system 1 is a telephoto type imaging device equipped with a telephoto lens.
  • Another type of imaging device used in the flying object detection system 1 is a wide-angle type imaging device equipped with a wide-angle lens.
  • The lenses provided in the imaging devices can be classified into three types depending on the angle of view: "standard lens," "wide-angle lens," and "telephoto lens."
  • A "standard lens" is a lens with an angle of view of approximately 45 to 50 degrees, which is said to be close to the human field of view.
  • A "wide-angle lens" is a lens with a wider angle of view than a "standard lens," for example, 60 degrees or more.
  • A "telephoto lens" is a lens with a narrower angle of view than a "standard lens," for example, 30 degrees or less.
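This three-way classification can be made concrete with the standard rectilinear-lens relation between focal length and angle of view. A minimal sketch, assuming a 36 mm wide (full-frame) sensor; the sensor width is an illustrative assumption, while the 60-degree and 30-degree thresholds are the ones given above:

```python
import math

def angle_of_view_deg(focal_length_mm: float, sensor_width_mm: float = 36.0) -> float:
    # Horizontal angle of view of a rectilinear lens (pinhole model).
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

def classify_lens(aov_deg: float) -> str:
    # Thresholds from the disclosure: wide-angle >= 60 deg, telephoto <= 30 deg.
    if aov_deg >= 60.0:
        return "wide-angle"
    if aov_deg <= 30.0:
        return "telephoto"
    return "standard"
```

For example, on this sensor a 28 mm lens yields roughly a 65-degree angle of view and classifies as wide-angle, while a 200 mm lens classifies as telephoto.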
  • Hereinafter, a wide-angle type imaging device will also be referred to as a wide-angle camera 2, and a telephoto type imaging device as a telephoto camera 3.
  • the wide-angle camera 2 and telephoto camera 3 are installed in a common installation location where they can capture images of the detection area, with the image capture direction fixed.
  • the wide-angle camera 2 and telephoto camera 3 are installed at a height H(y) of approximately 10 meters, allowing them to view the detection area as shown in Figure 2.
  • Figure 2 shows a schematic diagram of an example of the capture range of the detection area of the wide-angle camera 2 and telephoto camera 3 installed in this manner, viewed from a direction along the ground surface.
  • the portion of the field of view of the wide-angle camera 2 that captures the detection area is represented by the hatched area Zw.
  • the portion of the detection area that is captured by the telephoto camera 3 is represented by the lightly shaded area Zt.
  • Unmanned aerial vehicles such as drones are subject to altitude restrictions, such as the prohibition of flight in airspace above 150 meters above the ground; the example in Figure 2 shows a detection area that takes such altitude restrictions into account.
  • Telephoto camera 3 captures images of the distant part of the detection area that is far from the installation location of telephoto camera 3.
  • the camera settings (magnification, etc.) of telephoto camera 3 are configured to capture images of the distant part of the detection area, such as a distance D(z) of approximately 800 meters from the installation location of telephoto camera 3.
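The magnification needed at the distant part can be estimated with the pinhole model: the image-plane size of an object scales as focal length over distance. A sketch under assumed illustrative values (0.4 m object, 36 mm sensor, 3840-pixel-wide image; none of these numbers come from the disclosure):

```python
def apparent_size_px(object_size_m: float, distance_m: float,
                     focal_length_mm: float,
                     sensor_width_mm: float = 36.0,
                     image_width_px: int = 3840) -> float:
    # Pinhole model: size on sensor (mm) = f_mm * (object_size / distance),
    # then convert millimetres on the sensor into image pixels.
    size_on_sensor_mm = focal_length_mm * object_size_m / distance_m
    return size_on_sensor_mm * image_width_px / sensor_width_mm
```

With a 400 mm telephoto lens, a 0.4 m object at the 800 m distant part would span about 21 pixels, which illustrates why a telephoto camera keeps far objects at a detectable size.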
  • the image captured by telephoto camera 3 is also referred to as a telephoto image.
  • the telephoto image is used by detection device 5 to detect the target flying object in the distant part of the detection area.
  • FIG. 3 shows a schematic diagram of an example of the capture range in each detection area of wide-angle camera 2 and telephoto camera 3, viewed from the zenith.
  • the portion of the field of view of wide-angle camera 2 that captures the detection area is represented by the hatched portion Zw.
  • the portion of the detection area that is captured by telephoto camera 3 is represented by the lightly shaded portion Zt.
  • As shown in FIG. 3, there are cases where a single telephoto camera 3 is unable to capture the entire distant portion of the detection area.
  • multiple telephoto cameras 3 are installed in the same installation location, and the shooting directions of the multiple telephoto cameras 3 are set so that the shooting ranges of these telephoto cameras 3 are shifted and the entire distant portion is captured by the multiple telephoto cameras 3.
  • In other words, the distant portion of the detection area is captured by multiple telephoto type imaging devices with mutually shifted fields of view.
  • the shooting directions of the multiple telephoto cameras 3 are set so that the shooting ranges are shifted in a direction along the ground surface, but depending on the height H(d) of the detection area from the ground surface and the width of the field of view of the telephoto cameras 3, the shooting directions of the multiple telephoto cameras 3 may be set so that the shooting ranges are shifted in the vertical direction. Furthermore, the number of telephoto cameras 3 installed depends on the width of the distant portion of the detection area and the width of the field of view of the telephoto cameras 3, and is not limited.
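The number of telephoto cameras follows from the angular width of the distant portion and each camera's field of view. A minimal sketch; the 2-degree overlap margin between adjacent shooting ranges is an assumed design parameter, not something the disclosure specifies:

```python
import math

def cameras_needed(sector_deg: float, camera_fov_deg: float,
                   overlap_deg: float = 2.0) -> int:
    # Adjacent shooting ranges are shifted so they tile the sector,
    # keeping a small overlap so no gap opens at the seams.
    effective_deg = camera_fov_deg - overlap_deg
    if effective_deg <= 0:
        raise ValueError("overlap must be smaller than the camera field of view")
    return math.ceil(sector_deg / effective_deg)
```

For instance, covering a 90-degree sector with 20-degree telephoto cameras takes ceil(90 / 18) = 5 cameras under this overlap assumption.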
  • multiple wide-angle cameras 2 may be installed in the same installation location, and the shooting directions of these wide-angle cameras 2 may be set so that the shooting ranges of these wide-angle cameras 2 are shifted and the entire detection area is captured by multiple wide-angle cameras 2.
  • The cameras are set so that the difference between the size of the detection-target object captured in the telephoto image and the size of the same object captured in the wide-angle image is within a predetermined range (tolerance).
  • Ideally, the cameras are set so that the size of the detection-target object captured in the telephoto image is the same as its size in the wide-angle image.
  • Such camera settings reduce the load on the detection device 5 in the detection process for the target flying object; the tolerance for the size difference between the telephoto image and the wide-angle image is set with this load reduction in mind.
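That sizing constraint can be checked directly from the apparent sizes measured in each image. A sketch, with a hypothetical 25% tolerance ratio (the disclosure leaves the concrete tolerance value to the system designer):

```python
def sizes_within_tolerance(size_tele_px: float, size_wide_px: float,
                           tolerance_ratio: float = 0.25) -> bool:
    # The relative difference between the object's apparent size in the
    # telephoto image and in the wide-angle image must stay within tolerance,
    # so that one detection model can serve both image types.
    larger = max(size_tele_px, size_wide_px)
    return abs(size_tele_px - size_wide_px) / larger <= tolerance_ratio
```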
  • the detection device 5 is a computer device that detects target flying objects by analyzing wide-angle images captured by the wide-angle camera 2 and telephoto images captured by the telephoto camera 3. That is, as shown in FIG. 1, the detection device 5 is directly or indirectly connected to each of the wide-angle camera 2 and the telephoto camera 3, and includes an arithmetic unit 50 and a storage device 40.
  • the storage device 40 includes a storage medium for storing data and a computer program (hereinafter also referred to as a program) 41.
  • Examples of the storage medium include semiconductor memory devices such as RAM (Random Access Memory) and ROM (Read Only Memory).
  • the arithmetic device 50 is composed of processors such as a CPU (Central Processing Unit) and a GPU (Graphics Processing Unit).
  • the arithmetic device 50 can have functions based on a program 41 by reading and executing the program 41 stored in the storage device 40.
  • the arithmetic device 50 has an acquisition unit 51, a detection unit 52, a generation unit 53, and an output unit 54 as functional units related to detecting flying objects.
  • the acquisition unit 51 acquires wide-angle images captured by the wide-angle camera 2 and telephoto images captured by the telephoto camera 3.
  • the method by which the acquisition unit 51 acquires wide-angle images and telephoto images is not limited here; for example, the acquisition unit 51 may acquire captured images (wide-angle images and telephoto images) from the wide-angle camera 2 and telephoto camera 3, respectively, or may acquire wide-angle images and telephoto images by reading them from a database (not shown) that has temporarily stored the wide-angle images and telephoto images from the wide-angle camera 2 and telephoto camera 3, respectively.
  • the detection unit 52 detects target flying objects in the distant part of the detection area from the telephoto image, and detects target flying objects in the part of the detection area covered by the wide-angle camera (the part of the detection area other than the distant part) from the wide-angle image.
  • Various methods have been proposed for detecting target flying objects from captured images (wide-angle images or telephoto images), and one example, although not limited to these, is a detection method that uses AI (Artificial Intelligence) technology. In this case, a detection model generated using AI technology is used. This detection model is generated by learning from images of the target flying object as training data.
  • the input information to the detection model is the captured image (wide-angle image or telephoto image), and the output information from the detection model includes information indicating the presence or absence of the target flying object in the input captured image, and, if the target flying object is detected, information indicating the position of the detected flying object in the captured image (hereinafter also referred to as detection position information).
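The detection model's I/O contract can be sketched with a toy stand-in: the real system would run learned inference, but the result has the same shape, a presence flag plus per-detection image positions. Everything below is illustrative, not the disclosed model:

```python
def detect_flying_objects(image: list[list[int]], threshold: int = 200) -> dict:
    # Toy detector over a grayscale pixel grid: any pixel at or above the
    # threshold counts as one detection. A learned detection model would
    # return the same fields (presence flag and detection position information).
    positions = [(x, y)
                 for y, row in enumerate(image)
                 for x, value in enumerate(row)
                 if value >= threshold]
    return {"detected": bool(positions), "positions": positions}
```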
  • the wide-angle camera 2 and telephoto camera 3 are set so that when a flying object of the same size to be detected is photographed in the wide-angle camera's portion and the distant portion of the detection area, the size of the image of the flying object captured in each of the wide-angle and telephoto images will be within a predetermined range (acceptable range).
  • the cameras are set so that the image analysis of the wide-angle image and the telephoto image can be performed using the same detection model.
  • the same detection model is used for the image analysis of the wide-angle image and the telephoto image.
  • a detection model that can distinguish and detect each of these multiple types of flying objects is generated by learning images of these multiple types of flying objects.
  • the detection unit 52 may use this detection model. In this case, the detection unit 52 can also output information indicating the type of flying object detected.
  • the detection unit 52 uses this detection model to detect the target flying object from both the wide-angle image and the telephoto image. Furthermore, if the detection unit 52 detects a flying object from the telephoto image, the flying object should also appear in the wide-angle image, and so the detection unit 52 associates the flying object in the wide-angle image with the telephoto image. This process uses positional relationship data between the telephoto image and the wide-angle image, which associates image portions that show the same real space. This data is generated in advance and stored in the storage device 40.
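The positional relationship data is not specified further in the disclosure; because both cameras are fixed, one common realization is a pre-computed 3x3 homography between the two views. A sketch under that assumption:

```python
def tele_to_wide(point: tuple[float, float],
                 h: list[list[float]]) -> tuple[float, float]:
    # Map a telephoto pixel into wide-angle pixel coordinates using a 3x3
    # homography h, computed offline from image portions showing the same
    # real space, since the shooting directions of both cameras are fixed.
    x, y = point
    w = h[2][0] * x + h[2][1] * y + h[2][2]
    return ((h[0][0] * x + h[0][1] * y + h[0][2]) / w,
            (h[1][0] * x + h[1][1] * y + h[1][2]) / w)
```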
  • Information representing the detection results by the detection unit 52 is stored in the storage device 40 in association with, for example, information identifying the imaging device (wide-angle camera 2 or telephoto camera 3) that captured the image subjected to the detection process, information representing the frame number of that image, and information about the time of capture.
  • the detection process for the target flying object is performed using the same detection model for both wide-angle images and telephoto images. This reduces the processing load on the detection device 5 compared to when wide-angle images and telephoto images are processed using separate detection models, and also shortens the time required for the detection process for the flying object. This contributes to real-time detection of flying objects using images captured by the wide-angle camera 2 and telephoto camera 3.
  • the generation unit 53 generates an image reflecting the detection results.
  • the image reflecting the detection results is an image in which information representing the position of the detected flying object in the image is superimposed on the wide-angle image.
  • the information representing the position of the detected flying object in the image is represented, for example, by a graphic.
  • The graphic representing the position of the detected flying object in the image (hereinafter also referred to as the flying object detection graphic) is not limited here and may be, for example, a circle, triangle, or square, or a symbol or mark resembling a flying object, and is set appropriately by a system designer, etc.
  • the information used in the process of generating the detection result reflection image is information about the position of the flying object in the wide-angle image detected from the wide-angle image (i.e., detection position information), and information about the position of the flying object in the wide-angle image detected from the telephoto image by matching the telephoto image with the wide-angle image.
  • the wide-angle image and the telephoto image are synchronized so that their shooting times match.
  • the flying object detection graphic representing the position of the flying object in the image detected from the telephoto image is superimposed on the wide-angle image captured at the same shooting time as the telephoto image in which the flying object was detected.
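Superimposing the graphic amounts to drawing over the synchronized wide-angle frame at the detected position. A minimal sketch on a grayscale pixel grid, using a hollow square as the flying object detection graphic (the disclosure leaves the graphic's shape to the designer):

```python
def superimpose_marker(frame: list[list[int]], cx: int, cy: int,
                       half: int = 2, value: int = 255) -> list[list[int]]:
    # Draw a hollow square of side 2*half centred on the detected position,
    # clipping at the frame border. The frame is modified in place so the
    # object itself stays visible inside the outline.
    height, width = len(frame), len(frame[0])
    for x in range(max(0, cx - half), min(width, cx + half + 1)):
        for y in (cy - half, cy + half):
            if 0 <= y < height:
                frame[y][x] = value
    for y in range(max(0, cy - half), min(height, cy + half + 1)):
        for x in (cx - half, cx + half):
            if 0 <= x < width:
                frame[y][x] = value
    return frame
```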
  • FIGS. 4 and 5 each show examples of detection result reflection images.
  • the detection result reflection image is an image in which a flying object detection graphic 8, which is a double circle, is superimposed on a wide-angle image.
  • the example of FIG. 5 is an example of a detection result reflection image when multiple types of flying objects are detected.
  • the detection result reflection image is an image in which a star-shaped flying object detection graphic 8 corresponding to the type of flying object detected, and a double circle flying object detection graphic 8 corresponding to another type of flying object detected, are superimposed on a wide-angle image. Note that in FIGS. 4 and 5, the detected flying object is indicated by the symbol "9." Furthermore, when multiple types of flying objects are detected, the same flying object detection graphic 8 may be superimposed on the wide-angle image regardless of the type of flying object detected.
  • the alert level may differ depending on the flight location and type of the detected flying object.
  • the type of flying object detection graphic 8 corresponding to the alert level may be predetermined, and the generation unit 53 may generate an image reflecting the detection results by superimposing the flying object detection graphic 8 of the type corresponding to the alert level on the wide-angle image.
  • the alert level is calculated, for example, by the detection unit 52. In other words, if different alert levels are set depending on the distance from the installation location of the wide-angle camera 2 and the telephoto camera 3 (camera installation location), data relating to the distance from the camera installation location to the flying object and the alert level is stored in the storage device 40 as alert level judgment information used to determine the alert level.
  • If different alert levels are set depending on the type of flying object, data relating to the type of flying object and the alert level is stored in the storage device 40 as alert level judgment information. Furthermore, if the alert level is set based on the combination of the distance from the camera installation location to the flying object and the type of flying object, data relating to that combination and the alert level is stored in the storage device 40 as alert level judgment information.
  • the detection unit 52 calculates the alert level of the detected flying object using one or both of the distance from the camera installation location to the detected flying object and the type of flying object, as well as the alert level determination information stored in the storage device 40.
  • The generation unit 53 then generates an image reflecting the detection results by superimposing a flying object detection graphic 8 of a type corresponding to the calculated alert level onto the wide-angle image as described above. Note that the method for calculating the distance between the detected flying object and the camera installation location from the wide-angle image or telephoto image is not limited here, and its description is therefore omitted.
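The alert level judgment information can be held as a simple lookup over distance bands and object types. A sketch with an entirely hypothetical table (the disclosure fixes neither the levels nor the thresholds):

```python
# Hypothetical judgment information: (max distance in metres, type) -> level.
# Rows are scanned in order, so nearer distance bands take precedence.
ALERT_JUDGMENT = [
    (300.0, "drone", 3),
    (300.0, "balloon", 2),
    (800.0, "drone", 2),
    (800.0, "balloon", 1),
]

def alert_level(distance_m: float, object_type: str) -> int:
    # Combine distance from the camera installation location and the
    # detected type, per the stored alert level judgment information.
    for max_distance_m, judged_type, level in ALERT_JUDGMENT:
        if distance_m <= max_distance_m and object_type == judged_type:
            return level
    return 0  # no matching entry: outside the judged range
```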
  • The image reflecting the detection results generated by the generation unit 53 may further include text indicating information such as the type of flying object detected and the alert level.
  • the output unit 54 outputs the generated image reflecting the detection results.
  • An example of an output destination is the display device 7 shown in Figure 1.
  • the display device 7 is a device that notifies (provides) information by displaying the information on a screen using text and images.
  • the display device 7 receives the image reflecting the detection results from the output unit 54 and provides the image reflecting the detection results to, for example, a user of the flying object detection system (monitoring system).
  • the display device 7 may be, for example, a display device of a terminal device carried by a user of the flying object detection system (monitoring system).
  • The captured images (wide-angle image, telephoto image) output by the wide-angle camera 2 and the telephoto camera 3 are moving images, and the detection process for the flying object is performed on frame images selected from the multiple frame images that make up each captured image.
  • the detection result reflection image generated by reflecting the results of this detection process is, for example, included in the moving image, which is a wide-angle image, by replacing the frame image of the original wide-angle image, and is output to the output destination by the output unit 54.
  • the display device 7 displays the moving image (wide-angle image) output by the output unit 54 in this way. In other words, the output unit 54 can be said to control the display operation of the display device 7.
  • If the user requests an enlarged display of a detected flying object, the output unit 54 may perform display control to enlarge the flying object 9 in response to the request, as shown in FIG. 6.
  • the method by which the user inputs the enlarged display request is not limited here, but an example is a method in which the enlarged display request is input to the detection device 5 by clicking the flying object detection graphic 8 with the cursor 71 as shown in FIG. 6.
  • the manner in which the flying object is enlarged is not limited here and may be an appropriately set display manner. For example, as shown in FIG. 6, a window image in which the flying object is enlarged may be superimposed on the wide-angle image, or the wide-angle image and the enlarged image of the flying object may be displayed side by side.
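One way to realize the enlarged display is to crop a window around the clicked detection and scale it up before superimposing it on the wide-angle image. A sketch of the cropping step on a grayscale pixel grid; the window half-size is an assumed parameter:

```python
def crop_window(frame: list[list[int]], cx: int, cy: int,
                half: int = 8) -> list[list[int]]:
    # Cut out the region around the clicked flying object detection graphic,
    # clipped to the frame border; the caller then enlarges the window and
    # either superimposes it on the wide-angle image or shows it alongside.
    height, width = len(frame), len(frame[0])
    x0, x1 = max(0, cx - half), min(width, cx + half + 1)
    y0, y1 = max(0, cy - half), min(height, cy + half + 1)
    return [row[x0:x1] for row in frame[y0:y1]]
```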
  • the detection device 5 of the first embodiment is configured as described above. Next, an example of the operation of the detection device 5 to detect an airborne object will be described with reference to Figure 7.
  • Figure 7 is a flowchart explaining an example of the operation of the detection device 5 to detect an airborne object. Figure 7 can also be considered a diagram explaining the method for detecting an airborne object in the detection device 5.
  • the storage device 40 in the detection device 5 stores various information (data) used for processing by the arithmetic unit (processor) 50 as described above.
  • the acquisition unit 51 of the detection device 5 acquires wide-angle images from the wide-angle camera 2 and telephoto images from the telephoto camera 3 (step 101 in Figure 7).
  • the wide-angle images and telephoto images are moving images.
  • The detection unit 52 determines whether the target flying object has been detected in at least one of the wide-angle image and the telephoto image through the detection process (step 103). If the target flying object is detected, the detection unit 52 calculates the position in the wide-angle image or telephoto image at which the flying object is detected as detection position information. Furthermore, if the captured image in which the flying object is detected is a telephoto image, the detection unit 52 calculates the position in the wide-angle image of the flying object detected from the telephoto image through processing to associate the telephoto image with the wide-angle image.
  • The generation unit 53 uses information from the detection process by the detection unit 52 to superimpose an airborne object detection graphic 8, representing the position of the detected airborne object in the image, onto the wide-angle image, thereby generating an image reflecting the detection result (step 104).
  • The output unit 54 then outputs the image reflecting the detection result (step 105).
  • Since the wide-angle image is a moving image, the output unit 54 outputs the wide-angle image including the detection result reflection image (frame image) to, for example, the display device 7.
  • The airborne object detection process described above is performed sequentially on each of multiple frame images selected from the frame images of the wide-angle image and the telephoto image.
  • When the detection unit 52 detects an airborne object, the generation unit 53 generates an image reflecting the detection result. When the detection result reflection image (frame image) is generated, the output unit 54 outputs a wide-angle image including the detection result reflection image to, for example, the display device 7.
  • The flying object detection process for that frame image is then terminated, and the system prepares for the flying object detection process for the next frame image to be selected.
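The per-frame flow described above (steps 101 to 105) can be sketched as follows. Here `detect`, `map_tele_to_wide`, and `draw_marker` are hypothetical stand-ins for the detection model, the telephoto-to-wide-angle association, and the graphic superimposition, none of which are specified in this excerpt.

```python
def process_frame(wide_frame, tele_frames, detect, map_tele_to_wide, draw_marker):
    """Run detection on one wide-angle frame and its synchronised telephoto
    frames, and return the wide-angle frame with detection graphics superimposed."""
    positions = []                       # detected positions in wide-angle coordinates
    positions += detect(wide_frame)      # steps 102-103: detect in the wide-angle image
    for cam_id, tele_frame in tele_frames.items():
        for pos in detect(tele_frame):   # detect in each telephoto image
            # step 103: convert a telephoto hit into wide-angle coordinates
            positions.append(map_tele_to_wide(cam_id, pos))
    result = wide_frame
    for pos in positions:                # step 104: superimpose detection graphics
        result = draw_marker(result, pos)
    return result                        # step 105: ready for output to the display
```

In a real system, `detect` would be the trained detection model and `draw_marker` a drawing routine; here they are injected so the flow itself is testable.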
  • The detection device 5 of the first embodiment, and the flying object detection system 1 equipped with the detection device 5, use telephoto images to detect target flying objects in the distant part of the detection area.
  • In a telephoto image, the target flying object is captured at a detectable size even if it is far from the camera installation location. Therefore, by using telephoto images, the flying object detection system 1 can prevent situations where the target flying object appears too small in the captured image to be detected, owing to its distance from the camera installation location, and is consequently missed.
  • The flying object detection system 1 uses wide-angle images to detect target flying objects in parts of the detection area other than the distant part.
  • A wide-angle type imaging device is able to capture parts of the detection area that are not clearly visible in a telephoto image (in other words, parts of the detection area other than the distant part).
  • The flying object detection system 1 can therefore detect target flying objects in parts of the detection area other than the distant part while preventing missed detections.
  • In this way, the detection device 5, and the flying object detection system 1 equipped with the detection device 5, can prevent the detection accuracy of flying objects in the detection area from decreasing due to the distance from the imaging device that captures the detection area.
  • The output unit 54 outputs a detection result reflection image based on a wide-angle image.
  • The output unit 54 may also output the telephoto image to the requestor in response to a request.
  • In that case, the output unit 54 outputs a telephoto image synchronized with the detection result reflection image being output.
  • The telephoto image output in this manner may be displayed on the display device 7 in place of the detection result reflection image, or the detection result reflection image and the telephoto image may be displayed side by side on the display device 7.
  • FIG. 9 is a schematic diagram showing an example of the capture ranges in the detection areas of the wide-angle camera 2 and the telephoto camera 3, viewed from the zenith.
  • The application of the flying object detection system in the present disclosure is not limited to surveillance systems.
  • The flying object detection system in the present disclosure may be applied to detect unmanned aircraft that are subject to management, as flying objects to be detected.
  • In the above, an air vehicle that is a man-made object was used as an example of the air vehicle to be detected, but a bird, for example, may also be set as an air vehicle to be detected.
  • If a bird collides with an aircraft, there is a risk of a serious accident, such as the aircraft crashing; likewise, a bird flying near an unmanned aircraft may interfere with its operation. For this reason, when the air vehicle detection system of the present disclosure is applied, for example, to an unmanned aircraft traffic management system, it may also detect birds as air vehicles to be detected.
  • In the above, the output unit 54 outputs the image reflecting the detection result generated by the generation unit 53 to the display device 7.
  • The output unit 54 may also output information indicating that an airborne object has been detected (airborne object detection information) to a predetermined notification destination other than the display device.
  • An example of the notification destination is a computer device of the monitoring system to which the airborne object detection system is applied.
  • Examples of the airborne object detection information include text information of a message indicating that an airborne object has been detected, and control information for an alarm sound that notifies the detection of an airborne object audibly.
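As one possible shape for such airborne object detection information, the sketch below packages a notification message together with an alarm-sound control flag as JSON. The field names and structure are purely illustrative assumptions, not part of the disclosure.

```python
import datetime
import json

def build_detection_notification(camera_id, position, detected_at=None):
    """Build a hypothetical airborne-object detection notification: a
    human-readable message plus a flag the receiving system (e.g. a monitoring
    system's computer device) could use to trigger an alarm sound."""
    detected_at = detected_at or datetime.datetime.now(datetime.timezone.utc)
    return json.dumps({
        "type": "airborne_object_detected",
        "message": f"Airborne object detected by camera {camera_id} at {position}",
        "position": {"x": position[0], "y": position[1]},
        "sound_alarm": True,                 # control information for the alarm sound
        "detected_at": detected_at.isoformat(),
    })
```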
  • The flying object detection device may also have a configuration such as that shown in FIG. 11. That is, the flying object detection device 10 is, for example, a computer device, and includes an acquisition unit 11, a detection unit 12, a generation unit 13, and an output unit 14 as functional units realized by executing a computer program.
  • The acquisition unit 11 acquires telephoto images, which are images captured by a telephoto type imaging device that captures distant parts of the detection area far from the installation location, out of multiple types of imaging devices with different angles of view that are installed at a common installation location to capture images of the detection area.
  • The acquisition unit 11 also acquires wide-angle images, which are images captured by a wide-angle type imaging device out of the multiple types of imaging devices.
  • The detection unit 12 detects target flying objects in the distant part of the detection area from the telephoto images.
  • The detection unit 12 also detects target flying objects in the parts of the detection area other than the distant part from the wide-angle images.
  • The generation unit 13 generates an image reflecting the detection results by superimposing information indicating the position of the detected flying object in the image onto the wide-angle image.
  • The output unit 14 outputs the image reflecting the detection results. Note that the acquisition unit 51, detection unit 52, generation unit 53, and output unit 54 of the detection device 5 in the first embodiment described above are examples of the acquisition unit 11, detection unit 12, generation unit 13, and output unit 14, respectively.
  • The flying object detection device 10 has the configuration described above. Together with the imaging devices 20 and 30, as shown by the dotted lines in Figure 11, the flying object detection device 10 forms a flying object detection system.
  • Figure 12 is a flowchart explaining an example of the operation of the flying object detection device 10.
  • Figure 12 can also be considered a diagram explaining an example of a flying object detection method used by the flying object detection device 10.
  • The detection unit 12 executes a detection process to detect the target flying object from each of the wide-angle image and the telephoto image (step 202). Then, if the target flying object is detected, the generation unit 13 generates an image reflecting the detection result by superimposing information representing the position of the detected flying object in the image on the wide-angle image (step 203). The output unit 14 outputs the generated image reflecting the detection result (step 204).
  • The flying object detection device 10 performs detection processing of the target flying object using telephoto images (i.e., images captured by a telephoto type imaging device) for the distant parts of the detection area. Therefore, even if the target flying object is flying in the distant parts, far from the installation location of the imaging device, the flying object detection device 10 can prevent the target flying object from being missed due to appearing small in the captured image, because the target flying object is magnified in the telephoto image.
  • For the other parts of the detection area, the flying object detection device 10 performs detection processing of the target flying object using wide-angle images, which are images captured by a wide-angle type imaging device.
  • A wide-angle type imaging device has a wide field of view that can compensate for the blind spots of a telephoto type imaging device, and, by focusing on detection area portions closer to the installation location than the distant portion, it can clearly capture the target flying object in those detection area portions.
  • In this way, the flying object detection device 10 can compensate for the shortcomings of using telephoto images, thereby preventing the detection accuracy of flying objects in the detection area from decreasing due to the distance from the imaging device capturing the detection area.
  • The flying object detection device described in Appendix 1, wherein the distant part of the detection area is photographed by a plurality of telephoto type imaging devices whose fields of view are shifted from each other, the acquisition unit acquires the telephoto images captured by each of the plurality of telephoto type imaging devices, and the detection unit detects the target flying object in the distant part of the detection area using the telephoto images captured by each of the plurality of telephoto type imaging devices.
  • The flying object detection device described in Appendix 1, wherein the detection unit detects multiple types of flying objects to be detected, distinguishing between the types.
  • The generation unit further generates an information-attached telephoto image by superimposing, on the telephoto image, information representing the detection result detected from the wide-angle image by the detection unit.
  • The flying object detection device described in Appendix 1, wherein the output unit performs display control to enlarge and display an image of the detected flying object on a display device in response to a request to enlarge the display of the image of the flying object.
  • The output unit outputs information indicating that an air vehicle has been detected to a predetermined notification destination other than a display device.
  • The acquisition unit acquires the telephoto and wide-angle images from a telephoto type imaging device and a wide-angle type imaging device whose camera settings are such that, when flying objects of the same size are captured in the distant part of the detection area and in a part of the detection area other than the distant part, the difference in the size of the flying objects as they appear in the telephoto image and the wide-angle image falls within a predetermined range;
  • and the detection unit performs the detection process on each of the telephoto and wide-angle images using the same detection model, generated by learning captured images of the flying object to be detected.
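The idea behind using a single detection model is that the camera settings keep the apparent size of a same-sized object comparable in both images. The sketch below illustrates this with a small-angle approximation; all numeric values (object width, distances, fields of view, the allowed ratio) are hypothetical.

```python
import math

def apparent_size_px(object_m, distance_m, hfov_deg, image_w_px):
    """Approximate width in pixels of an object of width `object_m` metres,
    seen at `distance_m` metres by a camera with horizontal field of view
    `hfov_deg` degrees (small-angle approximation)."""
    angle = object_m / distance_m                      # radians subtended (approx.)
    return angle / math.radians(hfov_deg) * image_w_px

def sizes_within_range(size_a, size_b, max_ratio=1.5):
    """True if the two apparent sizes differ by no more than `max_ratio`."""
    lo, hi = sorted((size_a, size_b))
    return hi / lo <= max_ratio
```

With these assumed numbers, a 0.5 m object at 800 m in a 20-degree telephoto image and the same object at 150 m in a 90-degree wide-angle image appear within roughly 20 percent of each other in pixel width, so one detection model can plausibly serve both; without the telephoto magnification the sizes diverge far beyond the range.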
  • Appendix 8: An aircraft detection system comprising: a telephoto type imaging device that captures a distant part of the detection area far from the installation location, among a plurality of types of imaging devices with different angles of view installed at a common installation location for capturing the detection area; a wide-angle type imaging device among the plurality of types of imaging devices that captures the detection area, including a portion not captured by the telephoto type imaging device; and the aircraft detection device described in Appendix 1, which uses the images captured by the telephoto type imaging device and the wide-angle type imaging device, respectively.
  • a telephoto image is acquired which is an image captured by a telephoto type imaging device which captures a distant portion of the detection area that is far from the installation location
  • a wide-angle image is acquired which is an image captured by a wide-angle type imaging device among the plurality of types of imaging devices
  • a process of acquiring a telephoto image which is an image captured by a telephoto type imaging device that captures a distant portion of the detection area that is far from the installation location, among multiple types of imaging devices with different angles of view that are installed at a common installation location to capture the detection area, and a wide-angle image, which is an image captured by a wide-angle type imaging device among the multiple types of imaging devices;


Abstract

In order to prevent the detection accuracy for a flying body in a detection region from decreasing due to the distance from an imaging device that captures the detection region, this flying body detecting device operates as follows. From among a plurality of types of imaging devices having different angles of view installed at a common installation location for capturing the detection region, the flying body detecting device acquires a telephoto image captured by a telephoto type imaging device that captures a distant portion of the detection region far from the installation location, and a wide-angle image captured by a wide-angle type imaging device. The flying body detecting device detects, from the telephoto image, a detection target flying body in the distant portion of the detection region, and detects, from the wide-angle image, a detection target flying body in the detection region portions other than the distant portion. The flying body detecting device then generates a detection result reflecting image by superimposing, on the wide-angle image, information indicating the position of the detected flying body in the image.

Description

Flying object detection device, flying object detection system, flying object detection method, and program storage medium

This disclosure relates to an airborne object detection device, an airborne object detection system, an airborne object detection method, and a program storage medium for detecting airborne objects.

No-fly zones (for example, the airspace above airports and important facilities and the surrounding areas) have been designated for unmanned aerial vehicles (small unmanned aircraft), also known as drones and UAVs (Unmanned Aerial Vehicles), and a flight permit is required for unmanned aircraft to fly in these no-fly zones. However, with the increasing use of unmanned aircraft, there are concerns that an increasing number of unmanned aircraft will enter these no-fly zones without permission.

Patent Document 1 (JP 2007-116666 A) discloses technology for efficiently monitoring a surveillance area spanning several kilometers or more, and for accurately capturing and photographing moving objects within the surveillance area. That is, with the technology shown in Patent Document 1, the entire surveillance area is photographed with a wide-angle camera. When a moving object is detected in an image captured by the wide-angle camera, the attitude of a telephoto camera is controlled to point its optical axis in the direction of the moving object. In other words, the telephoto camera photographs the moving object while tracking it.

Japanese Patent Application Laid-Open No. 2007-116666

In the technology disclosed in Patent Document 1, the presence or absence of moving objects in the entire monitoring area is first detected from images captured by a wide-angle camera. As a result, even if a moving object moves into a part of the monitoring area far from the wide-angle camera, it may not be possible to properly detect the moving object, because the distant moving object appears small in the image captured by the wide-angle camera. In other words, the accuracy of detecting moving objects decreases in parts of the monitoring area far from the imaging device.

This disclosure was conceived to solve such problems. In other words, the main purpose of this disclosure is to provide technology that prevents the detection accuracy of a flying object in a detection area from decreasing due to the distance from the imaging device that captures the detection area.

In order to achieve the above-mentioned object, the flying object detection device of the present disclosure, as one aspect thereof, comprises:
an acquisition unit that acquires a telephoto image, which is an image captured by a telephoto type imaging device that captures a distant part of the detection area far from the installation location, among multiple types of imaging devices with different angles of view installed at a common installation location to capture the detection area, and a wide-angle image, which is an image captured by a wide-angle type imaging device among the multiple types of imaging devices;
a detection unit that detects a target flying object in the distant part of the detection area from the telephoto image and detects a target flying object in parts of the detection area other than the distant part from the wide-angle image;
a generation unit that generates a detection result reflection image by superimposing information indicating the position of the detected flying object in the image on the wide-angle image; and
an output unit that outputs the detection result reflection image.

In addition, the flying object detection system of the present disclosure, as one aspect thereof, comprises:
a telephoto type imaging device that captures a distant part of the detection area far from the installation location, among a plurality of types of imaging devices with different angles of view installed at a common installation location for capturing the detection area;
a wide-angle type imaging device among the plurality of types of imaging devices that captures the detection area, including portions not captured by the telephoto type imaging device; and
the above-mentioned flying object detection device, which uses the images captured by the telephoto type imaging device and the wide-angle type imaging device, respectively.

Furthermore, the flying object detection method of the present disclosure, as one aspect thereof, includes, by a computer:
acquiring a telephoto image, which is an image captured by a telephoto type imaging device that captures a distant part of the detection area far from the installation location, among multiple types of imaging devices with different angles of view installed at a common installation location to capture the detection area, and a wide-angle image, which is an image captured by a wide-angle type imaging device among the multiple types of imaging devices;
detecting a target flying object in the distant part of the detection area from the telephoto image, and detecting a target flying object in parts of the detection area other than the distant part from the wide-angle image;
generating a detection result reflection image by superimposing information representing the position of the detected flying object in the image on the wide-angle image; and
outputting the detection result reflection image.

Furthermore, the program storage medium of the present disclosure, as one aspect thereof, stores a computer program that causes a computer to execute:
a process of acquiring a telephoto image, which is an image captured by a telephoto type imaging device that captures a distant part of the detection area far from the installation location, among multiple types of imaging devices with different angles of view installed at a common installation location to capture the detection area, and a wide-angle image, which is an image captured by a wide-angle type imaging device among the multiple types of imaging devices;
a process of detecting a target flying object in the distant part of the detection area from the telephoto image, and detecting a target flying object in parts of the detection area other than the distant part from the wide-angle image;
a process of generating a detection result reflection image by superimposing information representing the position of the detected flying object in the image on the wide-angle image; and
a process of outputting the detection result reflection image.

According to the present disclosure, it is possible to prevent the detection accuracy of a flying object in a detection area from decreasing due to the distance from the imaging device that captures the detection area.

FIG. 1 is a diagram illustrating the configuration of an embodiment of the flying object detection device according to the present disclosure.
FIG. 2 is a diagram illustrating the imaging devices that constitute the flying object detection system according to the present disclosure.
FIG. 3 is a diagram, together with FIG. 2, illustrating the imaging devices.
FIG. 4 is a diagram showing an example of the detection result reflection image.
FIG. 5 is a diagram showing another example of the detection result reflection image.
FIG. 6 is a diagram showing yet another example of the detection result reflection image.
FIG. 7 is a flowchart illustrating an example of the operation of the flying object detection device for detecting a flying object.
FIG. 8 is a diagram illustrating a modified example of the generation unit in the flying object detection device.
FIG. 9 is a diagram illustrating another embodiment.
FIG. 10 is a diagram illustrating yet another embodiment.
FIG. 11 is a diagram illustrating still another embodiment of the flying object detection device.
FIG. 12 is a flowchart illustrating an example of another operation of the flying object detection device for detecting a flying object.

Embodiments of this disclosure are described below with reference to the drawings.

<First Embodiment>
The flying object detection system 1 of the first embodiment according to the present disclosure is a system that detects a target flying object in a detection area using images captured by imaging devices, and includes multiple types of imaging devices 2 and 3 and a flying object detection device (hereinafter also simply referred to as a detection device) 5, as shown in Fig. 1. The detection device 5 is a computer device that detects flying objects in the detection area by image analysis of the images captured by the imaging devices 2 and 3.

In the first embodiment, the flying object detection system 1 is applied to a surveillance system. A surveillance system to which the flying object detection system 1 is applied is, for example, a system that monitors the airspace above and surrounding areas of important facilities such as airports and nuclear power plants, and the surveillance area includes no-fly zones where the flight of unmanned aerial vehicles is prohibited by law. Because the flying object detection system 1 is applied here to such a surveillance system, the detection area of the flying object detection system 1 is the surveillance area of the surveillance system to which it is applied.

The target air vehicles to be detected by the air vehicle detection system 1 are determined in advance by a system designer, taking into consideration, for example, the type of facility being monitored by the applicable monitoring system and its surrounding environment. Examples of air vehicles set as detection targets include unmanned aerial vehicles such as unmanned airplanes, unmanned rotorcraft, and unmanned airships, which can be flown by remote control or automatic piloting, as well as balloons, hang gliders, and paragliders, which are flown by humans using specific aviation equipment. In this way, various air vehicles can be set as detection targets, but in the first embodiment, so-called drones (unmanned aerial vehicles) are set as the air vehicles to be detected.

Several types of imaging devices with different angles of view are used as the imaging devices that make up the flying object detection system 1. One type of imaging device used in the flying object detection system 1 is a telephoto type imaging device equipped with a telephoto lens. Another type is a wide-angle type imaging device equipped with a wide-angle lens. The lenses provided in imaging devices can be classified into three types according to the angle of view: "standard lens," "wide-angle lens," and "telephoto lens." A "standard lens" is a lens with an angle of view of approximately 45 to 50 degrees, which is said to be close to the human field of view. A "wide-angle lens" is a lens with a wider angle of view than a "standard lens," for example, 60 degrees or more. A "telephoto lens" is a lens with a narrower angle of view than a "standard lens," for example, 30 degrees or less.
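The classification above can be expressed directly in code. Note that the text leaves the ranges between 30 and 45 degrees and between 50 and 60 degrees unspecified; this sketch simply treats everything between the telephoto and wide-angle thresholds as "standard."

```python
def classify_lens(angle_of_view_deg):
    """Classify a lens by its angle of view, following the thresholds given in
    the text: telephoto <= 30 degrees, wide-angle >= 60 degrees, and standard
    (around 45-50 degrees in the text) for everything in between."""
    if angle_of_view_deg <= 30:
        return "telephoto"
    if angle_of_view_deg >= 60:
        return "wide-angle"
    return "standard"
```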

In the following description, the wide-angle type imaging device will also be referred to as the wide-angle camera 2, and the telephoto type imaging device will also be referred to as the telephoto camera 3.

In the flying object detection system 1, the wide-angle camera 2 and the telephoto camera 3 are installed in a common installation location from which they can capture the detection area, with their shooting directions fixed. For example, the wide-angle camera 2 and the telephoto camera 3 are installed at a height H(y) of approximately 10 meters, allowing them to view the detection area as shown in Figure 2. Figure 2 shows a schematic diagram of an example of the capture ranges of the wide-angle camera 2 and the telephoto camera 3 in the detection area, viewed from a direction along the ground surface. In Figure 2, the portion of the field of view of the wide-angle camera 2 that captures the detection area is represented by the hatched area Zw, and the portion of the detection area captured by the telephoto camera 3 is represented by the lightly shaded area Zt. Note that unmanned aerial vehicles (such as drones) are subject to flight altitude restrictions, such as a prohibition on flying in airspace more than 150 meters above the ground. The example in Figure 2 shows a detection area that takes such altitude restrictions on flying objects (unmanned aerial vehicles) into account.

The telephoto camera 3 captures the distant part of the detection area that is far from the installation location of the telephoto camera 3. For example, the camera settings (magnification, etc.) of the telephoto camera 3 are configured to capture a distant part of the detection area at a distance D(z) of approximately 800 meters from the installation location. Here, the image captured by the telephoto camera 3 is also referred to as a telephoto image. The telephoto image is used by the detection device 5 to detect the target flying object in the distant part of the detection area.

Incidentally, the telephoto camera 3 can capture magnified images of distant objects. In other words, the field of view of the telephoto camera 3 is narrow. For this reason, the range of the distant part that the telephoto camera 3 can capture is limited. Figure 3 shows a schematic diagram of an example of the capture ranges of the wide-angle camera 2 and the telephoto camera 3 in the detection area, viewed from the zenith. In Figure 3, as in Figure 2, the portion of the field of view of the wide-angle camera 2 that captures the detection area is represented by the hatched portion Zw, and the portion of the detection area captured by the telephoto camera 3 is represented by the lightly shaded portion Zt.

 As shown in Figure 3, there are cases where a single telephoto camera 3 cannot capture the entire distant portion of the detection area. In such cases, in the flying object detection system 1, multiple telephoto cameras 3 are installed at the same installation location, and their shooting directions are set so that their shooting ranges are shifted relative to one another and the entire distant portion is captured by the multiple telephoto cameras 3. In other words, the distant portion of the detection area is captured by multiple telephoto-type imaging devices with mutually shifted fields of view. Note that in the example of Figure 3 the shooting directions of the multiple telephoto cameras 3 are set so that the shooting ranges are shifted in a direction along the ground surface, but depending on the height H(d) of the detection area above the ground surface and the width of the field of view of the telephoto cameras 3, the shooting directions may instead be set so that the shooting ranges are shifted in the height direction. Furthermore, the number of telephoto cameras 3 installed depends on the extent of the distant portion of the detection area and the width of the field of view of the telephoto cameras 3, and is not limited.

 The wide-angle camera 2 has a field of view that covers the portions of the detection area not captured by the telephoto camera 3. Here, an image captured by the wide-angle camera 2 is also referred to as a wide-angle image. The wide-angle image is used to detect target flying objects in the portion of the detection area closer to the installation location than the distant portion captured by the telephoto camera 3 (the portion of the detection area other than the distant portion, hereinafter also referred to as the portion covered by the wide-angle camera). For this reason, the camera settings of the wide-angle camera 2 are configured so that flying objects in the portion of the detection area covered by the wide-angle camera are clearly visible. Depending on the size of the detection area, multiple wide-angle cameras 2 may be installed at the same installation location, with their shooting directions set so that their shooting ranges are shifted relative to one another and the entire detection area is captured by the multiple wide-angle cameras 2.

 Here, suppose that target flying objects of the same size, flying in the portion covered by the wide-angle camera and in the distant portion of the detection area respectively, are captured by the wide-angle camera 2 and the telephoto camera 3. In this flying object detection system 1, the cameras are set so that, in such a case, the difference between the size of the target flying object appearing in the telephoto image and the size of the target flying object appearing in the wide-angle image falls within a predetermined range (tolerance). Preferably, the cameras are set so that the size of the target flying object appearing in the telephoto image is the same as its size in the wide-angle image. Such camera settings are related to reducing the load of the detection processing for the target flying object in the detection device 5. The tolerance for the difference between the sizes of the flying object appearing in the telephoto image and in the wide-angle image is determined with this load reduction in mind.
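 The size-matching condition above can be understood through simple pinhole-camera geometry: the apparent size of an object scales with focal length divided by distance, so choosing the telephoto focal length in proportion to its greater target distance keeps the two apparent sizes within tolerance. The following sketch only illustrates this relationship; the drone size, distances, focal lengths, and tolerance are all illustrative assumptions, not values from this disclosure.

```python
# Illustrative sketch (assumed values, pinhole-camera approximation):
# the on-image size of an object is roughly focal_px * object_size_m / distance_m,
# so matching apparent sizes across the two cameras means matching the
# ratio focal_px / distance_m.

def apparent_size_px(object_size_m: float, distance_m: float, focal_px: float) -> float:
    """Approximate on-image size of an object under a pinhole model."""
    return focal_px * object_size_m / distance_m

def sizes_within_tolerance(size_a: float, size_b: float, tolerance: float) -> bool:
    """Check that two apparent sizes differ by at most `tolerance` (as a ratio)."""
    return abs(size_a - size_b) / max(size_a, size_b) <= tolerance

drone_m = 0.5  # hypothetical drone width in metres
wide = apparent_size_px(drone_m, distance_m=200.0, focal_px=2000.0)
tele = apparent_size_px(drone_m, distance_m=800.0, focal_px=8000.0)

# The telephoto focal length is 4x longer because its target distance is
# 4x greater, so both apparent sizes come out to 5.0 pixels here.
matched = sizes_within_tolerance(wide, tele, tolerance=0.1)
```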

 The detection device 5 is a computer device that detects target flying objects by analyzing the wide-angle images from the wide-angle camera 2 and the telephoto images from the telephoto camera 3. As shown in Figure 1, the detection device 5 is directly or indirectly connected to each of the wide-angle camera 2 and the telephoto camera 3, and includes an arithmetic unit 50 and a storage device 40. The storage device 40 includes a storage medium that stores data and a computer program (hereinafter also referred to as a program) 41. There are many types of storage devices, such as magnetic disk devices and semiconductor memory elements, and semiconductor memory elements in turn include several types such as RAM (Random Access Memory) and ROM (Read Only Memory). A computer device is equipped with multiple types of storage devices according to its intended use, but here these storage devices are collectively referred to as the storage device 40 without distinction. Furthermore, the types and number of storage devices 40 included in the detection device 5 are not limited, and their description is omitted. Note that the detection device 5 may also be connected to a database 6, which is a storage device. In this case, the detection device 5 may write information to and read information from the database 6, but to avoid complicating the explanation, such cases are not described here.

 The arithmetic unit 50 is composed of processors such as a CPU (Central Processing Unit) and a GPU (Graphics Processing Unit). By reading and executing the program 41 stored in the storage device 40, the arithmetic unit 50 can have functions based on the program 41. Here, the arithmetic unit 50 has an acquisition unit 51, a detection unit 52, a generation unit 53, and an output unit 54 as functional units related to the detection of flying objects.

 The acquisition unit 51 acquires the wide-angle images captured by the wide-angle camera 2 and the telephoto images captured by the telephoto camera 3. The method by which the acquisition unit 51 acquires the wide-angle and telephoto images is not limited here; for example, the acquisition unit 51 may acquire the captured images (wide-angle and telephoto images) directly from the wide-angle camera 2 and the telephoto camera 3, or may acquire them by reading out wide-angle and telephoto images from each camera that have been temporarily stored in a database (not shown).

 The detection unit 52 detects target flying objects in the distant portion of the detection area from the telephoto image, and detects target flying objects in the portion covered by the wide-angle camera (the portion of the detection area other than the distant portion) from the wide-angle image. Various methods have been proposed for detecting target flying objects from captured images (wide-angle or telephoto images); one example, though not a limitation, is a detection method using AI (Artificial Intelligence) technology. In this case, a detection model generated by AI technology is used. This detection model is generated by learning images of the target flying objects as training data. The input to the detection model is a captured image (a wide-angle or telephoto image), and the output from the detection model includes information indicating the presence or absence of a target flying object in the input image and, when a target flying object is detected, information indicating the position of the detected flying object in the captured image (hereinafter also referred to as detection position information).
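 The disclosure does not fix a particular model architecture, so the following minimal Python sketch only illustrates the input/output contract just described; `FakeModel` and the field names are hypothetical stand-ins for a model learned from training images of the target flying objects.

```python
from dataclasses import dataclass

# Sketch of the detection-model interface: input is a captured image,
# output is presence/absence plus detection position information
# (and optionally a type, if the model distinguishes types).

@dataclass
class Detection:
    x: int       # in-image position of the detected flying object
    y: int
    kind: str    # type of flying object (hypothetical field)

class FakeModel:
    """Stand-in for a learned detection model (hypothetical)."""
    def __call__(self, image):
        # A real model would analyse `image`; here we pretend one drone is found.
        return [Detection(x=120, y=45, kind="drone")]

def run_detection(model, image):
    """Return (found, detection position information) for one captured image."""
    detections = model(image)
    return (len(detections) > 0, detections)

found, dets = run_detection(FakeModel(), image=None)
```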

 In this flying object detection system 1, as described above, the wide-angle camera 2 and the telephoto camera 3 are set so that, when target flying objects of the same size are captured in the portion covered by the wide-angle camera and in the distant portion of the detection area respectively, the sizes of the flying object's image in the wide-angle image and in the telephoto image fall within a predetermined range (tolerance). In other words, the cameras are set so that image analysis of the wide-angle image and image analysis of the telephoto image can be handled by the same detection model. As a result, in the detection processing by the detection unit 52, the same detection model is used for both the wide-angle and the telephoto image. Note that when multiple types of flying objects are set as detection targets, a detection model that distinguishes and detects each of these types is generated by learning images of those types. The detection unit 52 may use such a model, in which case it can also output information indicating the type of the detected flying object.

 The detection unit 52 uses such a detection model to detect target flying objects from both the wide-angle image and the telephoto image. Furthermore, when a flying object is detected from the telephoto image, that object should also appear in the wide-angle image, so the detection unit 52 associates the flying object detected in the telephoto image with its counterpart in the wide-angle image. This processing uses positional relationship data between the telephoto and wide-angle images that associates image portions showing the same real space. This data is generated in advance and stored in the storage device 40.
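 One common concrete form that such positional relationship data can take is a planar homography relating pixel coordinates in the two views. The disclosure does not specify the form of the data, so the following is only an illustrative sketch; the scale-and-offset matrix `H` assumes, hypothetically, that the telephoto view covers a 4x-magnified patch whose top-left corner sits at pixel (800, 100) of the wide-angle image.

```python
# Sketch of mapping a detection position from a telephoto image into the
# wide-angle image using pre-computed positional-relationship data,
# here modelled as a 3x3 homography applied in homogeneous coordinates.

def map_point(h, x, y):
    """Apply 3x3 homography `h` to telephoto pixel (x, y); return wide-angle pixel."""
    xs = h[0][0] * x + h[0][1] * y + h[0][2]
    ys = h[1][0] * x + h[1][1] * y + h[1][2]
    w = h[2][0] * x + h[2][1] * y + h[2][2]
    return (xs / w, ys / w)

# Hypothetical relationship data (assumed values).
H = [[0.25, 0.0, 800.0],
     [0.0, 0.25, 100.0],
     [0.0, 0.0, 1.0]]

wide_xy = map_point(H, 400, 200)
```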

 Information representing the detection results of the detection unit 52 is stored in the storage device 40 in association with, for example, information identifying the imaging device (wide-angle camera 2 or telephoto camera 3) that captured the processed image, information representing the frame number of the processed image, and information on the shooting time.

 In the flying object detection system 1, as described above, the detection processing for target flying objects is executed using the same detection model for both wide-angle and telephoto images. This reduces the processing load on the detection device 5 compared with processing wide-angle and telephoto images with separate detection models, and also shortens the time required for the detection processing. This contributes to realizing real-time detection of flying objects using the images captured by the wide-angle camera 2 and the telephoto camera 3.

 The generation unit 53 generates a detection result reflection image. The detection result reflection image is an image in which information representing the in-image position of a detected flying object is superimposed on the wide-angle image. Here, this information is represented, for example, by a graphic. The graphic representing the in-image position of a detected flying object (hereinafter also referred to as a flying object detection graphic) is not limited here; it may be, for example, a circle, a triangle, a square, a symbol, or a mark resembling a flying object, and is set appropriately by the system designer or the like. The information used in generating the detection result reflection image is the position, in the wide-angle image, of a flying object detected from the wide-angle image (i.e., the detection position information), and the position, in the wide-angle image, of a flying object detected from the telephoto image, obtained by the association processing between the telephoto and wide-angle images. The wide-angle and telephoto images are synchronized so that their shooting times match. As a result, the flying object detection graphic representing the in-image position of a flying object detected from the telephoto image is superimposed on the wide-angle image captured at the same time as the telephoto image in which the flying object was detected.

 Figures 4 and 5 each show an example of a detection result reflection image. In the example of Figure 4, the detection result reflection image is an image in which a flying object detection graphic 8, a double circle, is superimposed on the wide-angle image. The example of Figure 5 shows a detection result reflection image when multiple types of flying objects are detected: a star-shaped flying object detection graphic 8 corresponding to one detected type and a double-circle flying object detection graphic 8 corresponding to another detected type are superimposed on the wide-angle image. In Figures 4 and 5, the detected flying objects are denoted by the reference sign "9". When multiple types of flying objects are detected, the same flying object detection graphic 8 may instead be superimposed on the wide-angle image regardless of the detected type.

 The alert level may also differ depending on the flight position, type, and other attributes of the detected flying object. In such cases, a type of flying object detection graphic 8 may be predetermined for each alert level, and the generation unit 53 may generate the detection result reflection image by superimposing the flying object detection graphic 8 of the type corresponding to the alert level on the wide-angle image. Here, the alert level is calculated, for example, by the detection unit 52. That is, when different alert levels are set depending on the distance from the installation location of the wide-angle camera 2 and the telephoto camera 3 (the camera installation location), relationship data between the distance from the camera installation location to the flying object and the alert level is stored in the storage device 40 as alert level judgment information used to determine the alert level. When the alert level is set depending on the type of the target flying object, relationship data between the flying object type and the alert level is stored in the storage device 40 as alert level judgment information. Furthermore, when the alert level is set based on the combination of the distance from the camera installation location to the flying object and the flying object type, relationship data between that combination and the alert level is stored in the storage device 40 as alert level judgment information. When a target flying object is detected, the detection unit 52 calculates the alert level of the detected flying object using one or both of the distance from the camera installation location to the detected flying object and the flying object type, together with the alert level judgment information stored in the storage device 40. The generation unit 53 then generates the detection result reflection image by superimposing a flying object detection graphic 8 of the type corresponding to the calculated alert level on the wide-angle image as described above. Note that the method of calculating the distance between a flying object detected from the wide-angle or telephoto image and the camera installation location is not limited here, and its description is omitted.
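 The alert level judgment information described above amounts to lookup tables relating distance and/or type to a level. The following sketch illustrates one way such a lookup could work; the thresholds, type names, and level names are all invented for illustration, whereas in the text the relationship data is simply stored in the storage device 40 in advance.

```python
# Hypothetical alert level judgment information (assumed values).
DISTANCE_RULES = [(300.0, "high"), (600.0, "medium"), (float("inf"), "low")]
TYPE_RULES = {"drone": "high", "bird": "low"}

def alert_level(distance_m=None, kind=None):
    """Combine distance- and type-based alert levels; the higher level wins."""
    levels = []
    if distance_m is not None:
        # First rule whose distance limit is not exceeded applies.
        levels.append(next(lvl for limit, lvl in DISTANCE_RULES if distance_m <= limit))
    if kind is not None:
        levels.append(TYPE_RULES.get(kind, "low"))
    order = {"low": 0, "medium": 1, "high": 2}
    return max(levels, key=order.__getitem__) if levels else "low"
```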

 Furthermore, the detection result reflection image generated by the generation unit 53 may further include text representing information such as the type of the detected flying object and the alert level.

 The output unit 54 outputs the generated detection result reflection image. One example of an output destination is the display device 7 shown in Figure 1. The display device 7 is a device that reports (provides) information by displaying it on a screen as text and images. Here, the display device 7 receives the detection result reflection image from the output unit 54 and provides it to, for example, a user of the flying object detection system (monitoring system). The display device 7 may also be, for example, the display of a terminal device carried by such a user.

 Here, the captured images (wide-angle and telephoto images) output by the wide-angle camera 2 and the telephoto camera 3 are moving images, and the flying object detection processing is performed on frame images selected from among the frame images that constitute a captured image. The detection result reflection image generated by reflecting the results of this processing is, for example, included in the moving image that is the wide-angle image by replacing the original wide-angle frame image, and is output by the output unit 54 toward the output destination. The display device 7 displays the moving image (wide-angle image) thus output by the output unit 54. In other words, the output unit 54 can be said to control the display operation of the display device 7.

 When, for example, a user inputs to the detection device 5 a request to display an enlarged image of a detected flying object on the display device 7 (an enlarged display request), the output unit 54 may perform display control that enlarges the flying object 9 in response, as shown in Figure 6. The method by which the user inputs the enlarged display request is not limited here; one example is clicking the flying object detection graphic 8 with a cursor 71 as shown in Figure 6. The manner of the enlarged display is also not limited here and may be set appropriately; for example, a window image showing the enlarged flying object may be superimposed on the wide-angle image as shown in Figure 6, or the wide-angle image and the enlarged image of the flying object may be displayed side by side.

 The detection device 5 of the first embodiment is configured as described above. Next, an example of the operation of the detection device 5 relating to the detection of flying objects will be described with reference to Figure 7. Figure 7 is a flowchart explaining an example of this operation; it can also be regarded as a diagram explaining the flying object detection method of the detection device 5.

 For example, assume that the storage device 40 of the detection device 5 stores the various information (data) used in the processing of the arithmetic unit (processor) 50 described above. The acquisition unit 51 of the detection device 5 acquires a wide-angle image from the wide-angle camera 2 and a telephoto image from the telephoto camera 3 (step 101 in Figure 7). In the following explanation, the wide-angle and telephoto images are assumed to be moving images.

 The detection unit 52 executes detection processing to detect target flying objects in each of the acquired wide-angle and telephoto images (step 102). Here, the detection processing is executed on frame images selected at every predetermined number of frames from among the frame images constituting the wide-angle and telephoto images.
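 The frame selection in step 102 can be sketched as follows; the interval of 5 frames is an illustrative assumption, as the disclosure only says a predetermined number of frames.

```python
# Sketch of selecting every N-th frame of a moving image for detection
# processing (the "predetermined number of frames" in the text).

def select_frames(frame_numbers, every_n):
    """Keep the first frame of every group of `every_n` frames."""
    return [f for i, f in enumerate(frame_numbers) if i % every_n == 0]

selected = select_frames(list(range(12)), every_n=5)
```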

 The detection unit 52 then determines whether a target flying object has been detected in at least one of the wide-angle and telephoto images by the detection processing (step 103). If a target flying object has been detected, the detection unit 52 calculates, as detection position information, the position at which the image of the detected flying object appears in the wide-angle or telephoto image in which it was detected. Furthermore, if the captured image in which the flying object was detected is a telephoto image, the detection unit 52 calculates the position in the wide-angle image of the flying object detected from the telephoto image through the association processing between the telephoto and wide-angle images.

 Thereafter, the generation unit 53 uses the information from the detection processing of the detection unit 52 to generate a detection result reflection image by superimposing a flying object detection graphic 8 representing the in-image position of the detected flying object on the wide-angle image (step 104). The output unit 54 then outputs the detection result reflection image (step 105). Here, since the wide-angle image is a moving image, a wide-angle image including the detection result reflection image (frame image) is output by the output unit 54 to, for example, the display device 7. That is, the flying object detection processing described above is executed sequentially on each of multiple frame images selected from the frame images of the wide-angle and telephoto images. When the detection unit 52 detects a flying object, the generation unit 53 generates a detection result reflection image, and when a detection result reflection image (frame image) has been generated, the wide-angle image including it is output by the output unit 54 to, for example, the display device 7.
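 The per-frame flow of steps 101 to 105 can be sketched as a single function; the detector, the telephoto-to-wide-angle mapping, and the frame values below are trivial hypothetical stand-ins, and only the control flow mirrors the text.

```python
# Sketch of one pass of Figure 7: detect in both images, map telephoto
# detections into wide-angle coordinates, and either return a
# detection-result structure (step 104) or None (no detection at step 103).

def process_frame(wide_frame, tele_frame, detect, tele_to_wide):
    positions = detect(wide_frame)                               # step 102 (wide-angle)
    positions += [tele_to_wide(p) for p in detect(tele_frame)]   # step 102 + association
    if not positions:                                            # step 103: not detected
        return None
    return {"frame": wide_frame, "markers": positions}           # step 104

# Hypothetical stand-ins for the detection model and relationship data.
detect = lambda frame: [(10, 20)] if frame == "has_drone" else []
tele_to_wide = lambda p: (p[0] * 0.25 + 800, p[1] * 0.25 + 100)

result = process_frame("empty", "has_drone", detect, tele_to_wide)
```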

 On the other hand, if no target flying object is detected in the selected frame image, the flying object detection processing for that frame image ends, and the device prepares for the detection processing of the next selected frame image.

 As described above, the detection device 5 of the first embodiment and the flying object detection system 1 including it use telephoto images to detect target flying objects in the distant portion of the detection area. In a telephoto image, a target flying object can be captured at a detectable size even when it is far from the camera installation location. By using telephoto images, the flying object detection system 1 can therefore suppress situations in which a target flying object is missed because it appears too small in the captured image, due to its distance from the camera installation location, to be detected.

 Furthermore, the flying object detection system 1 uses wide-angle images to detect target flying objects in the portions of the detection area other than the distant portion. A wide-angle imaging device can capture the parts of the detection area that a telephoto image cannot capture well (in other words, the portions of the detection area other than the distant portion). By using wide-angle images from such a device, the flying object detection system 1 can detect target flying objects in those portions while preventing missed detections.

 Thus, by using wide-angle and telephoto images, the detection device 5 and the flying object detection system 1 including it can prevent the detection accuracy for flying objects in the detection area from decreasing due to the distance from the imaging devices that capture the detection area.

 <Other embodiments>
 The present disclosure is not limited to the first embodiment and may be embodied in various ways. For example, in the first embodiment, the output unit 54 outputs a detection result reflection image based on a wide-angle image. In addition, when, for example, a user of the flying object detection system 1 requests the provision of a telephoto image, the output unit 54 may output the telephoto image to the requester in response. In this case, the output unit 54 outputs, for example, a telephoto image synchronized with the detection result reflection image being output. The telephoto image thus output may, for example, be displayed on the display device 7 in place of the detection result reflection image, or the detection result reflection image and the telephoto image may be displayed side by side on the display device 7.

 When a telephoto image is output, the generation unit 53 may also generate a telephoto image with information, as follows. The information on detection results obtained from the wide-angle image cannot be known just by looking at the telephoto image. A telephoto image with information is a telephoto image on which such detection results from the wide-angle image are superimposed. The information representing those detection results may take the form of, for example, text as shown in Figure 8, or an image of the flying object detected from the wide-angle image. Such a telephoto image with information can also provide information on detection results from the wide-angle image that do not appear in the telephoto image itself.

 Furthermore, in the example of FIG. 3, two telephoto cameras 3 are used to capture the distant part of the detection area; however, depending on the size of the distant part and the field of view of the telephoto cameras 3, three or more telephoto cameras 3 may be used to capture the distant part, as shown in FIG. 9, for example. Note that, like FIG. 3, FIG. 9 is a schematic diagram, viewed from the zenith, showing an example of the imaging ranges of the wide-angle camera 2 and the telephoto cameras 3 within the detection area.

 Furthermore, while the first embodiment shows an example in which the flying object detection system is applied to a surveillance system, the application of the flying object detection system of the present disclosure is not limited to surveillance systems. For example, in a traffic management system for unmanned aircraft such as drones operating in the area surrounding a logistics hub, the flying object detection system of the present disclosure may be applied to detect the managed unmanned aircraft as flying objects to be detected.

 Furthermore, in the first embodiment, a man-made flying object was used as an example of the flying object to be detected, but a bird, for example, may also be set as a flying object to be detected. If a bird collides with an aircraft, a serious accident, such as the aircraft crashing, may occur. A bird flying near an aircraft poses such a risk and may interfere with the operation of unmanned aircraft. For this reason, when the flying object detection system of the present disclosure is applied to an unmanned aircraft traffic management system, for example, birds may also be detected as flying objects to be detected. Then, when a bird (a flying object to be detected) is detected around a managed aircraft, which is itself a flying object to be detected, information indicating that a bird is flying nearby and poses a risk may be superimposed on the wide-angle image, thereby generating and outputting a detection result reflection image such as that shown in FIG. 10.

 Furthermore, in the first embodiment, the output unit 54 outputs the detection result reflection image generated by the generation unit 53 to the display device 7. In addition, when a flying object is detected by the detection unit 52, the output unit 54 may output information indicating that a flying object has been detected (flying object detection information) to a predetermined notification destination other than the display device. One example of such a notification destination is a computer device of the surveillance system to which the flying object detection system is applied. Examples of the flying object detection information include text information of a message indicating that a flying object has been detected, and control information for an alarm sound notifying the detection of a flying object.

 Furthermore, the flying object detection device may also adopt a configuration such as that shown in FIG. 11. That is, the flying object detection device 10 is, for example, a computer device, and includes an acquisition unit 11, a detection unit 12, a generation unit 13, and an output unit 14 as functional units realized by executing a computer program. The acquisition unit 11 acquires a telephoto image, which is an image captured by a telephoto type imaging device that captures the distant part of the detection area far from the installation location, among multiple types of imaging devices with different angles of view installed at a common installation location to capture the detection area. The acquisition unit 11 also acquires a wide-angle image, which is an image captured by a wide-angle type imaging device among the multiple types of imaging devices.

 The detection unit 12 detects flying objects to be detected in the distant part of the detection area from the telephoto image, and detects flying objects to be detected in the part of the detection area other than the distant part from the wide-angle image.
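As a sketch of this division of labor, assuming hypothetical names and a caller-supplied detector function, the routing of images to region-specific detection might look like the following; this is an illustration, not the actual implementation:

```python
# Illustrative sketch only: far-region detection runs on every telephoto
# image, near-region detection on the single wide-angle image.

def detect_flying_objects(wide_image, telephoto_images, detect_fn):
    """Collect detections: the distant part of the detection area is
    covered by the telephoto images, the rest by the wide-angle image."""
    results = []
    for tele in telephoto_images:
        results.extend(detect_fn(tele, region="far"))
    results.extend(detect_fn(wide_image, region="near"))
    return results
```

Injecting `detect_fn` also reflects the point made in Appendix 7 that the same detection model can be applied to both image types.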

 The generation unit 13 generates a detection result reflection image by superimposing, on the wide-angle image, information representing the position of the detected flying object in the image. The output unit 14 outputs the detection result reflection image. Note that the acquisition unit 51, detection unit 52, generation unit 53, and output unit 54 of the detection device 5 in the first embodiment described above are examples of the acquisition unit 11, detection unit 12, generation unit 13, and output unit 14, respectively.

 The flying object detection device 10 has the configuration described above. Together with the imaging devices 20 and 30, shown by dotted lines in FIG. 11, the flying object detection device 10 forms a flying object detection system.

 Next, an example of the operation of the flying object detection device 10 will be described with reference to FIG. 12. FIG. 12 is a flowchart explaining an example of the operation of the flying object detection device 10. FIG. 12 can also be regarded as a diagram explaining an example of the flying object detection method performed by the flying object detection device 10.

 For example, when the acquisition unit 11 acquires a wide-angle image and a telephoto image (step 201), the detection unit 12 executes detection processing to detect flying objects to be detected from each of the wide-angle image and the telephoto image (step 202). Then, if a flying object to be detected is detected, the generation unit 13 generates a detection result reflection image by superimposing, on the wide-angle image, information representing the position of the detected flying object in the image (step 203). The output unit 14 outputs the generated detection result reflection image (step 204).
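The four steps above (steps 201 to 204) can be sketched as one processing cycle. This is only a hypothetical illustration of the control flow, with the actual acquisition, detection, generation, and output logic injected as functions:

```python
# Illustrative sketch only: one cycle of the flow in FIG. 12.

def run_detection_cycle(acquire, detect, generate, output):
    wide_image, telephoto_images = acquire()           # step 201
    detections = detect(wide_image, telephoto_images)  # step 202
    if not detections:
        return None                                    # nothing to reflect
    reflected = generate(wide_image, detections)       # step 203
    output(reflected)                                  # step 204
    return reflected
```

Note that, consistent with the text, the detection result reflection image is generated and output only when a flying object to be detected has actually been detected.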

 As described above, the flying object detection device 10 performs detection processing of flying objects to be detected in the distant part of the detection area using telephoto images (that is, images captured by a telephoto type imaging device). Because a flying object flying in the distant part, far from the installation location of the imaging devices, is captured at a magnified scale in the telephoto image, the flying object detection device 10 can prevent detection failures caused by the target appearing small in the captured image.

 Furthermore, for the part of the detection area other than the distant part (in other words, the part of the detection area closer to the installation location of the imaging devices than the distant part), the flying object detection device 10 performs detection processing using the wide-angle image captured by the wide-angle type imaging device. Because the telephoto type imaging device focuses on the distant part, in the part of the detection area closer to the installation location it may be out of focus or leave large blind spots, so flying objects there cannot be captured properly in the telephoto image. In contrast, the wide-angle type imaging device has a wide field of view that can compensate for the blind spots of the telephoto type imaging device, and by focusing on the part of the detection area closer to the installation location, it can clearly capture flying objects to be detected in that part. By using the wide-angle image in this way, the flying object detection device 10 compensates for the shortcomings of the telephoto image, and can therefore prevent the detection accuracy for flying objects in the detection area from decreasing with distance from the imaging devices capturing that area.

A part or all of the above-described embodiments can be described as, but not limited to, the following supplementary notes.
[Appendix 1]
an acquisition unit that acquires a telephoto image, which is an image captured by a telephoto type imaging device that captures a distant part of the detection area that is far from the installation location, among multiple types of imaging devices with different angles of view that are installed at a common installation location to capture images of the detection area, and a wide-angle image, which is an image captured by a wide-angle type imaging device among the multiple types of imaging devices;
a detection unit that detects a target flying object in a distant part of the detection area from a telephoto image and detects a target flying object in a part of the detection area other than the distant part from a wide-angle image;
a generation unit that generates an image reflecting the detection result by superimposing information indicating the position of the detected flying object in the image on the wide-angle image;
A flying object detection device having an output unit that outputs the image reflecting the detection result.
[Appendix 2]
The distant part of the detection area is photographed by a plurality of telephoto type photographing devices whose fields of view are shifted from each other,
the acquisition unit acquires telephoto images captured by each of the plurality of telephoto type imaging devices,
The flying object detection device described in Appendix 1, wherein the detection unit detects the target flying object using telephoto images taken by each of the multiple telephoto type imaging devices in the distant part of the detection area.
[Appendix 3]
The flying object detection device according to Appendix 1, wherein the detection unit detects a plurality of types of flying objects to be detected, distinguishing between the types.
[Appendix 4]
The flying object detection device according to Appendix 1, wherein the generation unit further generates an information-attached telephoto image by superimposing, on the telephoto image, information representing the detection result detected from the wide-angle image by the detection unit.
[Appendix 5]
The flying object detection device according to Appendix 1, wherein the output unit performs display control to enlarge and display an image of the detected flying object on a display device in response to a request to enlarge and display the image of the flying object.
[Appendix 6]
The flying object detection device according to Appendix 1, wherein the output unit outputs information indicating that a flying object has been detected to a predetermined notification destination other than a display device.
[Appendix 7]
The flying object detection device according to Appendix 1, wherein the telephoto and wide-angle images acquired by the acquisition unit are captured by the telephoto type imaging device and the wide-angle type imaging device, whose camera settings are made such that, when flying objects to be detected of the same size, located respectively in the distant part of the detection area and in the part of the detection area other than the distant part, appear in the telephoto image and the wide-angle image, the difference in size between the flying objects appearing in the two images falls within a predetermined range, and
wherein the detection unit executes flying object detection processing on each of the telephoto image and the wide-angle image using the same detection model, generated by learning captured images in which a flying object to be detected appears.
[Appendix 8]
a telephoto type imaging device that captures an image of a distant part of the detection area that is far from the installation location, among a plurality of types of imaging devices with different angles of view that are installed at a common installation location for capturing an image of the detection area;
a wide-angle type imaging device among the plurality of types of imaging devices that captures images of a detection area including a portion that is not captured by the telephoto type imaging device; and
A flying object detection system comprising the flying object detection device according to Appendix 1, which uses the images captured by the telephoto type imaging device and the wide-angle type imaging device, respectively.
[Appendix 9]
By computer,
Among a plurality of types of imaging devices with different angles of view installed at a common installation location for imaging the detection area, a telephoto image is acquired which is an image captured by a telephoto type imaging device which captures a distant portion of the detection area that is far from the installation location, and a wide-angle image is acquired which is an image captured by a wide-angle type imaging device among the plurality of types of imaging devices,
Detecting a target flying object in a distant part of the detection area from a telephoto image, and detecting a target flying object in a part of the detection area other than the distant part from a wide-angle image;
generating an image reflecting the detection results by superimposing information representing the position of the detected flying object in the image on the wide-angle image;
A flying object detection method that outputs an image reflecting the detection results.
[Supplementary Note 10]
a process of acquiring a telephoto image, which is an image captured by a telephoto type imaging device that captures a distant portion of the detection area that is far from the installation location, among multiple types of imaging devices with different angles of view that are installed at a common installation location to capture the detection area, and a wide-angle image, which is an image captured by a wide-angle type imaging device among the multiple types of imaging devices;
A process of detecting a target flying object in a distant part of the detection area from a telephoto image and detecting a target flying object in a part of the detection area other than the distant part from a wide-angle image;
A process of generating an image reflecting the detection result by superimposing information representing the position of the detected flying object in the image on the wide-angle image;
and a program storage medium for storing a computer program that causes a computer to execute a process of outputting an image that reflects the detection result.

 Note that some or all of the configurations described in Supplementary Notes 2 to 7, which depend on Supplementary Note 1, may also depend on each of Supplementary Notes 8 to 10 in the same dependency relationships as on Supplementary Note 1. Furthermore, not limited to Supplementary Notes 1 and 8 to 10, some or all of the configurations described as supplementary notes may likewise be applied to various hardware, software, various recording means for recording software, or systems, as long as they do not deviate from the embodiments described above.

 The present disclosure has been described above with reference to the embodiments, but the present disclosure is not limited to the embodiments described above. Various modifications that can be understood by those skilled in the art can be made to the configuration and details of the present disclosure within its scope. Furthermore, each embodiment can be combined with other embodiments as appropriate.

 This application claims priority based on Japanese Patent Application No. 2024-015839, filed February 5, 2024, the disclosure of which is incorporated herein in its entirety.

REFERENCE SIGNS LIST
 1 Flying object detection system
 2 Wide-angle camera
 3 Telephoto camera
 5 Detection device
 10 Flying object detection device
 11, 51 Acquisition unit
 12, 52 Detection unit
 13, 53 Generation unit
 14, 54 Output unit

Claims (20)

an acquisition means for acquiring a telephoto image, which is an image captured by a telephoto type imaging device that captures a distant portion of the detection area that is far from the installation location, among multiple types of imaging devices with different angles of view that are installed at a common installation location for capturing images of the detection area, and a wide-angle image, which is an image captured by a wide-angle type imaging device among the multiple types of imaging devices;
a detection means for detecting a target flying object in a distant part of the detection area from a telephoto image and detecting a target flying object in a part of the detection area other than the distant part from a wide-angle image;
a generating means for generating an image reflecting the detection result by superimposing information representing the position of the detected flying object in the image on the wide-angle image;
A flying object detection device having an output means for outputting the image reflecting the detection result.
The distant part of the detection area is photographed by a plurality of telephoto type photographing devices whose fields of view are shifted from each other,
the acquisition means acquires telephoto images taken by each of the plurality of telephoto type imaging devices,
2. The flying object detection device according to claim 1, wherein the detection means detects the target flying object in the distant part of the detection area using telephoto images taken by each of the plurality of telephoto type imaging devices.
3. The flying object detection device according to claim 1 or claim 2, wherein the detection means detects a plurality of types of flying objects to be detected, distinguishing between the types.
The flying object detection device according to any one of claims 1 to 3, wherein the generating means further generates an information-attached telephoto image by superimposing, on the telephoto image, information representing the detection result detected from the wide-angle image by the detection means.
The flying object detection device according to any one of claims 1 to 4, wherein the output means performs display control to enlarge and display an image of the detected flying object on a display device in response to a request to enlarge and display the image of the flying object.
The flying object detection device according to any one of claims 1 to 5, wherein the output means outputs information indicating that the flying object has been detected to a predetermined notification destination other than the display device.
The flying object detection device according to any one of claims 1 to 6, wherein the telephoto and wide-angle images acquired by the acquisition means are captured by the telephoto type imaging device and the wide-angle type imaging device, whose camera settings are made such that, when flying objects to be detected of the same size, located respectively in the distant part of the detection area and in the part of the detection area other than the distant part, appear in the telephoto image and the wide-angle image, the difference in size between the flying objects appearing in the two images falls within a predetermined range, and
wherein the detection means executes flying object detection processing on each of the telephoto image and the wide-angle image using the same detection model, generated by learning captured images in which a flying object to be detected appears.
a telephoto type imaging device that captures an image of a distant part of the detection area that is far from the installation location, among a plurality of types of imaging devices with different angles of view that are installed at a common installation location for capturing an image of the detection area;
a wide-angle type imaging device among the plurality of types of imaging devices that captures images of the detection area including a portion that is not captured by the telephoto type imaging device; and
8. A flying object detection system comprising the flying object detection device according to any one of claims 1 to 7, which uses the images captured by the telephoto type imaging device and the wide-angle type imaging device, respectively.
By a computer,
Among a plurality of types of imaging devices with different angles of view installed at a common installation location for imaging the detection area, a telephoto image is acquired which is an image captured by a telephoto type imaging device which captures a distant portion of the detection area that is far from the installation location, and a wide-angle image is acquired which is an image captured by a wide-angle type imaging device among the plurality of types of imaging devices,
Detecting a target flying object in a distant part of the detection area from a telephoto image, and detecting a target flying object in a part of the detection area other than the distant part from a wide-angle image;
generating an image reflecting the detection results by superimposing information representing the position of the detected flying object in the image on the wide-angle image;
A flying object detection method that outputs an image reflecting the detection results.
a process of acquiring a telephoto image, which is an image captured by a telephoto type imaging device that captures a distant portion of the detection area that is far from the installation location, among multiple types of imaging devices with different angles of view that are installed at a common installation location to capture the detection area, and a wide-angle image, which is an image captured by a wide-angle type imaging device among the multiple types of imaging devices;
A process of detecting a target flying object in a distant part of the detection area from a telephoto image, and detecting a target flying object in a part of the detection area other than the distant part from a wide-angle image;
A process of generating an image reflecting the detection result by superimposing information representing the position of the detected flying object in the image on the wide-angle image;
and a program storage medium for storing a computer program that causes a computer to execute a process of outputting an image that reflects the detection result.
The distant part of the detection area is photographed by a plurality of telephoto type photographing devices whose fields of view are shifted from each other,
further acquiring telephoto images by each of the plurality of telephoto type imaging devices by the computer;
The flying object detection method according to claim 9, wherein the computer detects the target flying object using telephoto images captured by each of the plurality of telephoto type imaging devices in the distant part of the detection area.
12. The flying object detection method according to claim 9 or 11, wherein when detecting a target flying object, a plurality of types of target flying objects are detected by distinguishing between each of them.
13. The flying object detection method according to claim 9, claim 11 or claim 12, wherein the computer further generates an information-added telephoto image by superimposing information representing the detection results detected from the wide-angle image onto the telephoto image.
The flying object detection method according to claim 9 or any one of claims 11 to 13, wherein the computer performs display control to enlarge and display an image of the detected flying object on a display device in response to a request to enlarge and display the image of the flying object.
The flying object detection method according to claim 9 or any one of claims 11 to 14, wherein the computer further outputs information indicating that the flying object has been detected to a predetermined notification destination other than a display device.
The distant part of the detection area is photographed by a plurality of telephoto type photographing devices whose fields of view are shifted from each other,
A process of acquiring telephoto images by each of the plurality of telephoto type imaging devices;
The program storage medium of claim 10 further stores a computer program that causes a computer to execute a process of detecting the target flying object using telephoto images from each of the plurality of telephoto type imaging devices in the distant portion of the detection area.
17. The program storage medium according to claim 10 or claim 16, wherein the process of detecting the target flying object comprises distinguishing and detecting each of a plurality of types of target flying objects.
18. The program storage medium according to claim 10, claim 16, or claim 17, further storing a computer program that causes a computer to execute a process of generating an information-added telephoto image by superimposing information representing the detection result detected from the wide-angle image onto the telephoto image in the process of generating the detection result reflection image.
19. The program storage medium according to claim 10 or any one of claims 16 to 18, further storing a computer program that causes a computer to execute a process of performing display control to enlarge and display an image of a detected flying object on a display device in response to a request to enlarge and display the image of the flying object.
20. The program storage medium according to claim 10 or any one of claims 16 to 19, further storing a computer program that causes a computer to execute a process of outputting information indicating that a flying object has been detected to a predetermined notification destination other than a display device.
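Claims 15 and 20 describe outputting information that a flying object has been detected to a predetermined notification destination other than a display. As an illustrative sketch only (not part of the patent), such a fan-out could be modelled as a small registry of destination callbacks; all names are hypothetical:

```python
class DetectionNotifier:
    """Fan a detection event out to predetermined notification
    destinations other than the display (e.g. a logger or an alarm)."""

    def __init__(self):
        self._destinations = []

    def register(self, callback):
        # callback: a callable taking one dict describing the detection
        self._destinations.append(callback)

    def notify(self, event):
        # deliver the same event to every registered destination
        for destination in self._destinations:
            destination(event)
```

A concrete system would register real destinations (an alert log, a network endpoint, an audible alarm) in place of the plain callables used here.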
PCT/JP2025/001506 2024-02-05 2025-01-20 Flying body detecting device, flying body detecting system, flying body detecting method, and program storage medium Pending WO2025169696A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2024015839 2024-02-05
JP2024-015839 2024-02-05

Publications (1)

Publication Number Publication Date
WO2025169696A1 2025-08-14

Family

ID=96699844

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2025/001506 Pending WO2025169696A1 (en) 2024-02-05 2025-01-20 Flying body detecting device, flying body detecting system, flying body detecting method, and program storage medium

Country Status (1)

Country Link
WO (1) WO2025169696A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010141429A (en) * 2008-12-09 2010-06-24 Panasonic Electric Works Co Ltd Monitoring system
JP2013013050A (en) * 2011-05-27 2013-01-17 Ricoh Co Ltd Imaging apparatus and display method using imaging apparatus
US20160055400A1 (en) * 2014-08-21 2016-02-25 Boulder Imaging, Inc. Avian detection systems and methods
JP2017135548A (en) * 2016-01-27 2017-08-03 セコム株式会社 Flying object monitoring system
US20200144186A1 (en) * 2017-09-13 2020-05-07 Intel Corporation Active silicon bridge
CN114097764A (en) * 2021-12-22 2022-03-01 赵世高 Intelligent bird repelling system
JP2024003475A (en) * 2022-06-27 2024-01-15 オプテックス株式会社 Security systems and programs for security systems
JP2025010978A (en) * 2023-07-10 2025-01-23 キヤノン株式会社 Imaging device and control method

Similar Documents

Publication Publication Date Title
US11356599B2 (en) Human-automation collaborative tracker of fused object
Lai et al. Characterization of Sky‐region Morphological‐temporal Airborne Collision Detection
US11373409B2 (en) Photography system
WO2008105935A2 (en) Video surveillance system providing tracking of a moving object in a geospatial model and related methods
JP7293174B2 (en) Road Surrounding Object Monitoring Device, Road Surrounding Object Monitoring Program
CN107329478A (en) A kind of life detection car, wearable device and virtual reality detection system
CN112802100A (en) Intrusion detection method, device, equipment and computer readable storage medium
CN116558364B (en) An Interference Interception System and Method for Unknown Flying Vehicles
CN109708659A (en) A kind of distributed intelligence photoelectricity low latitude guard system
JP2022167343A (en) Earth surface condition grasping method, earth surface condition grasping device, and earth surface condition grasping program
US11257386B1 (en) Camera-based angle tracking of swarms for collision avoidance
US20240096099A1 (en) Intrusion determination device, intrusion detection system, intrusion determination method, and program storage medium
WO2025169696A1 (en) Flying body detecting device, flying body detecting system, flying body detecting method, and program storage medium
CN111343431B (en) Airport target detection system based on image rectification
JP2023129877A (en) Information processing device, information processing method, program, presentation control device and system
Rasheed et al. Development of Drone-Based Human Rescue Strategies Using Virtual Reality
WO2025169697A1 (en) Flying body detection device, flying body detection system, flying body detection method, and program storage medium
KR102569406B1 (en) System and server for monitoring green and red tide using multi-spectral camera, and method thereof
Briese et al. Deep learning with semi-synthetic training images for detection of non-cooperative UAVs
Gur et al. Image processing based approach for crime scene investigation using drone
US12469298B2 (en) Information processing apparatus and information processing method for generating crowd-avoidance path plans
Khan et al. Rotorcraft flight information inference from cockpit videos using deep learning
WO2025197288A1 (en) Flying object detection device, flying object detection system, flying object detection method, and program storage medium
CN120747800B (en) A method for inspecting airport navigation lights based on UAV technology
CN118552885B (en) Cableway intrusion event analysis and early warning method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 25752118

Country of ref document: EP

Kind code of ref document: A1