
WO2024195139A1 - Image analysis system, image analysis method, and recording medium - Google Patents


Info

Publication number
WO2024195139A1
Authority
WO
WIPO (PCT)
Prior art keywords
analyzed
detection
image
environmental data
accuracy
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2023/011668
Other languages
French (fr)
Japanese (ja)
Inventor
知子 丸山
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Corp
Original Assignee
NEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corp filed Critical NEC Corp
Priority to JP2025508100A priority Critical patent/JPWO2024195139A5/en
Priority to PCT/JP2023/011668 priority patent/WO2024195139A1/en
Publication of WO2024195139A1 publication Critical patent/WO2024195139A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/89 Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • G01S13/90 Radar or analogous systems specially adapted for specific applications for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR] techniques
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis

Definitions

  • This disclosure relates to image analysis systems, etc.
  • Satellite images taken by artificial satellites are widely used to analyze objects on the Earth's surface.
  • SAR is an abbreviation for Synthetic Aperture Radar.
  • satellite images taken by SAR are suitable for analyzing objects on the Earth's surface.
  • satellite images taken by SAR may be used to identify objects that exist in the area being analyzed.
  • images taken by SAR for mapping are expressed in monochrome gradations. In images expressed in monochrome gradations, for example, the contrast ratio between the sea or land and the object being analyzed may be insufficient, making it difficult to distinguish the object in the satellite image. For this reason, for example, analysis of objects in satellite images may be performed using data other than satellite images.
  • the data analysis device of Patent Document 1 acquires satellite images and observation data of the earth's surface environment.
  • the data analysis device of Patent Document 1 generates analysis data by correcting the analysis data based on the satellite images, based on the observation data of the earth's surface environment.
  • however, even with the technology described in Patent Document 1, it can be difficult to identify objects that exist in the area being analyzed.
  • the present disclosure aims to provide an image analysis system and the like that make it easier to identify objects present in the area being analyzed, in order to solve the above problem.
  • the image analysis system disclosed herein includes an image acquisition means for acquiring satellite images of the area to be analyzed, an environmental data acquisition means for acquiring environmental data of the area to be analyzed, a detection means for detecting the object to be analyzed that appears in the satellite images, an estimation means for estimating the accuracy of detection based on the environmental data, and an output means for outputting the detection results and information indicating the accuracy of detection.
  • the image analysis method disclosed herein acquires satellite images of the area to be analyzed, acquires environmental data of the area to be analyzed, detects the object to be analyzed that appears in the satellite images, estimates the accuracy of detection based on the environmental data, and outputs the detection results and information indicating the accuracy of detection.
  • the recording medium of the present disclosure non-temporarily records an image analysis program that causes a computer to execute the following processes: acquiring satellite images of an area to be analyzed; acquiring environmental data of the area to be analyzed; detecting an object to be analyzed that appears in the satellite images; estimating the accuracy of detection based on the environmental data; and outputting the detection results and information indicating the accuracy of detection.
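The means recited above map onto a simple pipeline: acquire the satellite image, acquire environmental data, detect the object, estimate the accuracy of detection from the environmental data, and output both. The following Python sketch illustrates only that flow; the `Detection` type, function names, and the depth/wave scoring rules are hypothetical assumptions, not taken from the disclosure.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    bbox: tuple            # (x, y, width, height) in image pixels
    label: str             # e.g. "ship"
    accuracy: float = 0.0  # filled in by the estimation step

def detect_objects(image) -> List[Detection]:
    """Stand-in for the image recognition model (detection means)."""
    # A real system would run a learned model here; we return a fixed result.
    return [Detection(bbox=(120, 80, 30, 12), label="ship")]

def estimate_accuracy(det: Detection, env: dict) -> float:
    """Estimation means: score a detection against environmental data.

    The numeric penalties are illustrative assumptions.
    """
    score = 1.0
    if env.get("water_depth_m", 100.0) < 5.0:
        score -= 0.5  # shallow water makes a ship less plausible
    if env.get("wave_height_m", 0.0) > 4.0:
        score -= 0.3  # high waves make navigation unlikely
    return max(score, 0.0)

def analyze(image, env: dict) -> List[Detection]:
    """Full pipeline: detect, then attach accuracy; caller outputs both."""
    detections = detect_objects(image)
    for det in detections:
        det.accuracy = estimate_accuracy(det, env)
    return detections

results = analyze(image=None, env={"water_depth_m": 3.0, "wave_height_m": 1.0})
print(results[0].accuracy)  # 0.5 under the toy scoring rules above
```

The point of the sketch is the separation of concerns: detection runs on the image alone, while the accuracy estimate depends only on the environmental data for the imaged area.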
  • the present disclosure makes it easier to identify objects present in the area being analyzed.
  • FIG. 1 is a diagram illustrating an example of a configuration of an analysis system according to an embodiment of the present disclosure.
  • FIG. 2 is a diagram illustrating a form in which the earth's surface is imaged by an artificial satellite.
  • FIG. 3 is a diagram illustrating an example of a satellite image according to an embodiment of the present disclosure.
  • FIG. 4 is a diagram illustrating an example of a satellite image according to an embodiment of the present disclosure.
  • FIG. 5 is a diagram illustrating an example of a display screen of an analysis result according to an embodiment of the present disclosure.
  • FIG. 6 is a diagram illustrating an example of a configuration of an image analysis system according to an embodiment of the present disclosure.
  • FIG. 7 is a diagram illustrating an example of an operation flow of the image analysis system according to an embodiment of the present disclosure.
  • FIG. 8 is a diagram illustrating an example of a hardware configuration of an image analysis system according to an embodiment of the present disclosure.
  • FIG. 1 is a diagram showing an example of the configuration of an analysis system.
  • the analysis system includes, for example, an image analysis system 10, a terminal device 20, a satellite image management server 30, and an environmental data management server 40.
  • the image analysis system 10 is connected to the terminal device 20, for example, via a network.
  • the image analysis system 10 is connected to the satellite image management server 30, for example, via a network.
  • the image analysis system 10 is connected to the environmental data management server 40, for example, via a network.
  • the image analysis system 10 is, for example, a system that analyzes satellite images.
  • a satellite image is, for example, an image of the earth's surface captured by an imaging device mounted on an artificial satellite.
  • a satellite image is, for example, captured by a synthetic aperture radar (SAR).
  • a satellite image may be captured by an imaging device other than a SAR.
  • a satellite image is, for example, used to detect an object captured in the satellite image.
  • An object captured in the satellite image is detected, for example, by image recognition.
  • an image captured by a SAR is a monochrome image based on electromagnetic waves reflected by the earth's surface, so the accuracy of object detection by image recognition may not be sufficient.
  • an object other than a ship may be detected as a ship by image recognition.
  • when analysis is performed using an image captured by a SAR, for example, after an object is detected by image recognition, the person in charge of analysis may identify the object captured in the image.
  • the analyst, for example, visually checks the images captured by the SAR to identify the objects that appear in them.
  • FIG. 2 is a diagram showing a schematic example of an image of the earth's surface captured by a satellite.
  • a ship is sailing on the sea.
  • the area in which the ship is sailing is captured by an imaging device mounted on the satellite.
  • the image of the earth's surface is captured by, for example, a SAR.
  • the satellite image captured by the imaging device mounted on the satellite is output to a satellite image management server 30 via, for example, a ground station.
  • environmental data is observed by an observation sensor installed on the sea.
  • the observation sensor installed on the sea is, for example, an observation buoy that observes water temperature and waves.
  • the waves are, for example, wave height, wave direction, and wave period.
  • the observation sensor is not limited to an observation buoy.
  • the items of environmental data observed by the observation sensor are not limited to those mentioned above.
  • the observed environmental data is output to, for example, an environmental data management server 40.
  • the image analysis system 10 acquires satellite images captured by an artificial satellite from, for example, the satellite image management server 30.
  • the image analysis system 10 also acquires environmental data observed by an observation sensor from, for example, the environmental data management server 40.
  • the image analysis system 10 detects an object to be analyzed that appears in the acquired satellite image, for example, using an image recognition model.
  • the image analysis system 10 estimates the accuracy of detection of the object to be analyzed in the detection result by the image recognition model, for example, based on the acquired environmental data of the area to be analyzed.
  • the accuracy of detection of the object to be analyzed is, for example, an index indicating the accuracy of the detection result of the object.
  • the accuracy of detection of the object to be analyzed is an index indicating the accuracy of whether the object detected by the image recognition model is the object to be analyzed.
  • for example, when the object to be analyzed is a ship, the accuracy of detection is an index indicating the likelihood that the detected object is a ship.
  • the image recognition model is a learning model that detects objects captured in satellite images. The image recognition model will be explained later.
  • the environmental data is, for example, data related to the environment that may affect the presence or absence of an object.
  • the environmental data is, for example, observational data obtained by observing the environment that may affect the presence or absence of an object.
  • the environmental data may be data related to topography.
  • the environmental data may be data on water depth.
  • the water depth data may be data obtained by observing changes in water depth due to tides.
  • the navigation or anchoring of the ship may be affected by water depth, tides, waves, wind speed, and topography.
  • a large ship cannot navigate in shallow water.
  • a typical small ship avoids areas with fast tidal currents and high waves, because navigating in such areas is dangerous.
  • Objects other than the ship are, for example, marine life.
  • the presence or absence of marine life may be affected, for example, by the environment.
  • the presence or absence of marine life may be affected, for example, by water depth, water temperature, air temperature, tides, waves, wind speed, and topography.
  • Environmental items that may affect the presence of marine life are not limited to the above.
  • the image analysis system 10 estimates the accuracy of detection by image recognition based on environmental data regarding the presence of marine life.
  • for example, when the environmental data indicates that a ship is likely to be present, the image analysis system 10 estimates that the accuracy of the ship detection in the detection results by the image recognition model is high. Conversely, when it is estimated from the environmental data that the detected object is highly likely to be marine life, the image analysis system 10 estimates that the accuracy of the ship detection is low.
  • the image analysis system 10 outputs, for example, to the terminal device 20, the detection result of the object to be analyzed that appears in the satellite image and the detection accuracy.
  • the terminal device 20 outputs, for example, the detection result of the object to be analyzed that appears in the satellite image and the detection accuracy to a display device not shown.
  • the analyst, for example, refers to the detection result of the object to be analyzed that appears in the satellite image displayed on the display device, and to the detection accuracy, in order to identify the object. For example, when a ship to be analyzed is detected but the estimation result of the detection accuracy based on the environmental data indicates that it is highly likely not to be a ship, the analyst refers to that estimation result to determine whether it is a ship.
  • for example, the analyst may refer to a satellite image in which an object was identified as something other than a ship in a past analysis, to determine whether the object detected as a ship is in fact a ship.
  • the analyst identifies the object by taking into account the possibility of the presence of a suspicious ship. This is because if a ship is present in an area in the environmental data where the presence of ships is unlikely, it may be navigating or anchoring with an unusual intention.
  • An unusual intention means, for example, a purpose that is different from the usual purpose in the sea area in which the ship is navigating or anchoring.
  • the task of identifying the object shown in the satellite image can be made easier.
  • Figure 3 shows an example of a satellite image of the earth's surface captured by SAR.
  • the example of the satellite image in Figure 3 is an example of a satellite image of the area around a bay.
  • the example of the satellite image in Figure 3 is an example of a satellite image in which there are no ships or other objects in the captured area.
  • the areas with low brightness are land areas.
  • the areas with high brightness are sea areas.
  • FIG. 4 is an example of a satellite image captured by SAR of the same area as the example satellite image of FIG. 3, where a ship and objects other than ships are present in the same area.
  • a ship and other objects are captured in an elliptical shape in the sea area.
  • objects are also captured in an elliptical and circular shape in the land area.
  • the image analysis system 10 detects, for example, an object to be analyzed that is captured in the satellite image of the example of FIG. 4. If the object to be analyzed is a ship, the image analysis system 10 detects the ship captured in the satellite image of the example of FIG. 4, for example, using an image recognition model. Then, when a ship is detected from the satellite image of the example of FIG. 4, the image analysis system 10 estimates the likelihood that the detected object is a ship based on environmental data for the area captured in the satellite image of the example of FIG. 4.
  • the example of the display screen of FIG. 5 is an example of a display screen showing the detection result of the object to be analyzed and the estimation result of the detection accuracy.
  • the example of the display screen of FIG. 5 is an example of a display screen showing the detection result of the object to be analyzed and the estimation result of the detection accuracy when the object to be analyzed captured in the satellite image of the example of FIG. 4 is detected.
  • the area where the object to be analyzed is detected is surrounded by a solid or dashed rectangle.
  • the area surrounded by the solid rectangle is, for example, an area where the detection accuracy is equal to or higher than the standard.
  • the area surrounded by the dashed rectangle is, for example, an area where the detection accuracy is lower than the standard.
  • the person in charge of analysis can grasp, for example, an area where the detection accuracy by the image recognition model is low and a more detailed analysis is required.
  • the person in charge of analysis can identify the object captured in the satellite image by performing a detailed analysis of the area surrounded by the dashed line, taking into account that the estimation result of the accuracy based on the environmental data shows that the object is unlikely to be a ship.
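The solid/dashed convention described for FIG. 5 amounts to thresholding the estimated accuracy against a standard. A minimal sketch follows; the numeric threshold is a hypothetical assumption, since the disclosure only speaks of a "standard":

```python
# Detections at or above the accuracy standard get a solid rectangle;
# detections below it get a dashed rectangle (cf. FIG. 5).
ACCURACY_STANDARD = 0.7  # hypothetical threshold value

def box_style(accuracy: float) -> str:
    """Choose the rectangle style for one detection."""
    return "solid" if accuracy >= ACCURACY_STANDARD else "dashed"

detections = [
    {"bbox": (120, 80, 30, 12), "accuracy": 0.9},   # likely a ship
    {"bbox": (200, 150, 25, 10), "accuracy": 0.4},  # needs detailed analysis
]
styles = [box_style(d["accuracy"]) for d in detections]
print(styles)  # ['solid', 'dashed']
```

The dashed boxes mark exactly the areas where, per the description above, the analyst should perform a more detailed analysis.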
  • Figure 6 is a diagram showing an example of the configuration of the image analysis system 10.
  • the image analysis system 10 comprises a satellite image acquisition unit 12, an environmental data acquisition unit 13, a detection unit 14, an estimation unit 15, and an output unit 16.
  • the image analysis system 10 also comprises, for example, an acquisition unit 11 and a storage unit 17.
  • the acquisition unit 11 acquires information about the analysis target.
  • the acquisition unit 11 acquires information about the analysis target, for example, from the terminal device 20.
  • the information about the analysis target is, for example, information indicating at least one of the region of the analysis target and the object of the analysis target.
  • the area to be analyzed is, for example, information indicating the range of the area to be analyzed.
  • the area to be analyzed may be set in advance. For example, if the area to be analyzed by the image analysis system 10 is only a specific area, the acquisition unit 11 does not need to acquire information indicating the area to be analyzed.
  • the information on the object to be analyzed is, for example, information indicating what is to be detected.
  • the object to be analyzed is, for example, a ship.
  • the ship to be analyzed is, for example, a ship that is sailing or anchored.
  • the ship to be analyzed may be both a ship that is sailing and a ship that is anchored.
  • the ship to be analyzed may also be a ship that is located on land.
  • the object to be analyzed may also be an aircraft, vehicle, structure, or stored item on land.
  • the object to be analyzed is not limited to the above.
  • the object to be analyzed may also be set in advance. For example, when the image analysis system 10 is used only to analyze the presence or absence of a ship, the acquisition unit 11 does not need to newly acquire information indicating that the presence of a ship is to be detected.
  • the information indicating the object to be analyzed may be the type of the object to be analyzed.
  • the type of object to be analyzed is, for example, the type of ship.
  • the type of ship may be, for example, a large ship, a medium-sized ship, or a small ship.
  • the type of ship may be, for example, a tanker, a passenger ship, a cargo ship, a ferry, a work boat, or a fishing boat.
  • the type of ship may be, for example, a ship capable of carrying an aircraft squadron, a conventional ship, or a submarine.
  • the types of ships are not limited to the above.
  • the types of objects are not limited to the above.
  • the information on the subject of analysis may include information indicating the timing to be analyzed.
  • the information indicating the timing to be analyzed is, for example, information indicating the timing when the satellite image to be analyzed was captured and the timing when the environmental data was observed.
  • the information indicating the timing to be analyzed is set by date, time, day, or period.
  • the period to be analyzed is set, for example, using the first day of the period and the last day of the period. How the period to be analyzed is set is not limited to the above.
  • the information on the subject of analysis may also be information indicating which area within the region to be analyzed is to be analyzed.
  • the information on the subject of analysis is not limited to the above.
  • the information on the subject of analysis is set, for example, by the person in charge of analysis.
  • the satellite image acquisition unit 12 acquires satellite images of the area to be analyzed.
  • the satellite image acquisition unit 12 acquires satellite images to which information on the location where the image was taken and the date and time of image capture is added, for example.
  • the information on the location where the image was taken is, for example, the latitude and longitude of the location at the center of the image.
  • the information on the location where the image was taken is not limited to the above, and may be any information for identifying the location where the image was taken.
  • the satellite image acquisition unit 12 acquires satellite images of the area to be analyzed, for example, via the satellite image management server 30.
  • the satellite image acquisition unit 12 may also acquire satellite images of the area to be analyzed via a storage medium.
  • the satellite image acquisition unit 12 may acquire a satellite image of the region indicated by the information indicating the region to be analyzed.
  • the satellite image acquisition unit 12 may acquire a satellite image captured at the timing indicated by the information indicating the timing to be analyzed.
  • the satellite image is, for example, an image captured by a SAR mounted on an artificial satellite.
  • the satellite image is not limited to an image captured by a SAR.
  • the satellite image acquisition unit 12 may acquire satellite images captured by a plurality of imaging methods.
  • the satellite image acquisition unit 12 acquires, for example, a satellite image captured by a SAR and an optical image in the visible light region.
  • the satellite image acquisition unit 12 may acquire a satellite image in the infrared region.
  • the satellite images captured by a plurality of imaging methods are not limited to satellite images captured over the same range, so long as each of the images includes the area to be analyzed.
  • the environmental data acquisition unit 13 acquires environmental data for the area to be analyzed.
  • the environmental data acquisition unit 13 acquires environmental data to which, for example, the observation point and the observation date and time are added.
  • the information on the observation point is, for example, the latitude and longitude of the point where the environmental data was observed.
  • the information on the observation point is not limited to the above.
  • the environmental data may also be time series data of observed values of the environment.
  • the environmental data acquisition unit 13 acquires environmental data for the area to be analyzed, for example, from the environmental data management server 40.
  • the environmental data acquisition unit 13 may acquire environmental data for the area to be analyzed from multiple environmental data management servers 40.
  • the environmental data acquisition unit 13 may acquire environmental data from an environmental data providing server operated by the Japan Meteorological Agency or another government agency.
  • the environmental data is, for example, data on at least one of the following items: water depth, ocean weather, and topography.
  • Ocean weather data is data on at least one of the following items: water temperature, air temperature, waves, and wind speed. Ocean weather is not limited to the above.
  • the environmental data acquisition unit 13 may acquire environmental data for items that are set based on the object to be analyzed. For example, when the object to be analyzed is a ship, the environmental data acquisition unit 13 acquires data on water depth, waves, and topography, which are necessary for estimating the presence or absence of a ship, as environmental data.
  • the items for estimating the presence or absence of a ship are set, for example, by the person in charge of analysis or the operator of the image analysis system 10.
  • the environmental data acquisition unit 13 may acquire environmental data for items that are set based on at least one of the region and timing of the analysis target. For example, if the analysis target is a region where marine life exists in winter, when analyzing satellite images captured in winter, the environmental data acquisition unit 13 acquires environmental data for items necessary to estimate the presence or absence of marine life, and which was observed in winter.
  • the environmental data acquisition unit 13 may acquire environmental data observed in the region indicated by the information indicating the region to be analyzed.
  • the environmental data acquisition unit 13 may acquire environmental data observed at the timing indicated by the information indicating the timing of the analysis target.
  • the environmental data acquisition unit 13 may acquire environmental data whose observation period corresponds to the time when the satellite image was captured.
  • here, the capture time of the satellite image and the observation period of the environmental data "correspond" when the capture time falls within a period over which the environmental data is assumed to fluctuate only within an expected range. For example, if the tidal current is roughly constant during the winter season, tidal current data observed at any time during that winter corresponds to a satellite image captured in the same winter.
  • for example, the environmental data acquisition unit 13 may acquire the average value of the environmental data over a predetermined period that includes the time when the satellite image was captured.
  • the predetermined period is set, for example, by the person in charge of analysis.
  • for example, the environmental data acquisition unit 13 acquires wave height data observed at the same date and time as the date and time when the satellite image was captured.
  • "the same" here includes, for example, dates and times that differ only within a range that can be treated as identical in the processing by the estimation unit 15.
  • when the environmental data is data that does not normally change, such as topography, the environmental data acquisition unit 13 may acquire it at any time during which no change has occurred. The timing at which observed environmental data is acquired is not limited to the above.
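The averaging over a predetermined period that includes the capture time, described above, can be sketched as follows. The six-hour window is an illustrative assumption; the disclosure leaves the period to be set by the person in charge of analysis.

```python
from datetime import datetime, timedelta

def window_average(observations, capture_time, window_hours=6):
    """Average timestamped observations within a window centered on the
    satellite image capture time (window size is an assumption)."""
    half = timedelta(hours=window_hours / 2)
    in_window = [value for t, value in observations
                 if capture_time - half <= t <= capture_time + half]
    return sum(in_window) / len(in_window) if in_window else None

capture = datetime(2023, 3, 1, 12, 0)
wave_height = [
    (datetime(2023, 3, 1, 10, 0), 1.2),
    (datetime(2023, 3, 1, 12, 0), 1.6),
    (datetime(2023, 3, 1, 14, 0), 1.4),
    (datetime(2023, 3, 2, 12, 0), 3.0),  # outside the window, ignored
]
print(window_average(wave_height, capture))  # ≈ 1.4 (mean of the 3 in-window values)
```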
  • the environmental data acquisition unit 13 may acquire information that identifies ships that are present within the area being analyzed.
  • the environmental data acquisition unit 13 may acquire identification signals of ships by AIS (Automatic Identification System) in the area being analyzed.
  • the environmental data acquisition unit 13 acquires identification signals of ships by AIS from a monitoring server that monitors the navigation of ships in the area being analyzed.
  • the information that identifies ships that are present within the area being analyzed is not limited to identification signals of ships by AIS.
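Cross-checking detections against AIS identification signals, as described above, could look like the following sketch. The coordinate format and the proximity radius are illustrative assumptions; a detection with no nearby AIS report is a candidate for the "suspicious ship" consideration mentioned earlier.

```python
def match_ais(detections, ais_reports, radius=0.01):
    """For each detected position (lat, lon), report whether any AIS
    report lies within `radius` degrees (a hypothetical proximity rule)."""
    matched = []
    for lat, lon in detections:
        ok = any(abs(lat - a_lat) <= radius and abs(lon - a_lon) <= radius
                 for a_lat, a_lon in ais_reports)
        matched.append(ok)
    return matched

detections = [(35.00, 139.50), (35.20, 139.70)]  # illustrative positions
ais_reports = [(35.001, 139.501)]                # one ship broadcasting AIS
print(match_ais(detections, ais_reports))  # [True, False]
```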
  • the detection unit 14 detects an object to be analyzed that appears in a satellite image.
  • the detection unit 14 detects an object to be analyzed that appears in a satellite image, for example, using an image recognition model.
  • the detection unit 14 detects an area in the satellite image in which the object to be analyzed appears, for example, using an image recognition model.
  • the detection unit 14 may further detect the type of object to be analyzed in the satellite image, using an image recognition model.
  • the image recognition model is, for example, a learning model that uses a satellite image as input and estimates objects that appear in the satellite image.
  • the detection unit 14 may use the image recognition model to detect objects other than the object being analyzed that appear in the satellite image. For example, when the object being analyzed is a ship, if marine life is present in the area being analyzed, the detection unit 14 may use the image recognition model to detect marine life that appears in the satellite image.
  • the image recognition model is generated, for example, by learning the relationship between a satellite image and an object shown in the satellite image.
  • the image recognition model is generated, for example, by learning the relationship between a satellite image and an area in which an object to be analyzed appears in the satellite image.
  • the image recognition model may be generated by learning the relationship between a satellite image and the name of an object shown in the satellite image.
  • the image recognition model may be generated by learning the relationship between a satellite image and an area in which an object appears and the name of the object shown.
  • the image recognition model is generated, for example, by deep learning using a neural network.
  • the learning data and learning algorithms used to generate the image recognition model are not limited to those described above.
  • the image recognition model is generated, for example, in a system external to the image analysis system 10.
  • the image recognition model may be generated in a generation unit (not shown) included in the image analysis system 10.
  • the detection unit 14 may detect the object to be analyzed based on the brightness change of the satellite image. For example, when a satellite image shows only an area of ocean and there is no change in the topography, the detection unit 14 detects the contour of the object shown in the satellite image based on the brightness change between pixels of the satellite image. The detection unit 14 then identifies whether the object shown in the satellite image is the object to be analyzed based on at least one of the size of the area surrounded by the contour and the shape of the contour. If the detection unit 14 identifies the object shown in the image as the object to be analyzed, it detects, for example, that the object to be analyzed is present in the area surrounded by the contour. How the object to be analyzed is detected is not limited to the above.
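The brightness-change detection just described (find bright pixels against a uniform sea background, group them into a contoured region, then accept or reject by size) can be sketched on a synthetic all-sea image. The brightness threshold and area bounds are illustrative assumptions.

```python
import numpy as np

def detect_by_brightness(img, threshold=0.5, min_area=3, max_area=50):
    """Group pixels brighter than the sea background into 4-connected
    blobs, keeping only blobs whose area suggests the analysis target."""
    mask = img > threshold
    seen = np.zeros_like(mask, dtype=bool)
    blobs = []
    for y, x in zip(*np.nonzero(mask)):
        if seen[y, x]:
            continue
        stack, pixels = [(y, x)], []  # flood-fill one blob
        seen[y, x] = True
        while stack:
            cy, cx = stack.pop()
            pixels.append((cy, cx))
            for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                if (0 <= ny < img.shape[0] and 0 <= nx < img.shape[1]
                        and mask[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    stack.append((ny, nx))
        if min_area <= len(pixels) <= max_area:
            blobs.append(pixels)  # candidate object to be analyzed
    return blobs

sea = np.zeros((10, 10))   # uniform dark sea
sea[4:6, 3:7] = 1.0        # one bright 2x4 echo, e.g. a ship
print(len(detect_by_brightness(sea)))  # 1
```

A real implementation would additionally test the contour shape, as the description notes; this sketch filters on area alone.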
  • the estimation unit 15 estimates the accuracy of detection based on the environmental data. For example, based on the environmental data, the estimation unit 15 estimates an index indicating the likelihood that the object detected by the detection unit 14 is the object to be analyzed as the accuracy of detection. For example, based on the environmental data, the estimation unit 15 estimates the accuracy of detection of the object to be analyzed in the detection result by the image recognition model. That is, based on the environmental data, the estimation unit 15 estimates the likelihood of the detection result by the image recognition model. For example, based on the environmental data, the estimation unit 15 estimates the presence or absence of the object to be analyzed. Then, the estimation unit 15 estimates the accuracy of detection based on the estimated presence or absence of the object to be analyzed.
  • the estimation unit 15 may estimate the accuracy of detection based on whether the object detected by the image recognition model is an object other than the analysis target. For example, the estimation unit 15 estimates a candidate for the object when the object detected by the image recognition model is an object other than the analysis target. Then, the estimation unit 15 estimates the accuracy of detection based on the possibility that the candidate object may exist. For example, when the environmental data is suitable for the existence of the candidate object, the possibility that the candidate object may exist increases. When the possibility of the existence of an object other than the analysis target is high, the likelihood that the object detected by the image recognition model is the object to be analyzed decreases.
  • when the object detected by the image recognition model may be something other than the ship, the estimation unit 15 estimates a candidate for that object.
  • the estimation unit 15 estimates, for example, marine life as a candidate for the object other than the ship.
  • the estimation unit 15 then estimates, for example, that the object detected as a ship by the image recognition model is unlikely to be a ship. In other words, the estimation unit 15 estimates that the accuracy of the detection result in which the image recognition model detected a ship is low.
  • the estimation unit 15 estimates the accuracy of detection, for example, based on the detection result of the image recognition model and the presence or absence of the object to be analyzed in the area to be analyzed estimated using the environmental data.
  • the presence or absence of the object to be analyzed in the area to be analyzed estimated using the environmental data is, for example, the possibility that the object to be analyzed may exist in the area to be analyzed estimated based on the environmental data.
  • the estimation unit 15 estimates the presence or absence of the object to be analyzed in the area to be analyzed, for example, based on a criterion that defines the relationship between the object to be analyzed and the environmental data.
  • the estimation unit 15 estimates the accuracy of the detection result based on the estimation result of the presence or absence of the object to be analyzed.
  • the criterion that defines the relationship between the object to be analyzed and the environmental data is a criterion for estimating the presence or absence of the object to be analyzed based on the environmental data.
  • the criterion that defines the relationship between the object to be analyzed and the environmental data is a criterion for determining in what kind of environment the object to be analyzed is likely to exist.
  • the criterion that defines the relationship between the object to be analyzed and the environmental data may also be a criterion for determining in what kind of environment the object to be analyzed is likely not to exist.
  • the estimation unit 15 estimates the accuracy of detection based on the result of the image recognition model detecting the ship and the possibility that the ship may exist in the area to be analyzed estimated using the environmental data. For example, when the image recognition model detects a ship, which is the object to be analyzed, and the topographical data included in the environmental data indicates topography that is not suitable for anchoring, the estimation unit 15 estimates that the detected object is unlikely to be a ship because the topography is not suitable for anchoring a ship.
  • for example, when the water depth data included in the environmental data indicates a water depth that is not suitable for navigation or anchoring, the estimation unit 15 estimates that the detected object is unlikely to be a ship. Also, for example, when the image recognition model detects multiple ships approaching each other, which are the objects to be analyzed, and the ocean current and wave data included in the environmental data indicate values that are not suitable for the approach of ships, the estimation unit 15 estimates that the detected objects are unlikely to be ships.
  • the criteria that define the relationship between the object to be analyzed and the environmental data are not limited to the above. The criteria that define the relationship between the object to be analyzed and the environmental data are set, for example, by the person in charge of analysis or the operator of the image analysis system 10.
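The criterion-based estimation described above can be sketched in code. The criterion names, thresholds, and function names (`CRITERIA`, `presence_plausible`, `detection_accuracy`) below are invented for illustration and are not taken from this application; in practice they would be set by the person in charge of analysis or the operator.

```python
# Each criterion maps an environmental-data item to a predicate that is True
# when the value is compatible with the presence of a ship (the analysis target).
CRITERIA = {
    "water_depth_m": lambda v: v >= 5.0,      # too shallow for navigation/anchoring
    "terrain":       lambda v: v != "reef",   # terrain unsuitable for anchoring
    "wave_height_m": lambda v: v <= 6.0,      # waves unsuitable for ships to approach
}

def presence_plausible(env: dict) -> bool:
    """Return True when every available item satisfies its criterion."""
    return all(check(env[item]) for item, check in CRITERIA.items() if item in env)

def detection_accuracy(detected: bool, env: dict) -> str:
    # A detection in an environment where the target is implausible gets low accuracy.
    if detected and not presence_plausible(env):
        return "low"
    return "high" if detected else "n/a"
```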
  • the estimation unit 15 estimates the accuracy of detection based on a score indicating the possibility of the presence of an object estimated using environmental data, for example.
  • the estimation unit 15 estimates a score indicating the possibility of the presence of an object to be analyzed, for example, based on environmental data. Then, the estimation unit 15 estimates the accuracy of detection based on the estimated score.
  • the score indicating the possibility of the presence of an object is, for example, an index indicating whether an environment is suitable for an object to exist.
  • a score indicating the possibility that a ship exists is an index indicating whether the environment is suitable for a ship to exist.
  • An environment in which a ship exists is, for example, an environment related to water depth, topography, and waves that is suitable for at least one of ship navigation and anchoring.
  • the estimation unit 15 calculates a score for each item of environmental data. Then, the estimation unit 15 estimates a score indicating the possibility of the presence of an object by, for example, adding up the scores for each item of environmental data.
  • the relationship between the value of environmental data and the score is set, for example, as a table for each item of environmental data.
  • the relationship between the value of environmental data and the score may be set using a function with the value of environmental data as an explanatory variable and the score as a target variable.
  • the relationship between the environmental data value and the score is set, for example, by the person in charge of analysis or the operator of the image analysis system 10.
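The per-item scoring described above can be sketched as follows. The item names, table values, and function names are illustrative placeholders, not values from this application; as noted, the actual tables would be set by the person in charge of analysis or the operator.

```python
# One table per environmental-data item: (upper_bound, score) pairs, scanned in order.
SCORE_TABLES = {
    "water_depth_m": [(5.0, 0.0), (20.0, 0.5), (float("inf"), 1.0)],
    "wave_height_m": [(2.0, 1.0), (4.0, 0.5), (float("inf"), 0.0)],
}

def item_score(item: str, value: float) -> float:
    """Look up the score for one environmental-data item."""
    for upper, score in SCORE_TABLES[item]:
        if value <= upper:
            return score
    return 0.0

def presence_score(env: dict) -> float:
    """Sum of per-item scores: higher means an environment better suited to the target."""
    return sum(item_score(item, v) for item, v in env.items() if item in SCORE_TABLES)
```

A function of the environmental-data value could replace each table without changing `presence_score`.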
  • the estimation unit 15 may estimate a score indicating the possibility that an object other than the object to be analyzed exists based on the environmental data.
  • An object other than the object to be analyzed is, for example, an object that the image recognition model may mistakenly recognize as the object to be analyzed.
  • the estimation unit 15 estimates the accuracy of detection based on, for example, the score indicating the possibility that an object other than the object to be analyzed exists.
  • the estimation unit 15 estimates a score indicating the possibility that an object exists for an object other than the object to be analyzed that may exist in the area to be analyzed.
  • the possibility that an object exists in the area to be analyzed means, for example, that an object has existed in the area to be analyzed in the past.
  • the possibility that an object exists in the area to be analyzed may mean that an object exists frequently in an area having at least one of the same topography and environment as the area to be analyzed, or similar thereto.
  • the estimation unit 15 estimates a score indicating the possibility that marine life exists. For example, when the score for marine life is higher than the score for a ship, the estimation unit 15 estimates that the object detected as a ship is highly likely to be marine life, that is, that the object is unlikely to be a ship.
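The comparison between the score for the analysis target and the scores for other candidate objects can be sketched as below. The function names are illustrative, and the scoring itself is assumed to come from per-item tables such as those described earlier.

```python
def likely_object(candidate_scores: dict) -> str:
    """Return the candidate whose environment-based presence score is highest."""
    return max(candidate_scores, key=candidate_scores.get)

def ship_detection_plausible(candidate_scores: dict) -> bool:
    # The detection "ship" is judged unlikely when another candidate scores higher.
    return likely_object(candidate_scores) == "ship"
```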
  • the formula for estimating the accuracy of detection, and more generally how the accuracy of detection is estimated, are not limited to the above.
  • the estimation unit 15 estimates the accuracy of detection, for example, based on environmental factors that affect the accuracy of the detection result of the image recognition model estimated using the environmental data.
  • the estimation unit 15 may also estimate the accuracy of detection, based on environmental factors that affect the presence of an object that the image recognition model estimated using the environmental data may mistakenly recognize as the object to be analyzed.
  • the environmental factors that affect the accuracy of the detection result of the image recognition model that detects the object to be analyzed are, for example, items of environmental data that have a large effect on the presence or absence of the object.
  • the estimation unit 15 estimates, for example, that the item with the highest score is the environmental factor that affects the accuracy of the detection result.
  • the estimation unit 15 may estimate, for example, that the item whose score meets a criterion is the environmental factor that affects the accuracy of the detection result.
  • the estimation unit 15 may also estimate that the items with the highest scores up to a predetermined rank are the environmental factors that affect the accuracy of the detection result.
  • the estimation unit 15 may also estimate, based on the environmental data, environmental factors that affect the likelihood of the detection result of the image recognition model that detects the object to be analyzed as the estimated reason for the detection accuracy. For example, the estimation unit 15 estimates environmental factors that affect the possibility of the existence of the object to be analyzed estimated using the environmental data as the estimated reason for the detection accuracy. For example, the estimation unit 15 estimates environmental factors that affect the possibility of the existence of an object other than the object to be analyzed estimated using the environmental data as the estimated reason for the detection accuracy. When the object to be analyzed is a ship, the estimation unit 15 estimates the estimated reason for the detection accuracy based on environmental factors that affect the possibility of the existence of a ship or environmental factors that affect the possibility of the existence of an object other than a ship.
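Selecting the environmental factors reported as the estimation reason, as described above, amounts to ranking the per-item scores. The sketch below is one minimal reading of that rule; the function name, item names, and the choice of `top_k` are illustrative.

```python
def estimation_reasons(item_scores: dict, top_k: int = 2) -> list:
    """Items with the largest influence (highest score), best first, up to a
    predetermined rank; these are output as the reason for the estimation."""
    return sorted(item_scores, key=item_scores.get, reverse=True)[:top_k]
```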
  • the estimation unit 15 may estimate the accuracy of detection based on objects present in the area to be analyzed that are estimated using an estimation model that estimates the objects present from environmental data.
  • the estimation model is, for example, a learning model that learns the relationship between environmental data and existing objects.
  • the estimation model is, for example, generated by deep learning using a neural network.
  • the estimation unit 15 may estimate the accuracy of detection based on an object present in the area to be analyzed, which is estimated using an estimation model capable of identifying the reason for estimation.
  • the estimation model capable of outputting the reason for estimation may be generated using, for example, a learning algorithm based on factorized asymptotic Bayesian inference.
  • the learning device performs case classification using a decision tree-type rule with the data of each item of environmental data as input data and the existing object as correct answer data.
  • the learning device then generates a learning model that predicts the degree of realization using a linear model that combines different explanatory variables in each case.
  • the learning device generates the learning model by sequentially performing the processes of optimizing the case classification conditions of the data, generating a prediction model by optimizing the combination of explanatory variables, and deleting unnecessary prediction models.
  • the estimation model may also be a learning model that identifies the reason for estimation based on the change in the estimation result relative to the amount of variation of each item of environmental data.
  • the learning algorithm that generates the estimation model is not limited to the above.
  • the estimation model is also generated, for example, in a system external to the image analysis system 10.
  • the estimation model may be generated in a generation unit (not shown) included in the image analysis system 10.
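The case-classified model described above (decision-tree-style case conditions, with a linear model combining explanatory variables in each case) can be sketched by hand. In a real system both the case conditions and the coefficients would be learned, for example by factorized asymptotic Bayesian inference; everything below, including the item names and coefficients, is invented for illustration.

```python
CASES = [
    # (case condition on the environmental data, linear-model weights, bias)
    (lambda e: e["water_depth_m"] < 5.0,  {"wave_height_m": -0.2}, 0.1),
    (lambda e: e["water_depth_m"] >= 5.0, {"wave_height_m": -0.1}, 0.9),
]

def predict_ship_presence(env: dict) -> float:
    """Score from the first matching case's linear model, clamped to [0, 1]."""
    for cond, weights, bias in CASES:
        if cond(env):
            raw = bias + sum(w * env[item] for item, w in weights.items())
            return min(max(raw, 0.0), 1.0)
    return 0.0
```

Because each prediction is traceable to one case condition and one set of weights, the matching case can also be reported as the reason for the estimation.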
  • the estimation unit 15 may estimate the accuracy of detection based on areas where the object to be analyzed is not present, which is estimated using environmental data.
  • the estimation unit 15 estimates areas where the ship is not present, for example, based on environmental data.
  • the estimation unit 15 estimates that the accuracy of detection is low when a ship is detected in an area where it is estimated that no ship is present.
  • the estimation unit 15 estimates a score indicating the possibility that the object to be analyzed is present for each area, based on environmental data. Then, the estimation unit 15 estimates that the accuracy of detection is low, for example, when a ship is detected in an area where the score is below a standard.
  • the estimation unit 15 may estimate the accuracy of detection based on an area where a specific object other than the object to be analyzed, which is estimated using the environmental data, does not exist.
  • the specific object is, for example, an object that is expected to be erroneously recognized as the object to be analyzed by the image recognition model.
  • the estimation unit 15 estimates a score indicating the possibility that a specific object exists for each area based on the environmental data. Then, the estimation unit 15 estimates that a specific object does not exist in an area where the score is below a standard, for example.
  • the estimation unit 15 estimates that the accuracy of detection is high, for example, when the object to be analyzed is detected in an area where a specific object does not exist.
  • the estimation unit 15 may estimate an area where a specific object other than the object to be analyzed is likely to exist based on the environmental data. For example, when the object to be analyzed is detected in an area where a specific object is likely to exist, the estimation unit 15 estimates that the accuracy of detection is low. The estimation unit 15 estimates a score indicating the possibility that a specific object exists for each area based on the environmental data. Then, the estimation unit 15 estimates that an area where a score is equal to or higher than a standard is an area where a specific object is likely to exist.
  • the specified object is set, for example, by the person in charge of the analysis.
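The area-based check described above can be sketched as follows: per-area presence scores derived from environmental data mark areas where the target is estimated to be absent, and detections in those areas are flagged as low-accuracy. The area names, scores, and threshold are invented for the example.

```python
AREA_SCORES = {"A1": 0.05, "A2": 0.7, "A3": 0.4}   # presence scores from environmental data
ABSENT_THRESHOLD = 0.1                             # below this, the target is estimated absent

def areas_without_target() -> set:
    """Areas whose score falls below the standard."""
    return {a for a, s in AREA_SCORES.items() if s < ABSENT_THRESHOLD}

def detection_accuracy(area: str) -> str:
    # A ship detected in an area estimated to contain no ship gets low accuracy.
    return "low" if area in areas_without_target() else "high"
```

The same structure, with scores for a specific non-target object instead, covers the converse check described above.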
  • the estimation unit 15 may further use the ship's identification signal to estimate the accuracy of detection.
  • the estimation unit 15 estimates the accuracy of detection based on, for example, the AIS identification signal.
  • the estimation unit 15 estimates that the accuracy of detection is high, for example, when the type of ship indicated by the detection result matches the type of ship indicated by the AIS identification signal.
  • the estimation unit 15 may estimate, for example, that the accuracy of detection is higher the more items match between the detection result and the information indicated by the AIS identification signal. For example, the estimation unit 15 estimates that the accuracy of detection is higher when both the type of ship and the course match between the detection result and the information indicated by the AIS identification signal than when only the type of ship matches.
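The AIS cross-check described above can be sketched as a count of matching fields: the more fields that agree between the image-based detection and the identification signal, the higher the estimated accuracy. The field names and the mapping from match count to accuracy level are illustrative.

```python
def ais_match_count(detection: dict, ais: dict) -> int:
    """Number of fields on which the detection result and the AIS record agree."""
    return sum(1 for k in detection if k in ais and detection[k] == ais[k])

def accuracy_from_ais(detection: dict, ais: dict) -> str:
    matches = ais_match_count(detection, ais)
    return {0: "low", 1: "medium"}.get(matches, "high")
```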
  • the output unit 16 outputs the detection result and information indicating the accuracy of the detection.
  • the output unit 16 outputs the detection result and information indicating the accuracy of the detection to, for example, the terminal device 20.
  • the output unit 16 outputs, for example, a satellite image, the detection result of the object to be analyzed in the satellite image, and information indicating the accuracy of the detection.
  • the output unit 16 outputs the detection result by surrounding the area in which the object to be analyzed is detected on the satellite image with a figure.
  • the figure surrounding the area in which the object to be analyzed is detected is, for example, a rectangle, but is not limited to a rectangle.
  • the output unit 16 outputs, for example, a satellite image in which the line surrounding the area in which the object to be analyzed is detected on the satellite image is changed according to the level of the accuracy of the detection.
  • the output unit 16 outputs, for example, a satellite image in which at least one of the shape, thickness, and color of the line surrounding the area in which the object to be analyzed is detected on the satellite image is changed according to the level of the accuracy of the detection.
  • the output unit 16 may output a satellite image in which the shape of a figure surrounding an area in which the object to be analyzed is detected on the satellite image is changed according to the level of detection accuracy.
  • the output unit 16 may output a numerical value, letter, or symbol indicating the level of detection accuracy by superimposing it on the satellite image.
  • the form in which the detection accuracy is output is not limited to the above.
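One way the styling described above could be realized is to map the estimated accuracy onto the colour, thickness, and line shape of the figure drawn around a detection. The accuracy bands and style values below are invented for the example.

```python
def box_style(accuracy: float) -> dict:
    """Line style for the figure surrounding a detected object,
    changed according to the level of detection accuracy."""
    if accuracy >= 0.8:
        return {"color": "green", "thickness": 1, "dash": "solid"}
    if accuracy >= 0.5:
        return {"color": "yellow", "thickness": 2, "dash": "solid"}
    return {"color": "red", "thickness": 3, "dash": "dashed"}
```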
  • the output unit 16 may output candidates for objects other than the object to be analyzed that are estimated by the estimation unit 15 based on the environmental data.
  • the output unit 16 outputs information on candidates for objects other than the object to be analyzed that are estimated by the estimation unit 15 based on the environmental data, in association with an area in which the image recognition model detects the object.
  • the output unit 16 outputs names of candidates for objects other than the object to be analyzed that are estimated by the estimation unit 15 based on the environmental data, in association with an area in which the image recognition model detects the object.
  • the output unit 16 may further output the reason for estimating the accuracy of detection.
  • the output unit 16 outputs the reason for estimating the accuracy of detection that the estimation unit 15 estimates based on the environmental data.
  • the output unit 16 outputs the reason for estimating that the detected object is the object to be analyzed.
  • the output unit 16 may output the reason for estimating that the detected object is an object other than the object to be analyzed.
  • the output unit 16 may output, as a reference image, a satellite image in which an object of the same type as the detected object has been identified. For example, if the detected object is a large ship, the output unit 16 may output, as a reference image, a satellite image that was identified in a past analysis as showing a large ship. The output unit 16 may output, as a reference image, a satellite image of an object other than the analysis target that may be present in the area being analyzed. For example, if the analysis target is a ship, the output unit 16 may output a satellite image of an object other than a ship.
  • the output unit 16 may output a satellite image captured using a method different from that of the satellite image being analyzed. For example, if the satellite image being analyzed is an image captured by a SAR, the output unit 16 may further output an optical image in the visible light region that includes the same range as the range in which the image captured by the SAR was captured.
  • the output unit 16 may, for example, output an image of an enlarged area selected by the person in charge of analysis.
  • when the output unit 16 detects, for example, that an area on the display screen in which the object to be analyzed is detected has been selected by an operation of the person in charge of analysis, it outputs an enlarged image of the selected area.
  • the output unit 16 may also output an image of the area selected by the person in charge of analysis that has a higher resolution than other areas.
  • the output unit 16 may also output, as a reference image, a satellite image that has been identified in a past analysis as containing the object to be analyzed in the area selected by the person in charge of analysis.
  • the output unit 16 may also output, as a reference image, a satellite image that has been identified in a past analysis as containing a candidate object other than the object to be analyzed in the area selected by the person in charge of analysis.
  • for example, the reason for the estimation that the object is something other than the object to be analyzed is displayed as "temperature" and "tides."
  • because the observation data for temperature and tides, which are part of the environmental data, are suitable for the presence of a seal, it is shown that the possibility that the object is a ship is low and the possibility that it is a seal is high.
  • the storage unit 17 stores, for example, data used in the analysis of satellite images.
  • the storage unit 17 stores, for example, satellite images acquired by the satellite image acquisition unit 12.
  • the storage unit 17 stores, for example, environmental data acquired by the environmental data acquisition unit 13.
  • the storage unit 17 stores, for example, a table showing the relationship between environmental data and a score indicating the possibility of the presence of an object.
  • the storage unit 17 stores, for example, an image recognition model.
  • the storage unit 17 stores, for example, an estimation model.
  • the image recognition model and the estimation model may be stored in a storage means other than the storage unit 17.
  • the storage unit 17 stores, for example, the detection results of the image recognition model.
  • the storage unit 17 may store, for example, an estimation reason for the accuracy of detection.
  • the storage unit 17 may store, as a reference image, a satellite image identified in a past analysis as containing an object to be analyzed.
  • the storage unit 17 may store, as a reference image, a satellite image identified in a past analysis as containing an object other than the object to be analyzed.
  • the terminal device 20 is, for example, a terminal device used by an analyst to analyze satellite images.
  • the terminal device 20 acquires satellite images, detection results of objects captured in the satellite images, and the accuracy of detection, for example, from the output unit 16 of the image analysis system 10.
  • the terminal device 20 then outputs the satellite images, detection results of objects captured in the satellite images, and the accuracy of detection, for example, to a display device not shown.
  • the terminal device 20 acquires an estimated reason for the accuracy of detection, for example, from the output unit 16 of the image analysis system 10.
  • the terminal device 20 then outputs an estimated reason for the accuracy of detection, for example, to a display device not shown.
  • the terminal device 20 may, for example, acquire information about the analysis target input by a user's operation.
  • the terminal device 20 outputs the information about the analysis target to the acquisition unit 11 of the image analysis system 10, for example.
  • the terminal device 20 may be, for example, a personal computer, a tablet computer, or a smartphone.
  • the terminal device 20 is not limited to the above examples.
  • the satellite image management server 30 manages, for example, images of the earth's surface captured by an imaging device mounted on an artificial satellite.
  • the satellite image management server 30 acquires satellite images of the earth's surface captured by an imaging device mounted on an artificial satellite, for example, from a ground station that communicates with the artificial satellite.
  • the satellite images are associated with, for example, the image capture date and time and the image capture location.
  • the satellite image management server 30 then associates and stores the acquired satellite images with the image capture date and time and the image capture location.
  • the satellite image management server 30 may identify the image capture location from the position of the satellite at the time of image capture and imaging-related parameters added to the satellite image.
  • the imaging-related parameters are, for example, the direction in which electromagnetic waves in the band used for image capture are transmitted, the transmission angle of the electromagnetic waves relative to the earth's surface, and the reception accuracy of the electromagnetic waves reflected from the earth's surface.
  • when the satellite image management server 30 receives a request for a satellite image from the image analysis system 10, it outputs the requested satellite image, together with the image capture date and time and the capture location of the satellite image, to the satellite image acquisition unit 12 of the image analysis system 10.
  • satellite images may be stored in different servers for each management entity of the artificial satellite that captured the image. The number of satellite image management servers 30 may be set as appropriate.
  • the environmental data management server 40 manages environmental data.
  • the environmental data management server 40 for example, acquires environmental data.
  • the environmental data management server 40 then stores the acquired environmental data in association with information on the observation date and time and the observation point.
  • the environmental data management server 40 stores, for example, observation data on water depth, water temperature, air temperature, tides, wave height, and wind speed.
  • the environmental data management server 40 also stores, for example, topographical data.
  • when the environmental data management server 40 receives a request for environmental data from the image analysis system 10, it outputs, for example, the requested environmental data, together with the observation date and time and the observation point of the environmental data, to the environmental data acquisition unit 13 of the image analysis system 10.
  • the environmental data may be stored in a different server for each observer of the environmental data.
  • the number of environmental data management servers 40 may be set as appropriate.
  • Figure 8 shows an example of the operation flow when the image analysis system 10 detects an object captured in a satellite image and estimates the accuracy of the detection.
  • the satellite image acquisition unit 12 acquires satellite images of the area to be analyzed (step S11).
  • the satellite image acquisition unit 12 acquires satellite images of the area to be analyzed, for example, from the satellite image management server 30.
  • the environmental data acquisition unit 13 also acquires environmental data for the area to be analyzed (step S12).
  • the environmental data acquisition unit 13 acquires the environmental data for the area to be analyzed from, for example, the environmental data management server 40.
  • the detection unit 14 detects the object to be analyzed that appears in the satellite image (step S13).
  • the detection unit 14 detects the object to be analyzed that appears in the satellite image, for example, using an image recognition model.
  • the output unit 16 outputs the result of detection of the object to be analyzed and the accuracy of detection (step S16).
  • the output unit 16 outputs the result of detection of the object to be analyzed and the accuracy of detection to, for example, the terminal device 20.
  • when the detection result of the object to be analyzed and the detection accuracy are output, if there is a satellite image for which the process of detecting the object to be analyzed has not yet been performed (No in step S17), the process returns to step S13, and the detection unit 14 detects the object to be analyzed captured in a satellite image for which detection has not yet been performed.
  • when the detection result of the object to be analyzed and the detection accuracy are output in step S16, if the process of detecting the object to be analyzed has been performed for all satellite images (Yes in step S17), the image analysis system 10 ends the process of detecting objects in the satellite images and estimating the detection accuracy.
  • if the object to be analyzed is not detected in the satellite image in step S14 (No in step S14), and there is a satellite image for which the process of detecting the object to be analyzed has not been performed (No in step S17), the process returns to step S13, and the detection unit 14 detects the object to be analyzed in other satellite images that have not yet been analyzed.
  • if the object to be analyzed is not detected in the satellite image in step S14 (No in step S14), and the process of detecting the object to be analyzed has been performed for all satellite images (Yes in step S17), the image analysis system 10 ends the process of detecting objects in the satellite images and estimating the accuracy of detection.
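The S11 to S17 flow described above can be compressed into a loop sketch. Here `detect`, `estimate_accuracy`, and `output` are hypothetical stand-ins for the detection unit 14, the estimation unit 15, and the output unit 16; none of these names appear in the application.

```python
def analyze(images, env_data, detect, estimate_accuracy, output):
    """Run detection and accuracy estimation over every acquired satellite image."""
    for image in images:                              # repeat until all images are processed (S17)
        objects = detect(image)                       # S13/S14: detect analysis targets in the image
        for obj in objects:
            acc = estimate_accuracy(obj, env_data)    # accuracy estimated from environmental data
            output(obj, acc)                          # S16: detection result and detection accuracy
```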
  • the image analysis system 10 detects an object to be analyzed from a satellite image of the area to be analyzed.
  • the image analysis system 10 estimates the accuracy of detection of the object to be analyzed based on the environmental data of the area to be analyzed.
  • the image analysis system 10 then outputs the result of detection of the object to be analyzed and information indicating the accuracy of detection. In this way, by outputting the result of detection of the object to be analyzed and information indicating the accuracy of detection, the person in charge of analyzing the satellite image can confirm whether the object shown in the satellite image is the object to be analyzed while referring to the information indicating the accuracy of detection output as the accuracy of detection.
  • the person in charge of analysis can confirm whether the object shown in the satellite image is the object to be analyzed while taking into account the possibility of marine life being present. Also, if there is a high possibility that there is no ship based on the topography and water currents, the person in charge of analysis can analyze the satellite image while taking into account that if it is a ship, it is an unusual ship. In this way, by analyzing the satellite image using the result of detection of the object to be analyzed and information indicating the accuracy of detection, it becomes easier to identify objects present in the area to be analyzed. Therefore, by using the image analysis system 10, it is possible to easily analyze objects present in the area being analyzed.
  • FIG. 9 shows an example of the configuration of a computer 100 that executes a computer program that performs each process in the image analysis system 10.
  • the computer 100 comprises a CPU (Central Processing Unit) 101, memory 102, a storage device 103, an input/output I/F (Interface) 104, and a communication I/F 105.
  • the CPU 101 reads out and executes computer programs for performing each process from the storage device 103.
  • the CPU 101 may be configured by a combination of multiple CPUs.
  • the CPU 101 may also be configured by a combination of a CPU and another type of processor.
  • the CPU 101 may be configured by a combination of a CPU and a GPU (Graphics Processing Unit).
  • the memory 102 is configured by a DRAM (Dynamic Random Access Memory) or the like, and temporarily stores the computer programs executed by the CPU 101 and data being processed.
  • the storage device 103 stores the computer programs executed by the CPU 101.
  • the storage device 103 is configured by, for example, a non-volatile semiconductor storage device. Other storage devices such as a hard disk drive may be used for the storage device 103.
  • the computer programs used to execute each process can also be distributed by storing them on a computer-readable recording medium that records data non-transitorily.
  • as the computer-readable recording medium, for example, a magnetic tape for recording data or a magnetic disk such as a hard disk can be used.
  • an optical disk such as a CD-ROM (Compact Disc Read Only Memory) can also be used as the recording medium.
  • a non-volatile semiconductor memory device can also be used as the recording medium.
  • An image acquisition means for acquiring a satellite image of an area to be analyzed;
  • An environmental data acquisition means for acquiring environmental data of an area to be analyzed;
  • a detection means for detecting an object to be analyzed that is captured in a satellite image;
  • an estimation means for estimating a degree of accuracy of the detection based on the environmental data;
  • an output means for outputting a result of the detection and information indicating a degree of accuracy of the detection.
  • the detection accuracy is an index indicating the likelihood that the object detected by the detection means is the object to be analyzed.
  • the estimation means estimates the accuracy of the detection based on environmental factors, estimated using the environmental data, that affect the accuracy of a detection result of an image recognition model that detects the object to be analyzed.
  • the estimation means estimates the accuracy of the detection based on environmental factors, estimated using the environmental data, that affect the presence of an object that the image recognition model may mistakenly recognize as the object to be analyzed.
  • the estimation means estimates a degree of accuracy of the detection based on a detection result of the image recognition model and the presence or absence of the object to be analyzed estimated using the environmental data. The image analysis system of claim 3.
  • the object to be analyzed is a ship,
  • the estimation means estimates the accuracy of the detection based on an environmental factor indicating that the object detected by the image recognition model is other than the ship.
  • the estimation means estimates the accuracy of the detection based on an area where the ship is not present, which is estimated using the environmental data.
  • the output means outputs a satellite image relating to the object other than the ship estimated by the estimation means.
  • the environmental data is data on at least one of air temperature, water temperature, water flow, water depth, and topography. An image analysis system according to any one of claims 6 to 8.
  • the estimation means further uses an identification signal of the ship to estimate the accuracy of the detection.
  • the output means further outputs a reason for the estimation of the accuracy of the detection.
  • the output means outputs a satellite image in which a line surrounding an area in which the object to be analyzed is detected on the satellite image is changed according to a level of accuracy of the detection.
  • An image analysis system according to any one of claims 1 to 11.
  • [Appendix 14] An image analysis method comprising: acquiring a satellite image of an area to be analyzed; acquiring environmental data of the area to be analyzed; detecting an object to be analyzed that is captured in the satellite image; estimating the accuracy of the detection based on the environmental data; and outputting a result of the detection and information indicating the accuracy of the detection.

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Image Processing (AREA)

Abstract

This image analysis system is provided with a satellite image acquisition unit, an environmental data acquisition unit, a detection unit, an estimation unit, and an output unit. The satellite image acquisition unit acquires a satellite image of an area to be analyzed. The environmental data acquisition unit acquires environmental data of the area to be analyzed. The detection unit detects an object to be analyzed in the satellite image. The estimation unit estimates the accuracy of the detection on the basis of the environmental data. The output unit outputs a result of the detection and information indicating the accuracy of the detection.

Description

Image analysis system, image analysis method, and recording medium

The present disclosure relates to an image analysis system and the like.

Satellite images captured by artificial satellites are widely used to analyze objects on the earth's surface. For example, SAR (Synthetic Aperture Radar) can make observations regardless of the presence or absence of sunlight and is not affected by clouds, so satellite images captured by SAR are well suited to analyzing objects on the earth's surface. For this reason, satellite images captured by SAR are sometimes used to identify objects present in the area being analyzed. On the other hand, because such images are formed by observing electromagnetic waves reflected from the earth's surface, an image captured by SAR is rendered in monochrome gradations. In an image rendered in monochrome gradations, the contrast ratio between, for example, the sea or land and the object being analyzed may be insufficient, making it difficult to distinguish objects in the satellite image. For this reason, analysis of objects in satellite images is sometimes performed using data other than the satellite images themselves.

The data analysis device of Patent Document 1 acquires satellite images and observation data of the environment at the earth's surface. Based on that observation data, it generates analysis data by correcting analysis data derived from the satellite images.

International Publication No. 2022/107619

With the technology described in Patent Document 1, it can be difficult to identify objects present in the area being analyzed.

To solve the above problem, the present disclosure aims to provide an image analysis system and the like that can facilitate the identification of objects present in the area being analyzed.

To solve the above problem, the image analysis system of the present disclosure comprises: an image acquisition means for acquiring a satellite image of an area to be analyzed; an environmental data acquisition means for acquiring environmental data of the area to be analyzed; a detection means for detecting an object to be analyzed that appears in the satellite image; an estimation means for estimating the accuracy of the detection based on the environmental data; and an output means for outputting a result of the detection and information indicating the accuracy of the detection.

The image analysis method of the present disclosure acquires a satellite image of an area to be analyzed, acquires environmental data of the area to be analyzed, detects an object to be analyzed that appears in the satellite image, estimates the accuracy of the detection based on the environmental data, and outputs a result of the detection and information indicating the accuracy of the detection.

The recording medium of the present disclosure non-transitorily records an image analysis program that causes a computer to execute: a process of acquiring a satellite image of an area to be analyzed; a process of acquiring environmental data of the area to be analyzed; a process of detecting an object to be analyzed that appears in the satellite image; a process of estimating the accuracy of the detection based on the environmental data; and a process of outputting a result of the detection and information indicating the accuracy of the detection.

According to the present disclosure, it is possible to facilitate the identification of objects present in the area being analyzed.

FIG. 1 is a diagram illustrating an example of a configuration according to an embodiment of the present disclosure.
FIG. 2 is a diagram schematically illustrating a form in which the earth's surface is imaged by an artificial satellite.
FIG. 3 is a diagram illustrating an example of a satellite image according to an embodiment of the present disclosure.
FIG. 4 is a diagram illustrating an example of a satellite image according to an embodiment of the present disclosure.
FIG. 5 is a diagram illustrating an example of a satellite image according to an embodiment of the present disclosure.
FIG. 6 is a diagram illustrating an example of the configuration of the image analysis system according to an embodiment of the present disclosure.
FIG. 7 is a diagram illustrating an example of a display screen of an analysis result according to an embodiment of the present disclosure.
FIG. 8 is a diagram illustrating an example of an operation flow of the image analysis system according to an embodiment of the present disclosure.
FIG. 9 is a diagram illustrating an example of a hardware configuration of the image analysis system according to an embodiment of the present disclosure.

Embodiments of the present disclosure will be described in detail with reference to the drawings. FIG. 1 is a diagram showing an example of the configuration of an analysis system. The analysis system includes, for example, an image analysis system 10, a terminal device 20, a satellite image management server 30, and an environmental data management server 40. The image analysis system 10 connects to the terminal device 20, for example, via a network, and likewise connects to the satellite image management server 30 and to the environmental data management server 40 via a network. There may be multiple terminal devices 20; their number may be set as appropriate.

The image analysis system 10 is, for example, a system that analyzes satellite images. A satellite image is, for example, an image of the earth's surface captured by an imaging device mounted on an artificial satellite, typically a SAR (Synthetic Aperture Radar), though it may be captured by an imaging device other than a SAR. A satellite image is used, for example, to detect objects captured in it, and such detection is performed, for example, by image recognition. However, an image captured by a SAR is a monochrome image based on electromagnetic waves reflected by the earth's surface, so the accuracy of object detection by image recognition may not be sufficient. For example, when the subject of analysis is a ship, image recognition may detect an object other than a ship as a ship. For this reason, when analysis is performed using an image captured by a SAR, the person in charge of analysis may identify the objects in the image after object detection by image recognition, for example by visually checking the captured image.

FIG. 2 is a diagram schematically showing an example of imaging the earth's surface with an artificial satellite. In the example of FIG. 2, a ship is sailing on the sea, and the area in which the ship is sailing is imaged by an imaging device mounted on the satellite, for example a SAR. The satellite image captured by the imaging device is output to the satellite image management server 30, for example via a ground station. Also in the example of FIG. 2, environmental data is observed by an observation sensor installed on the sea, for example an observation buoy that observes water temperature and waves (wave height, wave direction, and wave period). The observation sensor is not limited to an observation buoy, and the items of environmental data observed are not limited to those mentioned above. The observed environmental data is output to, for example, the environmental data management server 40.

The image analysis system 10 acquires satellite images captured by an artificial satellite from, for example, the satellite image management server 30, and acquires environmental data observed by observation sensors from, for example, the environmental data management server 40. Using, for example, an image recognition model, the image analysis system 10 detects an object to be analyzed that appears in the acquired satellite image. The image analysis system 10 then estimates the accuracy of detection of the object to be analyzed in the detection result of the image recognition model, for example based on the acquired environmental data of the area to be analyzed. The accuracy of detection is an index indicating how likely it is that the object detected by the image recognition model really is the object to be analyzed. For example, when the object to be analyzed is a ship and the image analysis system 10 detects a ship in the satellite image, the detection accuracy is an index of the possibility that the detected object is in fact a ship. The image recognition model is a learning model that detects objects captured in satellite images; it is described later.
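The processing flow described above — detect candidate objects with a recognition model, then estimate each detection's accuracy from environmental data — can be sketched as follows. All names, thresholds, and rules here are illustrative assumptions for explanation only, not part of the disclosed system.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    box: tuple             # (x, y, width, height) of the candidate in the image
    label: str             # e.g. "ship"
    accuracy: float = 0.0  # likelihood that the object really is the target

def detect_objects(satellite_image) -> List[Detection]:
    """Stand-in for the image recognition model (a real system runs inference)."""
    return [Detection(box=(120, 80, 30, 12), label="ship")]

def estimate_accuracy(det: Detection, env: dict) -> float:
    """Toy accuracy estimate from environmental data (rules are invented)."""
    score = 0.8  # hypothetical base confidence from the recognition model
    if env.get("water_depth_m", 100) < 5:
        score -= 0.4  # large vessels cannot navigate very shallow water
    if env.get("marine_life_likely", False):
        score -= 0.3  # the detection may actually be marine life
    return max(0.0, min(1.0, score))

def analyze(satellite_image, env: dict) -> List[Detection]:
    """Detect objects, then attach an environment-based accuracy to each."""
    detections = detect_objects(satellite_image)
    for det in detections:
        det.accuracy = estimate_accuracy(det, env)
    return detections  # results and accuracy are then output together
```

The detection result and the accuracy value travel together, which is what lets the output stage present both to the analyst.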

Environmental data is, for example, data on an environment that can affect the presence or absence of an object, such as observation data of that environment. Environmental data may be data on topography, or data on water depth, including observations of changes in water depth due to tides. For example, when the subject of analysis is a ship, its navigation or anchoring can be affected by water depth, tidal currents, waves, wind speed, and topography: a large ship cannot navigate shallow water; an ordinary small ship avoids places with fast currents and high waves because navigating them is dangerous; and the area near a steep cliff is unsuitable for anchoring. In this way, the likelihood that a ship is present at a target point can change with the conditions indicated by the environmental data at that point, so when the subject of analysis is a ship, the likelihood that a ship is present can be estimated from the environmental data. Therefore, when the image recognition model detects a ship, the image analysis system 10 can estimate the accuracy of that detection based on the environmental data of the area being analyzed.
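As a concrete illustration of how such environmental constraints might be encoded, the following sketch checks whether a vessel of a given draft could plausibly be present at a point. The thresholds and field names are invented for illustration; a real system would calibrate them against actual vessel and sea-state data.

```python
def ship_presence_plausible(env: dict, draft_m: float) -> bool:
    """Rough plausibility check for a vessel at a point, from environmental data.

    The numeric thresholds below are illustrative assumptions, not values
    taken from the disclosure.
    """
    # A vessel cannot float where the water is shallower than its draft.
    if env["water_depth_m"] <= draft_m:
        return False
    # Ordinary small craft avoid fast currents and high waves.
    if draft_m < 3.0 and (env["current_knots"] > 6 or env["wave_height_m"] > 4):
        return False
    return True
```

A detection in a region where this check fails would then be flagged as low-accuracy, or as a possibly unusual vessel worth closer inspection.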

Furthermore, when the subject of analysis is a ship, it may be difficult to distinguish the ship from other objects such as marine life. The presence or absence of marine life can be affected by the environment, for example by water depth, water temperature, air temperature, tidal currents, waves, wind speed, and topography, though the relevant environmental items are not limited to these. For example, when the area to be analyzed is a sea area where marine life may be present and image recognition detects a ship, the image analysis system 10 estimates the accuracy of the detection based on environmental data related to the presence of marine life. When the environmental data suggests that the detected object is unlikely to be marine life, the image analysis system 10 can estimate that the accuracy of the ship detection by the image recognition model is high; conversely, when the environmental data suggests that the object is likely to be marine life, the image analysis system 10 estimates that the accuracy of the ship detection is low.
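One way this adjustment could look in practice is to derive a marine-life likelihood from the environmental data and discount the model's ship score by it. The habitat rules and coefficients below are hypothetical placeholders, not conditions stated in the disclosure.

```python
def marine_life_likelihood(env: dict) -> float:
    """Toy likelihood that a detected blob is marine life, from environmental data.

    A real system might fit a statistical model; these rules are illustrative.
    """
    likelihood = 0.1
    # Hypothetical habitat condition: moderate water temperature.
    if 15 <= env.get("water_temp_c", 0) <= 25:
        likelihood += 0.4
    # Hypothetical condition: surface sightings assumed rarer over deep water.
    if env.get("water_depth_m", 0) > 200:
        likelihood -= 0.05
    return max(0.0, min(1.0, likelihood))

def adjust_ship_accuracy(model_score: float, env: dict) -> float:
    """Lower the ship-detection accuracy when marine life is a plausible cause."""
    return max(0.0, model_score * (1.0 - marine_life_likelihood(env)))
```

When the environment makes marine life improbable, the model's score passes through nearly unchanged; when it is probable, the reported accuracy drops.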

The image analysis system 10 outputs, for example to the terminal device 20, the detection result for the object to be analyzed in the satellite image together with the detection accuracy. The terminal device 20 outputs them, for example, to a display device (not shown). The person in charge of analysis identifies objects by referring to the displayed detection result and detection accuracy. For example, when a ship to be analyzed has been detected but the accuracy estimate based on the environmental data indicates that the object is probably not a ship, the analyst refers to that estimate in judging whether it is a ship, for example by consulting satellite images from past analyses in which an object was identified as something other than a ship. In such a case the analyst may also identify the object while weighing the possibility that a suspicious ship is present, because a ship present in an area where the environmental data makes a ship unlikely may be navigating or anchoring with an unusual intention, that is, for a purpose different from the usual purpose in the sea area where it is navigating or anchored. Identifying objects in the satellite image by referring to both the detection result and the detection accuracy can thus make the task of identifying objects in the satellite image easier.

FIG. 3 shows an example of a satellite image of the earth's surface captured by a SAR. The example shows the area around a bay when no ships or other objects are present in the imaged area. In this example, the low-brightness areas are land and the high-brightness areas are sea.

FIG. 4 is an example of a satellite image of the same area as in FIG. 3, captured by the SAR when a ship and objects other than ships are present. In this example, a ship and other objects appear as elliptical shapes in the sea area, and objects also appear as elliptical and circular shapes in the land area. The image analysis system 10 detects the object to be analyzed in this satellite image; if the object to be analyzed is a ship, it detects the ship using, for example, an image recognition model. When a ship is detected in the satellite image of FIG. 4, the image analysis system 10 estimates the likelihood that the detected object is a ship based on environmental data for the imaged area.

FIG. 5 is an example of a display screen showing the detection results for the object to be analyzed and the estimated detection accuracy, for the case where objects are detected in the satellite image of FIG. 4. In this example, each area in which the object to be analyzed was detected is enclosed in a solid or dashed rectangle: a solid rectangle marks an area where the detection accuracy is at or above a reference value, and a dashed rectangle marks an area where it is below the reference. By outputting an image like this, the person in charge of analysis can see which regions have low detection accuracy and require more detailed analysis. The analyst can then identify the objects in the satellite image by analyzing the dashed regions in detail, taking into account that the environment-based accuracy estimate indicates the object is unlikely to be a ship.
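The solid/dashed styling described for FIG. 5 amounts to a threshold on the detection accuracy. A minimal sketch of that selection logic, with an assumed threshold value and data layout, could look like this:

```python
def frame_style(accuracy: float, threshold: float = 0.5) -> str:
    """Choose how to draw the rectangle around a detection (cf. FIG. 5).

    Detections at or above the threshold get a solid frame; the rest get a
    dashed frame, flagging regions that need closer manual analysis. The
    threshold value here is an illustrative assumption.
    """
    return "solid" if accuracy >= threshold else "dashed"

def render_annotations(detections):
    """Return (box, style) pairs that a drawing layer could consume."""
    return [(d["box"], frame_style(d["accuracy"])) for d in detections]
```

A drawing layer would then overlay each box on the satellite image using the selected line style.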

A specific example of the configuration of the image analysis system 10 will now be described. FIG. 6 is a diagram showing an example of the configuration of the image analysis system 10.

As its basic configuration, the image analysis system 10 comprises a satellite image acquisition unit 12, an environmental data acquisition unit 13, a detection unit 14, an estimation unit 15, and an output unit 16. The image analysis system 10 also comprises, for example, an acquisition unit 11 and a storage unit 17.

The acquisition unit 11 acquires, for example, information about the analysis target, for example from the terminal device 20. The information about the analysis target indicates, for example, at least one of the area to be analyzed and the object to be analyzed.

The area to be analyzed is, for example, information indicating the extent of the analysis target, and may be set in advance. For example, if the image analysis system 10 analyzes only a specific area, the acquisition unit 11 need not acquire information indicating the area to be analyzed.

The information about the object to be analyzed indicates, for example, what is to be detected. The object to be analyzed is, for example, a ship; in that case the ship may be one that is sailing, one that is anchored, or both, and may even be a ship located on land. The object to be analyzed may also be an aircraft, a vehicle, a structure, or a stored item on land, and is not limited to the above. The object to be analyzed may be set in advance: for example, if the image analysis system 10 is used only to analyze the presence or absence of ships, the acquisition unit 11 need not newly acquire information indicating that ships are to be detected.

The information indicating the object to be analyzed may be the type of the object. For example, if the object to be analyzed is a ship, the type may be whether it is a large, medium-sized, or small ship; whether it is a tanker, passenger ship, cargo ship, ferry, work boat, or fishing boat; or whether it is a ship capable of carrying an air squadron, a conventional ship, or a submarine. Neither the types of ships nor the types of objects are limited to the above.

The information on the subject of analysis may include information indicating the timing to be analyzed. The information indicating the timing to be analyzed is, for example, information indicating the timing at which the satellite image to be analyzed was captured and the timing at which the environmental data was observed. The information indicating the timing to be analyzed is set by a date and time, a day, or a period. The period to be analyzed is set, for example, using the first day and the last day of the period. How the period to be analyzed is set is not limited to the above. The information on the subject of analysis may also be information indicating which area within the region to be analyzed is to be analyzed. The information on the subject of analysis is not limited to the above. The information on the subject of analysis is set, for example, by the person in charge of the analysis.

The satellite image acquisition unit 12 acquires satellite images of the region to be analyzed. The satellite image acquisition unit 12 acquires, for example, satellite images to which information on the capture location and the capture date and time is added. The information on the capture location is, for example, the latitude and longitude of the point at the center of the image. The information on the capture location is not limited to the above, and may be any information that identifies the location where the image was captured. The satellite image acquisition unit 12 acquires the satellite images of the region to be analyzed, for example, via the satellite image management server 30. The satellite image acquisition unit 12 may also acquire the satellite images of the region to be analyzed via a storage medium.

When the acquisition unit 11 has acquired information indicating the region to be analyzed, the satellite image acquisition unit 12 may acquire satellite images of the region indicated by that information. When the acquisition unit 11 has acquired information indicating the timing to be analyzed, the satellite image acquisition unit 12 may acquire satellite images captured at the timing indicated by that information.

The satellite image is, for example, an image captured by a SAR mounted on an artificial satellite. The satellite image is not limited to an image captured by a SAR. The satellite image acquisition unit 12 may also acquire satellite images captured by multiple imaging methods. The satellite image acquisition unit 12 acquires, for example, a satellite image captured by a SAR and an optical image in the visible light region. The satellite image acquisition unit 12 may also acquire a satellite image in the infrared region. The satellite images captured by the multiple imaging methods need not cover the same range, as long as each includes the region to be analyzed.

The environmental data acquisition unit 13 acquires environmental data for the region to be analyzed. The environmental data acquisition unit 13 acquires, for example, environmental data to which the observation location and the observation date and time are added. The information on the observation location is, for example, the latitude and longitude of the point where the environmental data was observed. The information on the observation location is not limited to the above. The environmental data may also be time-series data of observed values of the environment. The environmental data acquisition unit 13 acquires the environmental data for the region to be analyzed, for example, from the environmental data management server 40. The environmental data acquisition unit 13 may acquire the environmental data for the region to be analyzed from multiple environmental data management servers 40. The environmental data acquisition unit 13 may also acquire, as the environmental data, data from a server providing environmental data operated by the Japan Meteorological Agency or another government agency.

The environmental data is, for example, data on at least one of the following items: water depth, ocean weather, and topography. The ocean weather data is data on at least one of the following items: water temperature, air temperature, waves, and wind speed. Ocean weather is not limited to the above. Furthermore, the environmental data is not limited to the above. The environmental data acquisition unit 13 may acquire environmental data for items that are set based on the object to be analyzed. For example, when the object to be analyzed is a ship, the environmental data acquisition unit 13 acquires, as the environmental data, data on water depth, waves, and topography, which are necessary for estimating the presence or absence of a ship. The items for estimating the presence or absence of a ship are set, for example, by the person in charge of the analysis or the operator of the image analysis system 10.

The environmental data acquisition unit 13 may acquire environmental data for items that are set based on at least one of the region and the timing to be analyzed. For example, when the region to be analyzed is one where marine life exists in winter, and satellite images captured in winter are analyzed, the environmental data acquisition unit 13 acquires environmental data for the items necessary for estimating the presence or absence of marine life, observed in winter.

When the acquisition unit 11 has acquired information indicating the region to be analyzed, the environmental data acquisition unit 13 may acquire environmental data observed in the region indicated by that information. When the acquisition unit 11 has acquired information indicating the timing to be analyzed, the environmental data acquisition unit 13 may acquire environmental data observed at the timing indicated by that information.

The environmental data acquisition unit 13 may acquire the environmental data so that, for example, it includes environmental data whose observation period corresponds to the capture time of the satellite image. The capture time of the satellite image and the observation period of the environmental data correspond when the capture time falls within a period in which the variation of the environmental data stays within the range of variation that can occur under the assumed conditions. For example, if the tidal current is constant throughout winter, a satellite image captured at any point during winter and tidal current data observed at any point during winter may be treated as having a corresponding capture time and observation period. Furthermore, when an average value affects the presence or absence of an object, the environmental data acquisition unit 13 acquires data on the average value of the environmental data over a predetermined period including the capture time of the satellite image. The predetermined period is set, for example, by the person in charge of the analysis. In the case of data that changes with the date and time, such as wave height, the environmental data acquisition unit 13 acquires wave height data whose observation date and time is the same as the date and time at which the satellite image was captured. Data treated as "the same" include, for example, data that differ within a range that can be regarded as identical in the processing by the estimation unit 15. Furthermore, when the environmental data is data that does not normally change, such as topography, the environmental data acquisition unit 13 may acquire environmental data from any point in time at which no change has occurred. The timing of the observations for the acquired environmental data is not limited to the above.
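The two correspondence rules described above (tolerance-based matching for fast-changing data such as wave height, and period containment for slowly varying data such as a seasonal tidal current) can be sketched as follows. The function names, tolerance, and dates are illustrative assumptions, not part of the embodiment.

```python
from datetime import datetime, timedelta

def observation_matches_capture(capture_time, obs_time, tolerance):
    """Treat an observation as corresponding to an image capture when the
    two timestamps differ by no more than the given tolerance (the range
    that can be regarded as identical in later processing)."""
    return abs(capture_time - obs_time) <= tolerance

def capture_within_stable_period(capture_time, period_start, period_end):
    """Treat slowly varying data as corresponding when the capture time
    falls inside the period over which the data is assumed constant."""
    return period_start <= capture_time <= period_end

capture = datetime(2023, 1, 15, 10, 30)
# Wave height observed 20 minutes before capture, with a 1-hour tolerance.
ok_wave = observation_matches_capture(
    capture, datetime(2023, 1, 15, 10, 10), timedelta(hours=1))
# Tidal current assumed constant over the winter season.
ok_tide = capture_within_stable_period(
    capture, datetime(2022, 12, 1), datetime(2023, 2, 28))
```

Both checks pass for the example timestamps above, so both observations would be treated as corresponding to the capture.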

The environmental data acquisition unit 13 may acquire information identifying ships present within the region to be analyzed. The environmental data acquisition unit 13 may acquire ship identification signals from the AIS (Automatic Identification System) in the region to be analyzed. The environmental data acquisition unit 13 acquires the AIS ship identification signals from a monitoring server that monitors ship navigation in the region to be analyzed. The information identifying ships present within the region to be analyzed is not limited to AIS ship identification signals.

The detection unit 14 detects the object to be analyzed that appears in a satellite image. The detection unit 14 detects the object to be analyzed appearing in the satellite image, for example, using an image recognition model. The detection unit 14 detects, for example, using the image recognition model, the area in the satellite image in which the object to be analyzed appears. The detection unit 14 may further detect, using the image recognition model, the type of the object to be analyzed in the satellite image.

The image recognition model is, for example, a learning model that takes a satellite image as input and estimates the objects appearing in the satellite image. The detection unit 14 may use the image recognition model to detect objects other than the object to be analyzed that appear in the satellite image. For example, when the object to be analyzed is a ship and marine life is present in the region to be analyzed, the detection unit 14 may use the image recognition model to detect the marine life appearing in the satellite image.

The image recognition model is generated, for example, by learning the relationship between satellite images and the objects appearing in them. The image recognition model is generated, for example, by learning the relationship between satellite images and the areas in which the object to be analyzed appears. The image recognition model may be generated by learning the relationship between satellite images and the names of the objects appearing in them. The image recognition model may also be generated by learning the relationship between satellite images, the areas in which objects appear, and the names of the objects. The image recognition model is generated, for example, by deep learning using a neural network. The training data and the learning algorithm used to generate the image recognition model are not limited to the above. The image recognition model is generated, for example, in a system external to the image analysis system 10. The image recognition model may also be generated in a generation unit (not shown) included in the image analysis system 10.

The detection unit 14 may detect the object to be analyzed based on brightness changes in the satellite image. For example, when the satellite image shows only an area of sea and there is no topographic variation, the detection unit 14 detects the contour of an object in the satellite image based on the brightness changes between pixels. The detection unit 14 then determines whether the object in the satellite image is the object to be analyzed based on at least one of the size of the area enclosed by the contour and the shape of the contour. When it determines that the object in the image is the object to be analyzed, the detection unit 14 detects, for example, that the object to be analyzed is present in the area enclosed by the contour. How the object to be analyzed is detected is not limited to the above.
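The brightness-based detection described above can be sketched in a minimal form: threshold the image, group bright pixels into connected regions, and keep only regions whose area falls within the size range expected for the target object. The threshold and area bounds are illustrative assumptions; a production system would also test contour shape.

```python
def detect_candidate_regions(image, threshold, min_area, max_area):
    """Threshold a grayscale image (list of lists of brightness values),
    flood-fill 4-connected bright regions, and keep regions whose pixel
    count falls within the expected size range of the target object."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for y in range(h):
        for x in range(w):
            if image[y][x] >= threshold and not seen[y][x]:
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and not seen[ny][nx] \
                                and image[ny][nx] >= threshold:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if min_area <= len(pixels) <= max_area:
                    regions.append(pixels)
    return regions

# A dark "sea" with one 2x3 bright blob as a candidate object.
img = [[0] * 6 for _ in range(5)]
for y in (1, 2):
    for x in (1, 2, 3):
        img[y][x] = 200
regions = detect_candidate_regions(img, threshold=128, min_area=4, max_area=20)
```

On this toy image the function returns a single region of six pixels; a blob smaller than `min_area` (e.g. isolated speckle noise) would be discarded.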

The estimation unit 15 estimates the accuracy of the detection based on the environmental data. The estimation unit 15 estimates, for example, based on the environmental data, an index indicating the likelihood that the object detected by the detection unit 14 is the object to be analyzed, as the accuracy of the detection. The estimation unit 15 estimates, for example, based on the environmental data, the accuracy with which the object to be analyzed is detected in the detection result of the image recognition model. That is, the estimation unit 15 estimates, based on the environmental data, the likelihood of the detection result of the image recognition model. The estimation unit 15 estimates, for example, the presence or absence of the object to be analyzed based on the environmental data, and then estimates the accuracy of the detection based on the estimated presence or absence of the object to be analyzed.

The estimation unit 15 may estimate the accuracy of the detection based on whether the object detected by the image recognition model is an object other than the object to be analyzed. The estimation unit 15 estimates, for example, candidates for what the detected object might be when the object detected by the image recognition model is not the object to be analyzed. The estimation unit 15 then estimates the accuracy of the detection based on, for example, the possibility that such a candidate object exists. For example, when the environmental data is favorable to the existence of the candidate object, the possibility that the candidate object exists is high. When the possibility that an object other than the object to be analyzed exists is high, the likelihood that the object detected by the image recognition model is the object to be analyzed is low. When the object to be analyzed is a ship, the estimation unit 15 estimates, for example, candidates for objects other than ships in the case where the object detected by the image recognition model is not a ship. When the object to be analyzed is a ship and the environmental data of the region to be analyzed suggests that marine life is present, the estimation unit 15 estimates, for example, marine life as a candidate for an object other than a ship. Then, when the environmental data indicates a high possibility that marine life is present, the estimation unit 15 estimates, for example, that the object detected as a ship by the image recognition model is unlikely to be a ship. That is, the estimation unit 15 estimates that the accuracy of the detection result in which the image recognition model detected a ship is low.

The estimation unit 15 estimates the accuracy of the detection, for example, based on the detection result of the image recognition model and on the presence or absence of the object to be analyzed in the region to be analyzed as estimated using the environmental data. The presence or absence of the object to be analyzed in the region to be analyzed as estimated using the environmental data is, for example, the possibility, estimated based on the environmental data, that the object to be analyzed exists in the region to be analyzed. The estimation unit 15 estimates the presence or absence of the object to be analyzed in the region to be analyzed, for example, based on a criterion defining the relationship between the object to be analyzed and the environmental data. Then, when the image recognition model detects the object to be analyzed, the estimation unit 15 estimates the accuracy of the detection result based on the estimated presence or absence of the object to be analyzed. The criterion defining the relationship between the object to be analyzed and the environmental data is a criterion for estimating the presence or absence of the object to be analyzed based on the environmental data. The criterion defining the relationship between the object to be analyzed and the environmental data is a criterion for judging in what kind of environment the object to be analyzed is likely to exist. The criterion defining the relationship between the object to be analyzed and the environmental data may also be a criterion for judging in what kind of environment the object to be analyzed is likely not to exist.

When the object to be analyzed is a ship, the estimation unit 15 estimates the accuracy of the detection based on the result of the image recognition model detecting a ship and on the possibility, estimated using the environmental data, that a ship exists in the region to be analyzed. For example, when the image recognition model has detected a ship, which is the object to be analyzed, and the topography data included in the environmental data indicates topography unsuitable for anchoring, the estimation unit 15 estimates that the detected object is unlikely to be a ship because the topography is unsuitable for a ship to anchor. Also, for example, when the image recognition model has detected a large ship, which is the object to be analyzed, and the water depth data included in the environmental data indicates a depth unsuitable for navigation and anchoring, the estimation unit 15 estimates that the detected object is unlikely to be a ship because the water depth is unsuitable for a ship to navigate or anchor. Also, for example, when the image recognition model has detected multiple ships close to each other, which are the objects to be analyzed, and the ocean current and wave data included in the environmental data indicate values unsuitable for ships to approach each other, the estimation unit 15 estimates that the detected objects are unlikely to be ships. The criteria defining the relationship between the object to be analyzed and the environmental data are not limited to the above. The criteria defining the relationship between the object to be analyzed and the environmental data are set, for example, by the person in charge of the analysis or the operator of the image analysis system 10.

The estimation unit 15 estimates the accuracy of the detection, for example, based on a score indicating the possibility, estimated using the environmental data, that the object exists. The estimation unit 15 estimates, for example, based on the environmental data, a score indicating the possibility that the object to be analyzed exists, and then estimates the accuracy of the detection based on the estimated score. The score indicating the possibility that an object exists is, for example, an index indicating how suitable the environment is for the object to exist. For example, a score indicating the possibility that a ship exists is an index indicating how suitable the environment is for a ship to exist. An environment in which a ship exists is, for example, an environment whose water depth, topography, and waves are suitable for at least one of navigation and anchoring of a ship. The estimation unit 15 calculates a score for each item of the environmental data, and then estimates the score indicating the possibility that the object exists, for example, by summing the scores for the individual items of the environmental data. The relationship between environmental data values and scores is set, for example, as a table for each item of the environmental data. The relationship between environmental data values and scores may also be set using a function with the environmental data value as the explanatory variable and the score as the objective variable. The relationship between environmental data values and scores is set, for example, by the person in charge of the analysis or the operator of the image analysis system 10.
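The per-item scoring and summation above can be sketched as follows. The concrete thresholds and score values are hypothetical; the embodiment leaves them to the person in charge of the analysis or the operator.

```python
# Hypothetical per-item score tables: each maps an environmental data
# value to a suitability score for the presence of a ship.
def depth_score(depth_m):
    return 1.0 if depth_m >= 10 else 0.2   # deep enough to navigate/anchor

def wave_score(wave_height_m):
    return 1.0 if wave_height_m <= 2.0 else 0.3   # calm enough

def terrain_score(terrain):
    return 1.0 if terrain == "harbor" else 0.5    # suitable for anchoring

def presence_score(env):
    """Sum the per-item scores to obtain the overall score indicating
    how suitable the environment is for the object to exist."""
    return (depth_score(env["depth_m"])
            + wave_score(env["wave_height_m"])
            + terrain_score(env["terrain"]))

score = presence_score({"depth_m": 25, "wave_height_m": 1.2, "terrain": "harbor"})
```

With these toy tables, a deep, calm harbor yields the maximum score of 3.0, while shallow or rough-water environments score lower, lowering the estimated detection accuracy in turn.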

The estimation unit 15 may estimate, based on the environmental data, a score indicating the possibility that an object other than the object to be analyzed exists. An object other than the object to be analyzed is, for example, an object that the image recognition model may misrecognize as the object to be analyzed. When a score indicating the possibility that an object other than the object to be analyzed exists has been calculated, the estimation unit 15 estimates the accuracy of the detection, for example, based on that score. The estimation unit 15 estimates, for example, a score indicating the possibility of existence for each object, other than the object to be analyzed, that may exist in the region to be analyzed. That an object may exist in the region to be analyzed means, for example, that the object has existed in the region to be analyzed in the past. It may also mean that the object frequently exists in regions whose topography, environment, or both are the same as or similar to those of the region to be analyzed. For example, when the object to be analyzed is a ship and marine life may be present, the estimation unit 15 estimates a score indicating the possibility that marine life exists. For example, when the score for marine life is higher than the score for a ship, the estimation unit 15 estimates that the object detected as a ship is likely to be marine life. That is, when the score for marine life is higher than the score for a ship, the estimation unit 15 estimates that the object detected as a ship is unlikely to be a ship.

The estimation unit 15 may estimate the accuracy of the detection based on the probability, in the detection result of the image recognition model, that the object is the object to be analyzed, and on the score indicating the possibility, estimated from the environmental data, that the object exists. For example, letting Pm be the probability, in the estimation result of the image recognition model, that the object is the object to be analyzed, S the score indicating the possibility that the object exists as estimated from the environmental data, and Pd the accuracy of the detection, the estimation unit 15 estimates the accuracy of the detection using the formula Pd = Pm × S. The formula for estimating the accuracy of the detection is not limited to the above. Furthermore, how the accuracy of the detection is estimated is not limited to the above.
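The example formula Pd = Pm × S can be written directly in code. Here it is assumed that the environment score S has been normalized to the range 0 to 1 so that the product remains a probability-like value; the document itself does not fix the scale of S.

```python
def detection_confidence(model_probability, environment_score):
    """Pd = Pm * S: combine the image recognition model's probability Pm
    that the object is the analysis target with the environment-based
    presence score S (assumed normalized to [0, 1])."""
    return model_probability * environment_score

# A confident model detection (Pm = 0.9) in a mediocre environment (S = 0.5)
# yields a moderate overall detection accuracy.
pd = detection_confidence(0.9, 0.5)
```

The multiplicative combination means either a weak model detection or an unfavorable environment is enough to pull the overall accuracy down, matching the behavior described in the surrounding paragraphs.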

The estimation unit 15 estimates the accuracy of the detection, for example, based on environmental factors, estimated using the environmental data, that affect the likelihood of the detection result of the image recognition model. The estimation unit 15 may also estimate the accuracy of the detection based on environmental factors, estimated using the environmental data, that affect the existence of an object that the image recognition model may misrecognize as the object to be analyzed. The environmental factors affecting the likelihood of the detection result of the image recognition model that detects the object to be analyzed are, for example, the items of the environmental data that strongly affect the presence or absence of the object. The estimation unit 15 estimates, for example, that the item with the highest score is the environmental factor affecting the accuracy of the detection result. The estimation unit 15 may estimate, for example, that the items whose scores satisfy a criterion are the environmental factors affecting the accuracy of the detection result. The estimation unit 15 may also estimate that the items ranked from the top down to a predetermined rank by score are the environmental factors affecting the accuracy of the detection result.
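The three selection strategies just listed (highest-scoring item, items meeting a criterion, or items down to a predetermined rank) can be sketched in one small helper. The function name and the example scores are illustrative assumptions.

```python
def influential_factors(item_scores, top_k=None, min_score=None):
    """Select the environmental data items treated as the influential
    factors: items whose score meets a criterion (min_score), the top-k
    items by score, or, by default, the single highest-scoring item."""
    ranked = sorted(item_scores.items(), key=lambda kv: kv[1], reverse=True)
    if min_score is not None:
        return [name for name, s in ranked if s >= min_score]
    if top_k is not None:
        return [name for name, _ in ranked[:top_k]]
    return [ranked[0][0]]

# Hypothetical per-item scores for one detection.
scores = {"depth": 0.9, "waves": 0.4, "terrain": 0.7}
```

For these scores, the default call selects `depth`, `top_k=2` selects `depth` and `terrain`, and `min_score=0.5` selects the same two items.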

The estimation unit 15 may also estimate, based on the environmental data, the environmental factors affecting the likelihood of the detection result of the image recognition model that detects the object to be analyzed, as the reason for the estimated accuracy of the detection. The estimation unit 15 estimates, for example, the environmental factors affecting the possibility, estimated using the environmental data, that the object to be analyzed exists, as the reason for the estimated accuracy of the detection. The estimation unit 15 also estimates, for example, the environmental factors affecting the possibility, estimated using the environmental data, that an object other than the object to be analyzed exists, as the reason for the estimated accuracy of the detection. When the object to be analyzed is a ship, the estimation unit 15 estimates the reason for the estimated accuracy of the detection, for example, based on the environmental factors affecting the possibility that a ship exists or the environmental factors affecting the possibility that an object other than a ship exists.

 推定部15は、環境データから存在する物体を推定する推定モデルを用いて推定される分析対象の地域に存在する物体を基に、検出の確度を推定してもよい。推定モデルは、例えば、環境データと、存在する物体の関係を学習した学習モデルである。推定モデルは、例えば、ニューラルネットワークを用いた深層学習によって生成される。 The estimation unit 15 may estimate the accuracy of detection based on objects present in the area to be analyzed that are estimated using an estimation model that estimates the objects present from environmental data. The estimation model is, for example, a learning model that learns the relationship between environmental data and existing objects. The estimation model is, for example, generated by deep learning using a neural network.

 推定部15は、推定理由を特定可能な推定モデルを用いて推定される分析対象の地域に存在する物体を基に、検出の確度を推定してもよい。推定理由を出力可能な推定モデルは、例えば、因子化漸近ベイズ推論を基にした学習アルゴリズムを用いて生成されてもよい。因子化漸近ベイズ推論を基にした学習アルゴリズムを用いて学習を行う際に、学習器は、環境データの項目それぞれのデータを入力データ、存在する物体を正解データとして決定木形式のルールによって場合分けする。そして、学習器は、各場合で異なる説明変数を組み合わせた線形モデルを用いて実現度を予測する学習モデルを生成する。学習器は、データの場合分け条件の最適化、説明変数の組み合わせの最適化による予測モデルの生成、および不要な予測モデルの削除の処理を順に行うことで学習モデルを生成する。また、推定モデルは、環境データの項目それぞれの変動量に対する推定結果の変化を基に、推定理由を特定する学習モデルであってもよい。推定モデルを生成する学習アルゴリズムは、上記に限られない。また、推定モデルは、例えば、画像分析システム10の外部のシステムにおいて生成される。推定モデルは、画像分析システム10が備える図示しない生成部において生成されてもよい。 The estimation unit 15 may estimate the accuracy of detection based on an object present in the area to be analyzed, which is estimated using an estimation model capable of identifying the reason for estimation. The estimation model capable of outputting the reason for estimation may be generated using, for example, a learning algorithm based on factorized asymptotic Bayesian inference. When learning is performed using a learning algorithm based on factorized asymptotic Bayesian inference, the learning device performs case classification using a decision tree-type rule with the data of each item of environmental data as input data and the existing object as correct answer data. The learning device then generates a learning model that predicts the degree of realization using a linear model that combines different explanatory variables in each case. The learning device generates the learning model by sequentially performing the processes of optimizing the case classification conditions of the data, generating a prediction model by optimizing the combination of explanatory variables, and deleting unnecessary prediction models. The estimation model may also be a learning model that identifies the reason for estimation based on the change in the estimation result relative to the amount of variation of each item of environmental data. The learning algorithm that generates the estimation model is not limited to the above. 
The estimation model is also generated, for example, in a system external to the image analysis system 10. The estimation model may be generated in a generation unit (not shown) included in the image analysis system 10.

 推定部15は、環境データを用いて推定される分析対象の物体が存在しない領域を基に、検出の確度を推定してもよい。分析対象が船舶である場合に、推定部15は、例えば、環境データを基に、船舶が存在しない領域を推定する。分析対象の物体が船舶である場合に、推定部15は、船舶が存在しないと推定される領域において船舶が検出されたとき、検出の確度が低いと推定する。推定部15は、環境データを基に、領域ごとに分析対象の物体が存在する可能性を示すスコアを推定する。そして、推定部15は、例えば、スコアが基準未満の領域において船舶が検出されたとき、検出の確度が低いと推定する。 The estimation unit 15 may estimate the accuracy of detection based on areas where the object to be analyzed is not present, which is estimated using environmental data. When the object to be analyzed is a ship, the estimation unit 15 estimates areas where the ship is not present, for example, based on environmental data. When the object to be analyzed is a ship, the estimation unit 15 estimates that the accuracy of detection is low when a ship is detected in an area where it is estimated that no ship is present. The estimation unit 15 estimates a score indicating the possibility that the object to be analyzed is present for each area, based on environmental data. Then, the estimation unit 15 estimates that the accuracy of detection is low, for example, when a ship is detected in an area where the score is below a standard.
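A minimal sketch of the region-based rule above, assuming a per-region presence score in the range 0 to 1 and an illustrative criterion of 0.5:

```python
def detection_accuracy(presence_score, criterion=0.5):
    """Flag a detection as low-accuracy when the object (e.g. a ship) was
    detected in a region whose environmental presence score is below the
    criterion; otherwise treat the detection as high-accuracy."""
    return "low" if presence_score < criterion else "high"
```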

 推定部15は、環境データを用いて推定される分析対象の物体以外の所定の物体が存在しない領域を基に、検出の確度を推定してもよい。所定の物体は、例えば、画像認識モデルが分析対象の物体と誤認識し得ることが想定される物体である。推定部15は、環境データを基に、領域ごとに所定の物体が存在する可能性を示すスコアを推定する。そして、推定部15は、例えば、スコアが基準未満の領域には、所定の物体が存在しないと推定する。推定部15は、例えば、所定の物体が存在しない領域において分析対象の物体が検出されたとき、検出の確度が高いと推定する。推定部15は、環境データを基に、分析対象の物体以外の所定の物体が存在する可能性が高い領域を推定してもよい。推定部15は、例えば、所定の物体が存在する可能性が高い領域において分析対象の物体が検出されたとき、検出の確度が低いと推定する。推定部15は、環境データを基に、領域ごとに所定の物体が存在する可能性を示すスコアを推定する。そして、推定部15は、例えば、スコアが基準以上の領域は、所定の物体が存在する可能性が高い領域であると推定する。所定の物体は、例えば、分析の担当者によって設定される。 The estimation unit 15 may estimate the accuracy of detection based on an area where a specific object other than the object to be analyzed, which is estimated using the environmental data, does not exist. The specific object is, for example, an object that is expected to be erroneously recognized as the object to be analyzed by the image recognition model. The estimation unit 15 estimates a score indicating the possibility that a specific object exists for each area based on the environmental data. Then, the estimation unit 15 estimates that a specific object does not exist in an area where the score is below a standard, for example. The estimation unit 15 estimates that the accuracy of detection is high, for example, when the object to be analyzed is detected in an area where a specific object does not exist. The estimation unit 15 may estimate an area where a specific object other than the object to be analyzed is likely to exist based on the environmental data. For example, when the object to be analyzed is detected in an area where a specific object is likely to exist, the estimation unit 15 estimates that the accuracy of detection is low. The estimation unit 15 estimates a score indicating the possibility that a specific object exists for each area based on the environmental data. Then, the estimation unit 15 estimates that an area where a score is equal to or higher than a standard is an area where a specific object is likely to exist. 
The specified object is set, for example, by the person in charge of the analysis.

 推定部15は、船舶の識別信号をさらに用いて、検出の確度を推定してもよい。推定部15は、例えば、AISの識別信号を基に、検出の確度を推定する。推定部15は、例えば、検出結果が示す船舶の種類と、AISの識別信号が示す船舶の種類が一致したときに、検出の確度が高いと推定する。推定部15は、例えば、検出結果と、AISの識別信号が示す情報において一致する項目が多いほど、検出の確度が高いと推定してもよい。例えば、推定部15は、検出結果と、AISの識別信号が示す情報において、船舶の種類と、針路が一致した場合に、船舶の種類だけが一致した場合よりも、検出の確度が高いと推定する。 The estimation unit 15 may further use the ship's identification signal to estimate the accuracy of detection. The estimation unit 15 estimates the accuracy of detection based on, for example, the AIS identification signal. The estimation unit 15 estimates that the accuracy of detection is high, for example, when the type of ship indicated by the detection result matches the type of ship indicated by the AIS identification signal. The estimation unit 15 may estimate that the accuracy of detection is high, for example, the more items that match between the detection result and the information indicated by the AIS identification signal. For example, the estimation unit 15 estimates that the accuracy of detection is higher when the type of ship and course match in the detection result and the information indicated by the AIS identification signal than when only the type of ship matches.
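One way to realize the AIS cross-check described above is to count the fields that agree between the detection result and the identification signal, so that more matching items yield a higher accuracy estimate. The field names below are assumptions made for illustration:

```python
def ais_match_confidence(detection, ais_record, fields=("ship_type", "course")):
    """Return the fraction of compared fields (e.g. ship type, course) that
    match between the detection result and the AIS identification signal;
    a larger fraction corresponds to higher estimated detection accuracy."""
    matches = sum(1 for f in fields if detection.get(f) == ais_record.get(f))
    return matches / len(fields)
```

With this sketch, a detection whose ship type and course both agree with the AIS record scores higher than one where only the ship type agrees, as in the example in the text.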

 出力部16は、検出の結果と、検出の確度を示す情報とを出力する。出力部16は、例えば、端末装置20に、検出の結果と、検出の確度を示す情報とを出力する。出力部16は、例えば、衛星画像と、当該衛星画像における分析対象の物体の検出の結果と、検出の確度を示す情報とを出力する。出力部16は、例えば、衛星画像上において分析対象の物体が検出された領域を図形で囲むことによって物体が検出された領域を示す検出結果を出力する。分析対象の物体が検出された領域を囲う図形は、例えば、矩形である。分析対象の物体が検出された領域を囲う図形は、例えば、矩形に限られない。出力部16は、例えば、衛星画像上において分析対象の物体が検出された領域を囲む線を、検出の確度の段階に応じて変化させた衛星画像を出力する。出力部16は、例えば、衛星画像上において分析対象の物体が検出された領域を囲む線の形状、線の太さおよび線の色の少なくとも一つを、検出の確度の段階に応じて変化させた衛星画像を出力する。出力部16は、衛星画像上において分析対象の物体が検出された領域を囲む図形の形状を、検出の確度の段階に応じて変化させた衛星画像を出力してもよい。出力部16は、検出の確度を示す数値を衛星画像に重畳して出力してもよい。また、出力部16は、検出の確度の段階を示す数値、文字または記号を衛星画像に重畳して出力してもよい。検出の確度を出力する形態は、上記に限られない。 The output unit 16 outputs the detection result and information indicating the accuracy of the detection. The output unit 16 outputs the detection result and information indicating the accuracy of the detection to, for example, the terminal device 20. The output unit 16 outputs, for example, a satellite image, the detection result of the object to be analyzed in the satellite image, and information indicating the accuracy of the detection. The output unit 16 outputs the detection result indicating the area in which the object to be analyzed is detected by surrounding the area in which the object to be analyzed is detected on the satellite image with a figure. The figure surrounding the area in which the object to be analyzed is detected is, for example, a rectangle. The figure surrounding the area in which the object to be analyzed is detected is not limited to, for example, a rectangle. The output unit 16 outputs, for example, a satellite image in which the line surrounding the area in which the object to be analyzed is detected on the satellite image is changed according to the level of the accuracy of the detection. 
The output unit 16 outputs, for example, a satellite image in which at least one of the shape, thickness, and color of the line surrounding the area in which the object to be analyzed is detected on the satellite image is changed according to the level of the accuracy of the detection. The output unit 16 may output a satellite image in which the shape of a figure surrounding an area in which the object to be analyzed is detected on the satellite image is changed according to the level of detection accuracy. The output unit 16 may output a numerical value indicating the level of detection accuracy by superimposing it on the satellite image. The output unit 16 may also output a numerical value, letter, or symbol indicating the level of detection accuracy by superimposing it on the satellite image. The form in which the detection accuracy is output is not limited to the above.
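Following the convention used in the Fig. 7 example (a solid box for detections at or above the accuracy criterion, a dashed box below it), one illustrative mapping from accuracy level to the style of the surrounding figure is:

```python
def box_style(accuracy_level):
    # Map the detection-accuracy level to the line style and color of the
    # figure drawn around the detected region. The specific styles and
    # colors here are illustrative, not taken from the specification.
    styles = {"high": ("solid", "green"), "low": ("dashed", "red")}
    return styles.get(accuracy_level, ("dotted", "gray"))
```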

 出力部16は、推定部15が環境データを基に推定する分析対象の物体以外の物体の候補を出力してもよい。出力部16は、推定部15が環境データを基に推定する分析対象の物体以外の物体の候補の情報を、画像認識モデルが物体を検出した領域に関連付けて出力する。出力部16は、推定部15が環境データを基に推定する分析対象の物体以外の物体の候補の名称を、画像認識モデルが物体を検出した領域に関連付けて出力する。 The output unit 16 may output candidates for objects other than the object to be analyzed that are estimated by the estimation unit 15 based on the environmental data. The output unit 16 outputs information on candidates for objects other than the object to be analyzed that are estimated by the estimation unit 15 based on the environmental data, in association with an area in which the image recognition model detects the object. The output unit 16 outputs names of candidates for objects other than the object to be analyzed that are estimated by the estimation unit 15 based on the environmental data, in association with an area in which the image recognition model detects the object.

 出力部16は、検出の確度の推定理由をさらに出力してもよい。出力部16は、例えば、推定部15が環境データを基に推定する検出の確度の推定理由を出力する。出力部16は、例えば、検出した物体が分析対象の物体であると推定する理由を出力する。出力部16は、例えば、検出した物体が分析対象以外の物体であると推定する推定理由を出力してもよい。 The output unit 16 may further output the reason for estimating the accuracy of detection. For example, the output unit 16 outputs the reason for estimating the accuracy of detection that the estimation unit 15 estimates based on the environmental data. For example, the output unit 16 outputs the reason for estimating that the detected object is the object to be analyzed. For example, the output unit 16 may output the reason for estimating that the detected object is an object other than the object to be analyzed.

 出力部16は、検出された物体と同種別の物体が特定されている衛星画像を参照画像として出力してもよい。例えば、検出された物体が大型船である場合に、出力部16は、過去の分析において大型船が写っていると特定されている衛星画像を参照画像として出力してもよい。出力部16は、分析対象の地域に存在し得る分析対象以外の物体に関する衛星画像を参照画像として出力してもよい。例えば、分析対象が船舶である場合に、出力部16は、船舶以外の物体の衛星画像を出力してもよい。 The output unit 16 may output, as a reference image, a satellite image in which an object of the same type as the detected object is identified. For example, if the detected object is a large ship, the output unit 16 may output, as a reference image, a satellite image in which a large ship has been identified in a past analysis as showing the large ship. The output unit 16 may output, as a reference image, a satellite image of an object other than the analysis target that may be present in the area being analyzed. For example, if the analysis target is a ship, the output unit 16 may output a satellite image of an object other than a ship.

 出力部16は、分析対象の衛星画像と異なる撮像方式の衛星画像を出力してもよい。例えば、分析対象の衛星画像がSARによって撮像された画像である場合に、出力部16は、SARによる画像を撮像した範囲と同じ範囲が含まれる可視光領域の光学画像をさらに出力してもよい。 The output unit 16 may output a satellite image captured using a method different from that of the satellite image being analyzed. For example, if the satellite image being analyzed is an image captured by a SAR, the output unit 16 may further output an optical image in the visible light region that includes the same range as the range in which the image captured by the SAR was captured.

 出力部16は、例えば、分析の担当者によって選択された領域を拡大した画像を出力してもよい。出力部16は、例えば、表示画面上において分析対象の物体が検出された領域が分析の担当者の操作によって選択されたことを検出した場合に、選択された領域を拡大した画像を出力する。また、出力部16は、分析の担当者によって選択された領域について、他の領域よりも解像度が高い画像を出力してもよい。また、出力部16は、分析の担当者によって選択された領域について、過去の分析において分析対象の物体が写っていると特定されている衛星画像を参照画像として出力してもよい。また、出力部16は、分析の担当者によって選択された領域について、分析対象の物体以外の物体の候補が写っていると過去の分析において特定されている衛星画像を参照画像として出力してもよい。 The output unit 16 may, for example, output an image of an enlarged area selected by the person in charge of analysis. When the output unit 16 detects, for example, that an area on the display screen in which the object to be analyzed is detected has been selected by the operation of the person in charge of analysis, it outputs an image of an enlarged area of the selected area. The output unit 16 may also output an image of the area selected by the person in charge of analysis that has a higher resolution than other areas. The output unit 16 may also output, as a reference image, a satellite image that has been identified in a past analysis as containing the object to be analyzed in the area selected by the person in charge of analysis. The output unit 16 may also output, as a reference image, a satellite image that has been identified in a past analysis as containing a candidate object other than the object to be analyzed in the area selected by the person in charge of analysis.

 図7は、検出の確度の推定理由を表示する表示画面の例を示す図である。図7の表示画面の例において、実線の矩形で囲われている領域は、例えば、検出の確度が基準以上の領域である。図7の表示画面の例において、破線の矩形で囲われている領域は、例えば、検出の確度が基準未満の領域である。また、図7の表示画面の例では、破線で囲われている領域に、検出された物体が「アザラシ」である可能性が表示されている。また、図7の表示画面の例では、分析対象の物体以外であると推定した理由が、「気温」と「潮流」であることが表示されている。図7の表示画面の例では、環境データのうち気温と潮流の観測データがアザラシの存在に適しているため、船舶である可能性が低く、アザラシである可能性が高いことを示している。 FIG. 7 is a diagram showing an example of a display screen that displays the reason for the estimated detection accuracy. In the example of the display screen in FIG. 7, the area surrounded by a solid-line rectangle is, for example, an area where the detection accuracy is equal to or higher than a standard. In the example of the display screen in FIG. 7, the area surrounded by a dashed-line rectangle is, for example, an area where the detection accuracy is lower than a standard. In the example of the display screen in FIG. 7, the area surrounded by the dashed line displays the possibility that the detected object is a "seal." In the example of the display screen in FIG. 7, the reason for the estimation that the object is other than the object being analyzed is displayed as "temperature" and "tides." In the example of the display screen in FIG. 7, the observation data for temperature and tides, which are part of the environmental data, are suitable for the presence of a seal, so it is shown that the possibility that it is a ship is low and the possibility that it is a seal is high.

 記憶部17は、例えば、衛星画像の分析に用いるデータを保存する。記憶部17は、例えば、衛星画像取得部12が取得する衛星画像を保存する。記憶部17は、例えば、環境データ取得部13が取得する環境データを保存する。記憶部17は、例えば、環境データと物体が存在する可能性を示すスコアとの関係を示すテーブルを保存する。記憶部17は、例えば、画像認識モデルを保存する。記憶部17は、例えば、推定モデルを保存する。画像認識モデルおよび推定モデルは、記憶部17以外の記憶手段に保存されていてもよい。記憶部17は、例えば、画像認識モデルの検出結果を保存する。記憶部17は、例えば、検出の確度の推定理由を保存してもよい。記憶部17は、過去の分析において分析対象の物体が写っていると特定されている衛星画像を参照画像として保存してもよい。記憶部17は、過去の分析において分析対象以外の物体が写っていると特定されている衛星画像を参照画像として保存してもよい。 The storage unit 17 stores, for example, data used in the analysis of satellite images. The storage unit 17 stores, for example, satellite images acquired by the satellite image acquisition unit 12. The storage unit 17 stores, for example, environmental data acquired by the environmental data acquisition unit 13. The storage unit 17 stores, for example, a table showing the relationship between environmental data and a score indicating the possibility of the presence of an object. The storage unit 17 stores, for example, an image recognition model. The storage unit 17 stores, for example, an estimation model. The image recognition model and the estimation model may be stored in a storage means other than the storage unit 17. The storage unit 17 stores, for example, the detection results of the image recognition model. The storage unit 17 may store, for example, an estimation reason for the accuracy of detection. The storage unit 17 may store, as a reference image, a satellite image identified in a past analysis as containing an object to be analyzed. The storage unit 17 may store, as a reference image, a satellite image identified in a past analysis as containing an object other than the object to be analyzed.

 端末装置20は、例えば、分析の担当者が衛星画像の分析に用いる端末装置である。端末装置20は、例えば、画像分析システム10の出力部16から、衛星画像と、当該衛星画像に写っている物体の検出結果と、検出の確度を取得する。そして、端末装置20は、例えば、図示しない表示装置に、衛星画像と、当該衛星画像に写っている物体の検出結果と、検出の確度を出力する。端末装置20は、例えば、画像分析システム10の出力部16から、検出の確度の推定理由を取得する。そして、端末装置20は、例えば、図示しない表示装置に、検出の確度の推定理由を出力する。 The terminal device 20 is, for example, a terminal device used by an analyst to analyze satellite images. The terminal device 20 acquires satellite images, detection results of objects captured in the satellite images, and the accuracy of detection, for example, from the output unit 16 of the image analysis system 10. The terminal device 20 then outputs the satellite images, detection results of objects captured in the satellite images, and the accuracy of detection, for example, to a display device not shown. The terminal device 20 acquires an estimated reason for the accuracy of detection, for example, from the output unit 16 of the image analysis system 10. The terminal device 20 then outputs an estimated reason for the accuracy of detection, for example, to a display device not shown.

 端末装置20は、例えば、利用者の操作によって入力される分析対象に関する情報を取得してもよい。端末装置20は、例えば、画像分析システム10の取得部11に、分析対象に関する情報を出力する。 The terminal device 20 may, for example, acquire information about the analysis target input by a user's operation. The terminal device 20 outputs the information about the analysis target to the acquisition unit 11 of the image analysis system 10, for example.

 端末装置20には、例えば、パーソナルコンピュータ、タブレット型コンピュータ、またはスマートフォンを用いることができる。端末装置20は、上記の例に限られない。 The terminal device 20 may be, for example, a personal computer, a tablet computer, or a smartphone. The terminal device 20 is not limited to the above examples.

 衛星画像管理サーバ30は、例えば、人工衛星に搭載された撮像装置が地表を撮像した画像を管理する。衛星画像管理サーバ30は、例えば、人工衛星と通信を行う地上局から、人工衛星に搭載された撮像装置が地表を撮像した衛星画像を取得する。衛星画像には、例えば、撮像日時と、撮像地点が関連付けられている。そして、衛星画像管理サーバ30は、取得した衛星画像と、撮像日時と、撮像地点を関連付けて保存する。衛星画像管理サーバ30は、衛星画像に付加されている撮像時の衛星の位置および撮像に関するパラメータから撮像地点を特定してもよい。 The satellite image management server 30 manages, for example, images of the earth's surface captured by an imaging device mounted on an artificial satellite. The satellite image management server 30 acquires satellite images of the earth's surface captured by an imaging device mounted on an artificial satellite, for example, from a ground station that communicates with the artificial satellite. The satellite images are associated with, for example, the image capture date and time and the image capture location. The satellite image management server 30 then associates and stores the acquired satellite images with the image capture date and time and the image capture location. The satellite image management server 30 may identify the image capture location from the position of the satellite at the time of image capture and imaging-related parameters added to the satellite image. 
The imaging-related parameters are, for example, the direction in which electromagnetic waves in the band used for image capture are transmitted, the transmission angle of the electromagnetic waves relative to the earth's surface, and the reception accuracy of the electromagnetic waves reflected from the earth's surface. For example, when the satellite image management server 30 receives a request for a satellite image from the image analysis system 10, it outputs the requested satellite image and the image capture date and time and capture location of the satellite image to the satellite image acquisition unit 12 of the image analysis system 10. There may also be multiple satellite image management servers 30. For example, satellite images may be stored in different servers for each management entity of the artificial satellite that captured the image. The number of satellite image management servers 30 may be set as appropriate.

 環境データ管理サーバ40は、例えば、環境データを管理する。環境データ管理サーバ40は、例えば、環境データを取得する。そして、環境データ管理サーバ40は、取得した環境データを観測日時および観測地点の情報と関連付けて保存する。環境データ管理サーバ40は、例えば、水深、水温、気温、潮流、波高および風速の観測データを保存する。また、環境データ管理サーバ40は、例えば、地形のデータを保存する。また、環境データ管理サーバ40は、例えば、画像分析システム10から環境データの要求を受けたときに、画像分析システム10の環境データ取得部13に、要求を受けた環境データと、当該環境データの観測日時および観測地点を出力する。また、環境データ管理サーバ40は、複数であってもよい。例えば、環境データは、観測主体ごとに異なるサーバに保存されていてもよい。環境データ管理サーバ40の数は、適宜、設定され得る。 The environmental data management server 40, for example, manages environmental data. The environmental data management server 40, for example, acquires environmental data. The environmental data management server 40 then stores the acquired environmental data in association with information on the observation date and time and the observation point. The environmental data management server 40 stores, for example, observation data on water depth, water temperature, air temperature, tides, wave height, and wind speed. The environmental data management server 40 also stores, for example, topographical data. When the environmental data management server 40 receives a request for environmental data from the image analysis system 10, for example, it outputs the requested environmental data and the observation date and time and observation point of the environmental data to the environmental data acquisition unit 13 of the image analysis system 10. There may also be multiple environmental data management servers 40. For example, the environmental data may be stored in a different server for each observer of the environmental data. The number of environmental data management servers 40 may be set as appropriate.

 画像分析システム10の動作の例について説明する。図8は、画像分析システム10が、衛星画像に写っている物体の検出および検出の確度を推定する際の動作フローの例を示す。 An example of the operation of the image analysis system 10 will be described. Figure 8 shows an example of the operation flow when the image analysis system 10 detects an object captured in a satellite image and estimates the accuracy of the detection.

 衛星画像取得部12は、分析対象の地域を撮像した衛星画像を取得する(ステップS11)。衛星画像取得部12は、例えば、衛星画像管理サーバ30から、分析対象の地域を撮像した衛星画像を取得する。 The satellite image acquisition unit 12 acquires satellite images of the area to be analyzed (step S11). The satellite image acquisition unit 12 acquires satellite images of the area to be analyzed, for example, from the satellite image management server 30.

 また、環境データ取得部13は、分析対象の地域の環境データを取得する(ステップS12)。環境データ取得部13は、例えば、環境データ管理サーバ40から、分析対象の地域の環境データを取得する。 The environmental data acquisition unit 13 also acquires environmental data for the area to be analyzed (step S12). The environmental data acquisition unit 13 acquires the environmental data for the area to be analyzed from, for example, the environmental data management server 40.

 分析対象の地域の衛星画像と環境データが取得されると、検出部14は、衛星画像に写っている分析対象の物体を検出する(ステップS13)。検出部14は、例えば、画像認識モデルを用いて、衛星画像に写っている分析対象の物体を検出する。 When the satellite image and environmental data of the area to be analyzed are acquired, the detection unit 14 detects the object to be analyzed that appears in the satellite image (step S13). The detection unit 14 detects the object to be analyzed that appears in the satellite image, for example, using an image recognition model.

 衛星画像に写っている分析対象の物体が検出された場合に(ステップS14でYes)、推定部15は、環境データを基に、分析対象の物体の検出の確度を推定する(ステップS15)。推定部15は、例えば、環境データを基に、画像認識モデルが分析対象の物体を検出した検出結果の確からしさを検出の確度として推定する。 If the object to be analyzed is detected in the satellite image (Yes in step S14), the estimation unit 15 estimates the accuracy of detection of the object to be analyzed based on the environmental data (step S15). The estimation unit 15 estimates, for example, based on the environmental data, the likelihood of the detection result of the image recognition model detecting the object to be analyzed as the detection accuracy.

 分析対象の物体の検出の確度が推定されると、出力部16は、分析対象の物体の検出の結果と、検出の確度を出力する(ステップS16)。出力部16は、例えば、端末装置20に、分析対象の物体の検出の結果と、検出の確度を出力する。 Once the accuracy of detection of the object to be analyzed is estimated, the output unit 16 outputs the result of detection of the object to be analyzed and the accuracy of detection (step S16). The output unit 16 outputs the result of detection of the object to be analyzed and the accuracy of detection to, for example, the terminal device 20.

 分析対象の物体の検出の結果と、検出の確度を出力した際に、分析対象の物体を検出する処理が行われていない画像がある場合に(ステップS17でNo)、ステップS13に戻り、検出部14は、分析対象の物体の検出が行われていない衛星画像に写っている分析対象の物体を検出する。 When the detection result of the object to be analyzed and the detection accuracy are output, if there is an image in which the process to detect the object to be analyzed has not been performed (No in step S17), the process returns to step S13, and the detection unit 14 detects the object to be analyzed that is captured in a satellite image in which the detection of the object to be analyzed has not been performed.

 ステップS16において、分析対象の物体の検出の結果と、検出の確度を出力した際に、すべての衛星画像について分析対象の物体を検出する処理が行われている場合に(ステップS17でYes)、画像分析システム10は、衛星画像に写っている物体の検出および検出の確度の推定に関する処理を終了する。 In step S16, when the detection result of the object to be analyzed and the detection accuracy are output, if the process of detecting the object to be analyzed has been performed for all satellite images (Yes in step S17), the image analysis system 10 ends the process of detecting the object in the satellite image and estimating the detection accuracy.

 ステップS14において、衛星画像に写っている分析対象の物体が検出されなかった場合に(ステップS14でNo)、分析対象の物体を検出する処理が行われていない衛星画像があるとき(ステップS17でNo)、ステップS13に戻り、検出部14は、分析が行われていない他の衛星画像に写っている分析対象の物体を検出する。 If the object to be analyzed is not detected in the satellite image in step S14 (No in step S14), or there is a satellite image for which processing to detect the object to be analyzed has not been performed (No in step S17), the process returns to step S13, and the detection unit 14 detects the object to be analyzed in other satellite images for which analysis has not been performed.

 ステップS14において、衛星画像に写っている分析対象の物体が検出されなかった場合に(ステップS14でNo)、すべての衛星画像について分析対象の物体を検出する処理が行われているとき(ステップS17でYes)、画像分析システム10は、衛星画像に写っている物体の検出および検出の確度の推定に関する処理を終了する。 If the object to be analyzed is not detected in the satellite image in step S14 (No in step S14), and the process of detecting the object to be analyzed has been performed for all satellite images (Yes in step S17), the image analysis system 10 ends the process of detecting the object in the satellite image and estimating the accuracy of detection.
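The detection loop of Fig. 8 (steps S13 to S17) can be sketched as below; `detect`, `estimate_accuracy`, and `output` stand in for the detection unit 14, the estimation unit 15, and the output unit 16, and their signatures are assumptions for illustration:

```python
def analyze_images(images, detect, estimate_accuracy, output):
    """Sketch of the Fig. 8 flow: detect the target object in each satellite
    image; when something is detected, estimate the detection accuracy from
    environmental data and output both; otherwise move to the next image."""
    results = []
    for image in images:
        detections = detect(image)                           # step S13
        if detections:                                       # step S14
            accuracy = estimate_accuracy(image, detections)  # step S15
            output(detections, accuracy)                     # step S16
            results.append((detections, accuracy))
    return results  # loop ends once every image has been processed (step S17)
```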

 画像分析システム10は、分析対象の地域を撮像した衛星画像から、分析対象の物体を検出する。画像分析システム10は、分析対象の地域の環境データを基に、分析対象の物体の検出の確度を推定する。そして、画像分析システム10は、分析対象の物体の検出の結果と、検出の確度を示す情報を出力する。このように、分析対象の物体の検出の結果と、検出の確度を示す情報を出力することで、衛星画像の分析の担当者は、検出の確度として出力される検出の確からしさを示す情報を参照しながら、衛星画像に写っている物体が分析対象の物体であるかを確認することができる。例えば、環境データから海洋生物が存在する可能性が高い場合に、分析の担当者は、海洋生物が存在する可能性を考慮して衛星画像に写っている物体が分析対象の物体であるかを確認することができる。また、地形および水流から船舶が存在しない可能性が高い場合には、分析の担当者は、船舶であるときには通常とは異なる船舶であることを考慮して衛星画像の分析を行うことができる。このように、分析対象の物体の検出の結果と、検出の確度を示す情報を用いて、衛星画像の分析を行うことで分析対象の地域に存在する物体の特定が容易になる。このため、画像分析システム10を用いることで、分析対象の地域に存在する物体を容易に分析することができる。 The image analysis system 10 detects an object to be analyzed from a satellite image of the area to be analyzed. The image analysis system 10 estimates the accuracy of detection of the object to be analyzed based on the environmental data of the area to be analyzed. The image analysis system 10 then outputs the result of detection of the object to be analyzed and information indicating the accuracy of detection. In this way, by outputting the result of detection of the object to be analyzed and information indicating the accuracy of detection, the person in charge of analyzing the satellite image can confirm whether the object shown in the satellite image is the object to be analyzed while referring to the information indicating the accuracy of detection output as the accuracy of detection. For example, if the environmental data indicates that there is a high possibility of marine life being present, the person in charge of analysis can confirm whether the object shown in the satellite image is the object to be analyzed while taking into account the possibility of marine life being present. Also, if there is a high possibility that there is no ship based on the topography and water currents, the person in charge of analysis can analyze the satellite image while taking into account that if it is a ship, it is an unusual ship. 
In this way, by analyzing the satellite image using the result of detection of the object to be analyzed and information indicating the accuracy of detection, it becomes easier to identify objects present in the area to be analyzed. Therefore, by using the image analysis system 10, it is possible to easily analyze objects present in the area being analyzed.

 また、検出の確度の推定理由を出力する場合に、分析の担当者は、例えば、分析対象の物体である可能性の理由または分析対象の物体以外である可能性の理由を考慮して、画像に写っている物体が分析対象の物体であるかを特定することができる。このため、検出の確度の推定理由を出力する場合には、分析対象の地域に存在する物体をより容易に分析することができる。 In addition, when outputting the estimated reason for the detection accuracy, the analyst can determine whether an object shown in the image is the object to be analyzed by considering, for example, the reasons why it may be the object to be analyzed or the reasons why it may be something other than the object to be analyzed. Therefore, when outputting the estimated reason for the detection accuracy, it is easier to analyze objects that exist in the area to be analyzed.

 画像分析システム10における各処理は、ネットワークを介して接続されている複数の情報処理装置において分散されて実行されてもよい。例えば、検出部14における処理と、推定部15における処理は、別の情報処理装置において行われてもよい。画像分析システム10における各処理を、複数の情報処理装置のうちいずれにおいて行うかは、適宜、設定され得る。 Each process in the image analysis system 10 may be distributed and executed in multiple information processing devices connected via a network. For example, the process in the detection unit 14 and the process in the estimation unit 15 may be executed in different information processing devices. It may be set as appropriate which of the multiple information processing devices each process in the image analysis system 10 is executed in.

 画像分析システム10における各処理は、コンピュータプログラムをコンピュータで実行することによって実現することができる。図9は、画像分析システム10における各処理を行うコンピュータプログラムを実行するコンピュータ100の構成の例を示したものである。コンピュータ100は、CPU(Central Processing Unit)101と、メモリ102と、記憶装置103と、入出力I/F(Interface)104と、通信I/F105を備える。 Each process in the image analysis system 10 can be realized by executing a computer program on a computer. Figure 9 shows an example of the configuration of a computer 100 that executes a computer program that performs each process in the image analysis system 10. The computer 100 comprises a CPU (Central Processing Unit) 101, memory 102, a storage device 103, an input/output I/F (Interface) 104, and a communication I/F 105.

 CPU101は、記憶装置103から各処理を行うコンピュータプログラムを読み出して実行する。CPU101は、複数のCPUの組み合わせによって構成されていてもよい。また、CPU101は、CPUと他の種類のプロセッサの組み合わせによって構成されていてもよい。例えば、CPU101は、CPUとGPU(Graphics Processing Unit)の組み合わせによって構成されていてもよい。メモリ102は、DRAM(Dynamic Random Access Memory)等によって構成され、CPU101が実行するコンピュータプログラムや処理中のデータが一時記憶される。記憶装置103は、CPU101が実行するコンピュータプログラムを記憶している。記憶装置103は、例えば、不揮発性の半導体記憶装置によって構成されている。記憶装置103には、ハードディスクドライブ等の他の記憶装置が用いられてもよい。入出力I/F104は、作業者からの入力の受付および表示データ等の出力を行うインタフェースである。通信I/F105は、端末装置20、衛星画像管理サーバ30、環境データ管理サーバ40、および他の情報処理装置との間でデータの送受信を行うインタフェースである。また、端末装置20、衛星画像管理サーバ30および環境データ管理サーバ40は、コンピュータ100と同様の構成であってもよい。 The CPU 101 reads out and executes computer programs for performing each process from the storage device 103. The CPU 101 may be configured by a combination of multiple CPUs. The CPU 101 may also be configured by a combination of a CPU and another type of processor. For example, the CPU 101 may be configured by a combination of a CPU and a GPU (Graphics Processing Unit). The memory 102 is configured by a DRAM (Dynamic Random Access Memory) or the like, and temporarily stores the computer programs executed by the CPU 101 and data being processed. The storage device 103 stores the computer programs executed by the CPU 101. The storage device 103 is configured by, for example, a non-volatile semiconductor storage device. Other storage devices such as a hard disk drive may be used for the storage device 103. The input/output I/F 104 is an interface that accepts input from an operator and outputs display data, etc. The communication I/F 105 is an interface that transmits and receives data between the terminal device 20, the satellite image management server 30, the environmental data management server 40, and other information processing devices. Furthermore, the terminal device 20, the satellite image management server 30, and the environmental data management server 40 may have the same configuration as the computer 100.

 各処理の実行に用いられるコンピュータプログラムは、データを非一時的に記録するコンピュータ読み取り可能な記録媒体に格納して頒布することもできる。記録媒体としては、例えば、データ記録用磁気テープや、ハードディスクなどの磁気ディスクを用いることができる。また、記録媒体としては、CD-ROM(Compact Disc Read Only Memory)等の光ディスクを用いることもできる。不揮発性の半導体記憶装置を記録媒体として用いてもよい。 The computer programs used to execute each process can also be distributed by storing them on a computer-readable recording medium that non-temporarily records data. As the recording medium, for example, a magnetic tape for recording data or a magnetic disk such as a hard disk can be used. Alternatively, an optical disk such as a CD-ROM (Compact Disc Read Only Memory) can also be used as the recording medium. A non-volatile semiconductor memory device can also be used as the recording medium.

 上記の実施形態の一部又は全部は、以下の付記のようにも記載されうるが、以下には限られない。 Some or all of the above embodiments can be described as follows, but are not limited to the following:

[付記1]
 分析対象の地域を撮像した衛星画像を取得する画像取得手段と、
 分析対象の地域の環境データを取得する環境データ取得手段と、
 衛星画像に写っている分析対象の物体を検出する検出手段と、
 前記環境データを基に、前記検出の確度を推定する推定手段と、
 前記検出の結果と、前記検出の確度を示す情報とを出力する出力手段と
 を備える画像分析システム。
[Appendix 1]
An image acquisition means for acquiring a satellite image of an area to be analyzed;
An environmental data acquisition means for acquiring environmental data of an area to be analyzed;
A detection means for detecting an object to be analyzed that is captured in a satellite image;
an estimation means for estimating a degree of accuracy of the detection based on the environmental data;
and an output means for outputting a result of the detection and information indicating a degree of accuracy of the detection.
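For orientation only, the five means enumerated in Appendix 1 can be sketched as a minimal pipeline. This is not the patented implementation; every function name, field, and discount factor below is a hypothetical placeholder:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    # Bounding box of a candidate object in image coordinates (hypothetical layout).
    x: int
    y: int
    w: int
    h: int
    label: str

def detect_objects(satellite_image):
    # Stand-in for the detection means: a real system would run a trained
    # image recognition model over the satellite image.
    return [Detection(10, 20, 5, 3, "ship")]

def estimate_accuracy(detection, environment):
    # Estimation means: start from full confidence and discount it for
    # environmental factors that make a false positive more likely.
    score = 1.0
    if environment.get("water_depth_m", 100.0) < 2.0:
        score *= 0.3  # too shallow for a vessel; likely a rock or sandbar
    if environment.get("sea_ice", False):
        score *= 0.5  # drifting ice can resemble a ship in radar imagery
    return score

def analyze(satellite_image, environment):
    # Output means: pair each detection with its estimated accuracy.
    return [(d, estimate_accuracy(d, environment))
            for d in detect_objects(satellite_image)]

results = analyze(satellite_image=None, environment={"water_depth_m": 1.5})
```

The stub only fixes the interfaces between the means; the image acquisition and environmental data acquisition means would populate the two arguments of `analyze` from external sources.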

[付記2]
 前記検出の確度は、前記検出手段によって検出された物体が、前記分析対象の物体であるかの確からしさを示す指標である、
 付記1に記載の画像分析システム。
[Appendix 2]
The detection accuracy is an index indicating the likelihood that the object detected by the detection means is the object to be analyzed.
The image analysis system according to Appendix 1.

[付記3]
 前記推定手段は、環境データを用いて推定される前記分析対象の物体を検出する画像認識モデルの検出結果の確からしさに影響を及ぼす環境要因に基づいて前記検出の確度を推定する、
 付記2に記載の画像分析システム。
[Appendix 3]
The estimation means estimates the accuracy of the detection based on environmental factors that affect the accuracy of a detection result of an image recognition model that detects the object to be analyzed, the detection result being estimated using environmental data.
The image analysis system according to Appendix 2.

[付記4]
 前記推定手段は、環境データを用いて推定される前記画像認識モデルが前記分析対象の物体であると誤認し得る物体の存在に影響を及ぼす環境要因に基づいて前記検出の確度を推定する、
 付記3に記載の画像分析システム。
[Appendix 4]
The estimation means estimates the accuracy of the detection based on environmental factors that affect the presence of an object that the image recognition model estimated using environmental data may mistakenly recognize as the object to be analyzed.
The image analysis system according to Appendix 3.

[付記5]
 前記推定手段は、前記画像認識モデルの検出結果と、前記環境データを用いて推定される分析対象の物体の存在の有無とを基に、前記検出の確度を推定する、
 付記3に記載の画像分析システム。
[Appendix 5]
The estimation means estimates a degree of accuracy of the detection based on a detection result of the image recognition model and the presence or absence of an object to be analyzed estimated using the environmental data.
The image analysis system according to Appendix 3.

[付記6]
 前記分析対象の物体は、船舶であり、
 前記推定手段は、前記画像認識モデルが検出した物体が、前記船舶以外であることを示す環境要因を基に、前記検出の確度を推定する、
 付記3から5いずれかに記載の画像分析システム。
[Appendix 6]
The object to be analyzed is a ship,
The estimation means estimates the accuracy of the detection based on an environmental factor indicating that the object detected by the image recognition model is other than the ship.
The image analysis system according to any one of Appendices 3 to 5.

[付記7]
 前記推定手段は、前記環境データを用いて推定される前記船舶が存在しない領域を基に前記検出の確度を推定する、
 付記6に記載の画像分析システム。
[Appendix 7]
The estimation means estimates the accuracy of the detection based on an area where the ship is not present, which is estimated using the environmental data.
The image analysis system according to Appendix 6.
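A minimal sketch of this idea, assuming the environmental data is a bathymetry grid and that a depth threshold marks regions where no vessel can be present. The threshold and discount factor are assumptions for illustration, not values from the disclosure:

```python
def build_exclusion_mask(depth_grid, min_depth_m=2.0):
    # Mark grid cells too shallow for any vessel as "no-ship" regions.
    return [[depth < min_depth_m for depth in row] for row in depth_grid]

def detection_accuracy(cell, mask, base_score=0.9):
    # A detection falling inside an excluded cell is very unlikely to be
    # a ship, so its estimated accuracy is sharply discounted.
    row, col = cell
    return base_score * (0.1 if mask[row][col] else 1.0)

# Toy 2x2 bathymetry grid in metres; the top-left cell is a shoal.
depth_grid = [[0.5, 30.0],
              [25.0, 40.0]]
mask = build_exclusion_mask(depth_grid)
```

The same masking scheme extends to other environmental layers named in Appendix 9 (land topography, known ice fields, and so on), each contributing its own exclusion regions.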

[付記8]
 前記出力手段は、前記推定手段が推定した前記船舶以外の物体に関する衛星画像を出力する、
 付記6または7に記載の画像分析システム。
[Appendix 8]
The output means outputs a satellite image relating to the object other than the ship estimated by the estimation means.
The image analysis system according to Appendix 6 or 7.

[付記9]
 前記環境データは、気温、水温、水流、水深および地形のうちの少なくとも1つに関するデータである、
 付記6から8いずれかに記載の画像分析システム。
[Appendix 9]
The environmental data is data on at least one of air temperature, water temperature, water flow, water depth, and topography.
The image analysis system according to any one of Appendices 6 to 8.

[付記10]
 前記推定手段は、船舶の識別信号をさらに用いて、 前記検出の確度を推定する、
 付記6から9いずれかに記載の画像分析システム。
[Appendix 10]
The estimation means further uses an identification signal of the vessel to estimate the accuracy of the detection.
The image analysis system according to any one of Appendices 6 to 9.
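One plausible way to combine a vessel identification signal (such as an AIS report) with the detection score is to raise the score when a reported position lies near the detection and lower it only mildly otherwise, since vessels may sail with transponders switched off. The distances and weights below are assumptions; the disclosure does not specify a formula:

```python
import math

def nearest_ais_distance_km(det_pos, ais_positions):
    # Hypothetical flat-grid distance in km; adequate for short ranges only.
    if not ais_positions:
        return float("inf")
    return min(math.dist(det_pos, p) for p in ais_positions)

def accuracy_with_ais(base_score, det_pos, ais_positions, match_km=1.0):
    # A nearby AIS report corroborates the detection; the absence of one
    # does not disprove it, so the penalty is deliberately small.
    if nearest_ais_distance_km(det_pos, ais_positions) <= match_km:
        return min(1.0, base_score + 0.2)
    return base_score * 0.8
```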

[付記11]
 前記出力手段は、前記検出の確度の推定の理由をさらに出力する、
 付記1から10いずれかに記載の画像分析システム。
[Appendix 11]
The output means further outputs a reason for the estimation of the accuracy of the detection.
The image analysis system according to any one of Appendices 1 to 10.

[付記12]
 前記出力手段は、衛星画像上において分析対象の物体が検出された領域を囲む線を、前記検出の確度の段階に応じて変化させた衛星画像を出力する、
 付記1から11いずれかに記載の画像分析システム。
[Appendix 12]
the output means outputs a satellite image in which a line surrounding an area in which the object to be analyzed is detected on the satellite image is changed according to a level of accuracy of the detection.
The image analysis system according to any one of Appendices 1 to 11.
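A rendering rule of this kind can be as simple as mapping accuracy bands to bounding-box line styles. The bands, colors, and widths below are illustrative assumptions, not part of the disclosure:

```python
def box_style(accuracy):
    # Map an estimated detection accuracy to a bounding-box drawing style.
    if accuracy >= 0.8:
        return {"color": "green", "dashed": False, "width": 3}
    if accuracy >= 0.5:
        return {"color": "yellow", "dashed": False, "width": 2}
    # Low-confidence detections get a thin dashed red outline.
    return {"color": "red", "dashed": True, "width": 1}
```

The returned style dictionary would then be consumed by whatever drawing layer overlays the boxes on the satellite image.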

[付記13]
 分析対象の地域を撮像した衛星画像を取得し、
 分析対象の地域の環境データを取得し、
 衛星画像に写っている分析対象の物体を検出し、
 前記環境データを基に、前記検出の確度を推定し、
 前記検出の結果と、前記検出の確度を示す情報とを出力する、
 画像分析方法。
[Appendix 13]
Obtain satellite images of the area to be analyzed,
Obtain environmental data for the area to be analyzed;
Detects objects to be analyzed in satellite images,
Estimating the accuracy of the detection based on the environmental data;
outputting a result of the detection and information indicating a degree of accuracy of the detection;
Image analysis methods.

[付記14]
 分析対象の地域を撮像した衛星画像を取得する処理と、
 分析対象の地域の環境データを取得する処理と、
 衛星画像に写っている分析対象の物体を検出する処理と、
 前記環境データを基に、前記検出の確度を推定する処理と、
 前記検出の結果と、前記検出の確度を示す情報とを出力する処理と
 をコンピュータに実行させる画像分析プログラムを非一時的に記録する記録媒体。
[Appendix 14]
Acquiring satellite images of the area to be analyzed;
Obtaining environmental data for the area to be analyzed;
A process for detecting an object to be analyzed that is captured in satellite imagery;
A process of estimating the accuracy of the detection based on the environmental data;
and a process of outputting a result of the detection and information indicating the accuracy of the detection.
A recording medium that non-transitorily records an image analysis program causing a computer to execute the above processes.

 以上、上述した実施形態を例として本開示を説明した。しかしながら、本開示は、上述した実施形態には限定されない。即ち、本開示は、本開示のスコープ内において、当業者が理解し得る様々な態様を適用することができる。 The present disclosure has been described above using the above-mentioned embodiment as an example. However, the present disclosure is not limited to the above-mentioned embodiment. In other words, the present disclosure can be applied in various aspects that can be understood by a person skilled in the art within the scope of the present disclosure.

 10  画像分析システム
 11  取得部
 12  衛星画像取得部
 13  環境データ取得部
 14  検出部
 15  推定部
 16  出力部
 17  記憶部
 20  端末装置
 30  衛星画像管理サーバ
 40  環境データ管理サーバ
 100  コンピュータ
 101  CPU
 102  メモリ
 103  記憶装置
 104  入出力I/F
 105  通信I/F
REFERENCE SIGNS LIST
10 Image analysis system
11 Acquisition unit
12 Satellite image acquisition unit
13 Environmental data acquisition unit
14 Detection unit
15 Estimation unit
16 Output unit
17 Storage unit
20 Terminal device
30 Satellite image management server
40 Environmental data management server
100 Computer
101 CPU
102 Memory
103 Storage device
104 Input/output I/F
105 Communication I/F

Claims (14)

 分析対象の地域を撮像した衛星画像を取得する画像取得手段と、
 分析対象の地域の環境データを取得する環境データ取得手段と、
 衛星画像に写っている分析対象の物体を検出する検出手段と、
 前記環境データを基に、前記検出の確度を推定する推定手段と、
 前記検出の結果と、前記検出の確度を示す情報とを出力する出力手段と
 を備える画像分析システム。
An image acquisition means for acquiring a satellite image of an area to be analyzed;
An environmental data acquisition means for acquiring environmental data of an area to be analyzed;
A detection means for detecting an object to be analyzed that is captured in a satellite image;
an estimation means for estimating a degree of accuracy of the detection based on the environmental data;
and an output means for outputting a result of the detection and information indicating a degree of accuracy of the detection.
 前記検出の確度は、前記検出手段によって検出された物体が、前記分析対象の物体であるかの確からしさを示す指標である、
 請求項1に記載の画像分析システム。
The detection accuracy is an index indicating the likelihood that the object detected by the detection means is the object to be analyzed.
The image analysis system according to claim 1.
 前記推定手段は、環境データを用いて推定される前記分析対象の物体を検出する画像認識モデルの検出結果の確からしさに影響を及ぼす環境要因に基づいて前記検出の確度を推定する、
 請求項2に記載の画像分析システム。
The estimation means estimates the accuracy of the detection based on environmental factors that affect the accuracy of a detection result of an image recognition model that detects the object to be analyzed, the detection result being estimated using environmental data.
The image analysis system of claim 2.
 前記推定手段は、環境データを用いて推定される前記画像認識モデルが前記分析対象の物体であると誤認し得る物体の存在に影響を及ぼす環境要因に基づいて前記検出の確度を推定する、
 請求項3に記載の画像分析システム。
The estimation means estimates the accuracy of the detection based on environmental factors that affect the presence of an object that the image recognition model estimated using environmental data may mistakenly recognize as the object to be analyzed.
The image analysis system of claim 3.
 前記推定手段は、前記画像認識モデルの検出結果と、前記環境データを用いて推定される分析対象の物体の存在の有無とを基に、前記検出の確度を推定する、
 請求項3に記載の画像分析システム。
The estimation means estimates a degree of accuracy of the detection based on a detection result of the image recognition model and the presence or absence of an object to be analyzed estimated using the environmental data.
The image analysis system of claim 3.
 前記分析対象の物体は、船舶であり、
 前記推定手段は、前記画像認識モデルが検出した物体が、前記船舶以外であることを示す環境要因を基に、前記検出の確度を推定する、
 請求項3から5いずれかに記載の画像分析システム。
The object to be analyzed is a ship,
The estimation means estimates the accuracy of the detection based on an environmental factor indicating that the object detected by the image recognition model is other than the ship.
6. An image analysis system according to any one of claims 3 to 5.
 前記推定手段は、前記環境データを用いて推定される前記船舶が存在しない領域を基に前記検出の確度を推定する、
 請求項6に記載の画像分析システム。
The estimation means estimates the accuracy of the detection based on an area where the ship is not present, which is estimated using the environmental data.
The image analysis system of claim 6.
 前記出力手段は、前記推定手段が推定した前記船舶以外の物体に関する衛星画像を出力する、
 請求項6または7に記載の画像分析システム。
The output means outputs a satellite image relating to the object other than the ship estimated by the estimation means.
8. An image analysis system according to claim 6 or 7.
 前記環境データは、気温、水温、水流、水深および地形のうちの少なくとも1つに関するデータである、
 請求項6から8いずれかに記載の画像分析システム。
The environmental data is data on at least one of air temperature, water temperature, water flow, water depth, and topography.
An image analysis system according to any one of claims 6 to 8.
 前記推定手段は、船舶の識別信号をさらに用いて、 前記検出の確度を推定する、
 請求項6から9いずれかに記載の画像分析システム。
The estimation means further uses an identification signal of the vessel to estimate the accuracy of the detection.
10. An image analysis system according to any one of claims 6 to 9.
 前記出力手段は、前記検出の確度の推定の理由をさらに出力する、
 請求項1から10いずれかに記載の画像分析システム。
The output means further outputs a reason for the estimation of the accuracy of the detection.
11. An image analysis system according to any one of claims 1 to 10.
 前記出力手段は、衛星画像上において分析対象の物体が検出された領域を囲む線を、前記検出の確度の段階に応じて変化させた衛星画像を出力する、
 請求項1から11いずれかに記載の画像分析システム。
the output means outputs a satellite image in which a line surrounding an area in which the object to be analyzed is detected on the satellite image is changed according to a level of accuracy of the detection.
12. An image analysis system according to any one of claims 1 to 11.
 分析対象の地域を撮像した衛星画像を取得し、
 分析対象の地域の環境データを取得し、
 衛星画像に写っている分析対象の物体を検出し、
 前記環境データを基に、前記検出の確度を推定し、
 前記検出の結果と、前記検出の確度を示す情報とを出力する、
 画像分析方法。
Obtain satellite images of the area to be analyzed,
Obtain environmental data for the area to be analyzed;
Detects objects to be analyzed in satellite images,
Estimating the accuracy of the detection based on the environmental data;
outputting a result of the detection and information indicating the accuracy of the detection;
Image analysis methods.
 分析対象の地域を撮像した衛星画像を取得する処理と、
 分析対象の地域の環境データを取得する処理と、
 衛星画像に写っている分析対象の物体を検出する処理と、
 前記環境データを基に、前記検出の確度を推定する処理と、
 前記検出の結果と、前記検出の確度を示す情報とを出力する処理と
 をコンピュータに実行させる画像分析プログラムを非一時的に記録する記録媒体。
Acquiring satellite images of the area to be analyzed;
Obtaining environmental data for the area to be analyzed;
A process for detecting an object to be analyzed that is captured in satellite imagery;
A process of estimating an accuracy of the detection based on the environmental data;
and a process of outputting a result of the detection and information indicating the accuracy of the detection.
A recording medium that non-transitorily records an image analysis program causing a computer to execute the above processes.
PCT/JP2023/011668 2023-03-23 2023-03-23 Image analysis system, image analysis method, and recording medium Pending WO2024195139A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2025508100A JPWO2024195139A5 (en) 2023-03-23 Image analysis system, image analysis method, and image analysis program
PCT/JP2023/011668 WO2024195139A1 (en) 2023-03-23 2023-03-23 Image analysis system, image analysis method, and recording medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2023/011668 WO2024195139A1 (en) 2023-03-23 2023-03-23 Image analysis system, image analysis method, and recording medium

Publications (1)

Publication Number Publication Date
WO2024195139A1 true WO2024195139A1 (en) 2024-09-26

Family

ID=92841555

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/011668 Pending WO2024195139A1 (en) 2023-03-23 2023-03-23 Image analysis system, image analysis method, and recording medium

Country Status (1)

Country Link
WO (1) WO2024195139A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013084072A (en) * 2011-10-07 2013-05-09 Hitachi Ltd Land state estimation system and land state estimation method
JP2016173239A (en) * 2015-03-16 2016-09-29 三菱電機株式会社 Position estimation device, and synthetic aperture radar device
JP2019194821A (en) * 2018-05-06 2019-11-07 英俊 古川 Target recognition device, target recognition method, and program
US20210109209A1 (en) * 2019-10-10 2021-04-15 Orbital Insight, Inc. Object measurement using deep learning analysis of synthetic aperture radar backscatter signatures
CN112836571A (en) * 2020-12-18 2021-05-25 华中科技大学 Ship target detection and recognition method, system and terminal in remote sensing SAR images
JP2023000897A (en) * 2021-06-18 2023-01-04 株式会社スペースシフト Learning model, signal processor, flying body, and program


Also Published As

Publication number Publication date
JPWO2024195139A1 (en) 2024-09-26


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 23928715; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2025508100; Country of ref document: JP; Kind code of ref document: A)
WWE Wipo information: entry into national phase (Ref document number: 2025508100; Country of ref document: JP)
NENP Non-entry into the national phase (Ref country code: DE)