
WO2023233563A1 - Sensor system to be mounted on flying body, and data-processing system for sensor system - Google Patents


Info

Publication number
WO2023233563A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
data
interferogram
sensor system
reduction
Prior art date
Legal status
Ceased
Application number
PCT/JP2022/022242
Other languages
French (fr)
Japanese (ja)
Inventor
晃均 玉田
航 吉岐
優佑 伊藤
俊平 亀山
Current Assignee
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date
Filing date
Publication date
Application filed by Mitsubishi Electric Corp

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28 Investigating the spectrum
    • G01J3/45 Interferometric spectrometry
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17 Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25 Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/27 Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands using photo-electric detection; circuits for computing concentration

Definitions

  • The disclosed technology relates to a sensor system mounted on a flying object and a data processing system for the sensor system.
  • A sensor system such as an imaging device is mounted on a flying object such as a drone to observe an observation target on the ground.
  • A multispectral camera is also known as such an imaging device.
  • Patent Document 1 discloses that a drone is equipped with an imaging device to sense the state of vegetation in a field. Specifically, Patent Document 1 relates to a technique for easily selecting a plurality of images and efficiently generating a mapping image based on the images selected to be used.
  • The disclosed technology aims to solve the above problem and to provide a sensor system that reduces data while retaining the useful information originally required for observation.
  • The sensor system includes an optical measurement sensor unit that is mounted on a flying object and acquires an interferogram of an observation target, an object determination unit that determines the observation target based on the interferogram, and a data reduction unit that performs data reduction on the interferogram based on the determination result.
  • Because the sensor system according to the disclosed technology has the above configuration, it is possible to reduce data while retaining the useful information originally required for observation.
  • FIG. 1 is a block diagram showing the functional configurations of a sensor system and a data processing system according to the first embodiment.
  • FIG. 2 is a diagram illustrating the operation or processing of the sensor system according to the first embodiment.
  • FIG. 3 is a diagram illustrating the operation or processing of the data creation unit 7, memory unit 8, and data processing system of the sensor system according to the first embodiment.
  • FIG. 1 is a block diagram showing the functional configurations of a sensor system and a data processing system according to the first embodiment.
  • the sensor system and data processing system according to the disclosed technology will be collectively referred to as a signal processing system.
  • the sensor system according to the disclosed technology is mounted on a flying object such as a drone, an aircraft, or an artificial satellite, and is intended to observe the ground.
  • a flying object equipped with a sensor system may have the purpose of observing the ground as an observation target from a plurality of observation directions and obtaining observation results using two-dimensional images.
  • the data processing system is not mounted on a flying object, but is installed in a data center on the ground, and receives and processes data transmitted from a sensor system mounted on a flying object.
  • The sensor system includes an optical measurement sensor section 1, an AD conversion section 2, an interference image acquisition section 3, an interferogram storage section 4, a signal analysis section 5, a data reduction section 6, a data creation section 7, and a memory section 8.
  • Each component in the sensor system is connected as shown in FIG.
  • The data processing system includes an interferogram holding section 9, a spectrum restoration section 10, an imaging processing section 11, a label creation section 12, a feature extraction section 13, a teacher data creation section 14, a machine learning section 15, and an analysis section 16.
  • The interferogram holding unit 9, spectrum restoration unit 10, imaging processing unit 11, label creation unit 12, feature extraction unit 13, teacher data creation unit 14, and machine learning unit 15 form a group of functional blocks related to pre-learning processing (hereinafter referred to as the "pre-learning processing section"). Each component in the data processing system is connected as shown in FIG. 1.
  • the optical measurement sensor section 1 is a component for acquiring an interferogram.
  • the optical measurement sensor section 1 is a Fourier spectroscopy type optical interferometer.
  • An interferogram is an interference wave obtained from an interferometer.
  • the waveform of the interferogram has, for example, the appearance shown above the text "interferogram acquisition" in FIG. 2, which will be described later.
  • the AD conversion section 2 is a component for converting an analog signal sent from the optical measurement sensor section 1 into a digital signal.
  • the interference image acquisition section 3 is a component for generating and acquiring an interference image from the digital signal sent from the AD conversion section 2.
  • the term "interference image” used in this specification is an image in which the absolute value of the difference between the maximum and minimum amplitudes of an interferogram is expressed as a contrast for each observation point.
  • an interference image is an image representing interference fringes.
  • the interference image has, for example, the appearance shown above the words "interference pattern image acquisition" in FIG. 2, which will be described later.
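The per-pixel contrast defined above can be sketched in a few lines. The array shape, pixel layout, and the toy interferogram below are illustrative assumptions, not values from this disclosure.

```python
import numpy as np

def interference_image(cube):
    """Contrast per pixel: the absolute difference between the maximum and
    minimum amplitude of that pixel's interferogram, per the definition of
    "interference image" above."""
    return np.abs(cube.max(axis=-1) - cube.min(axis=-1))

# Toy data: 2x2 pixels, each holding a 100-sample interferogram.
t = np.linspace(-1.0, 1.0, 100)
burst = np.exp(-(t / 0.2) ** 2) * np.cos(40.0 * t)   # strong interference
cube = np.stack([[burst, 0.1 * burst],
                 [0.5 * burst, 0.0 * burst]])        # shape (2, 2, 100)
img = interference_image(cube)
print(img.shape)   # (2, 2)
```

Pixels whose interferogram swings more appear brighter in the resulting image, which is what makes the interference fringes visible.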
  • The interferogram storage unit 4 is a component that stores, from the interference image sent from the interference image acquisition unit 3, the interferogram digital data (hereinafter referred to as "interferogram data") for the pixel corresponding to a given observation point.
  • the signal analysis unit 5 is a component for performing signal analysis on the interferogram data sent from the interferogram storage unit 4.
  • the signal analysis section 5 includes a spectrum analysis section 5a, an object determination section 5b, and an importance analysis section 5c.
  • the spectrum analysis section 5a, the object determination section 5b, and the importance analysis section 5c are connected as shown in FIG.
  • the spectrum analysis unit 5a is a component for performing spectrum analysis on the interferogram data sent from the interferogram storage unit 4.
  • Spectral analysis of interferogram data yields the optical spectrum of the optical signal.
  • the optical spectrum is obtained by Fourier transforming the interferogram.
  • the optical spectrum is represented by a complex number for each frequency.
  • the interferogram data and optical spectrum data are sent by the spectrum analysis section 5a to the object determination section 5b.
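As a minimal sketch of the spectrum analysis performed by the spectrum analysis unit 5a, a Fourier transform in the time direction yields a complex value per frequency. The sampling rate and line frequencies below are invented for illustration.

```python
import numpy as np

fs = 1000.0                       # assumed sampling rate, samples/s
t = np.arange(1000) / fs          # 1 s of interferogram samples

# Toy interferogram containing two spectral lines.
interferogram = (np.cos(2 * np.pi * 50 * t)
                 + 0.5 * np.cos(2 * np.pi * 120 * t))

# Fourier transform in the time direction: one complex value per
# frequency, matching the description of the optical spectrum B.
spectrum = np.fft.rfft(interferogram)
freqs = np.fft.rfftfreq(interferogram.size, d=1.0 / fs)

# The two strongest bins recover the line positions.
top_two = np.sort(freqs[np.argsort(np.abs(spectrum))[-2:]]).tolist()
print(top_two)   # [50.0, 120.0]
```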
  • the object determination unit 5b is a component for determining an object from an interference image based on interferogram data, optical spectrum data, or both.
  • The object of determination may be one-dimensional data corresponding to individual pixels of the interference image (i.e., interferogram data), three-dimensional data corresponding to a specific region of the interference image (i.e., spectral image data), or both.
  • the target determination performed by the target determining unit 5b may be, for example, determining the presence of clouds in the observation direction of each pixel of the interference image. Clouds pose a nuisance to sensor systems that observe the ground.
  • the target determination performed by the target determination unit 5b is realized by having trained artificial intelligence.
  • Artificial intelligence is realized by a mathematical model called a learning model.
  • the learning model is, for example, an artificial neural network.
  • Artificial intelligence learning is performed in a machine learning unit 15 of the data processing system, which will be described later.
  • the broken line arrow extending from the machine learning section 15 in FIG. 1 represents that the object determination section 5b is equipped with artificial intelligence that has been trained in the machine learning section 15.
  • artificial intelligence may be realized using a semantic segmentation algorithm.
  • the importance analysis section 5c is a component for analyzing the importance of each observation point based on the determination result sent from the object determination section 5b.
  • the definition of importance analyzed by the importance analysis unit 5c may be set as appropriate depending on the purpose of use of the sensor system and data processing system.
  • the importance analysis result obtained by the importance analysis section 5c is sent to the data reduction section 6.
  • The data reduction unit 6 is a component for reducing data at locations determined to be unnecessary, based on the target determination and importance analysis information sent from the target determination unit 5b and the importance analysis unit 5c.
  • the data reduction section 6 includes at least one component among an area reduction section 6a, a band reduction section 6b, and a bit reduction section 6c.
  • The criteria by which the data reduction unit 6 determines data to be unnecessary may be set as appropriate depending on the purpose of use of the sensor system and the data processing system.
  • the data reduced by the data reduction unit 6 is sent to the data creation unit 7.
  • The area reduction unit 6a selects observation areas of low importance based on the target determination and importance analysis information sent from the target determination unit 5b and the importance analysis unit 5c, and performs data reduction on those areas.
  • clouds are a nuisance for sensor systems that observe the ground.
  • When clouds exist between the sensor system and the observation point, scattered light generated by sunlight reflecting off the clouds becomes an interfering signal for the sensor system.
  • The observation area in which the area reduction unit 6a performs data reduction corresponds to an area where clouds exist between the sensor system and the observation area. It is known that the light spectrum of scattered light generated by reflection from clouds is white.
  • If the light spectrum obtained by the spectrum analysis unit 5a has many white components, it can be determined that clouds are present between the sensor system and the observation point observed by the sensor system.
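The disclosure only states that cloud-scattered sunlight has a white spectrum; one way to test for whiteness, shown here purely as an illustrative assumption (the flatness measure and the 0.7 threshold are not from the patent), is a spectral-flatness score:

```python
import numpy as np

def looks_like_cloud(spectrum, flatness_threshold=0.7):
    """Heuristic whiteness test: spectral flatness is the geometric mean
    of the power spectrum divided by its arithmetic mean. A value near
    1.0 means a flat (white) spectrum, as expected for sunlight
    scattered by clouds."""
    power = np.abs(spectrum) ** 2 + 1e-12
    flatness = np.exp(np.mean(np.log(power))) / np.mean(power)
    return flatness > flatness_threshold

rng = np.random.default_rng(0)
white = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, 256))  # flat magnitude
line = np.zeros(256, dtype=complex)
line[40] = 10.0                                          # one strong line
print(looks_like_cloud(white), looks_like_cloud(line))   # True False
```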
  • Data reduction for areas performed by the area reduction unit 6a is realized by providing trained artificial intelligence. Artificial intelligence learning is performed in a machine learning unit 15 of the data processing system, which will be described later.
  • the broken line arrow extending from the machine learning unit 15 in FIG. 1 represents that the area reduction unit 6a is equipped with artificial intelligence that has been trained in the machine learning unit 15.
  • The band reduction unit 6b is a component for deleting data in frequency bands that can be estimated to be of low importance based on the target determination and importance analysis information sent from the target determination unit 5b and the importance analysis unit 5c.
  • The term "band" used in the name of the band reduction unit 6b is short for "frequency band."
  • The band reduction unit 6b is a component intended for the case where the observation target of the sensor system is a specific gas existing on the earth's surface. A gas consists of atoms in the gaseous state, and absorption lines at wavelengths unique to those atoms appear in a continuous spectrum. Therefore, if the characteristics of the absorption lines do not appear in the optical spectrum obtained by the spectrum analysis unit 5a, it can be assumed that the specific gas does not exist in the environment of the observation point, and that band becomes a candidate for data reduction.
  • When the data reduction unit 6 includes a bit reduction unit 6c, the bit reduction unit 6c is a component for deleting time-domain or frequency-domain data of the interferogram in units of bits, that is, by reducing the number of bits, for data estimated to be of low importance based on the target determination and importance analysis information sent from the target determination unit 5b and the importance analysis unit 5c. In this specification, deleting data in units of bits is expressed as "bit reduction."
  • the bit reduction unit 6c may, for example, reduce bits of data in a wavelength domain where the signal level is excessively high in frequency domain data of the interferogram, that is, optical spectrum data.
  • bit reduction unit 6c may, for example, reduce bits of data in a time period that does not contribute to measurement accuracy in the time domain data of the interferogram.
  • The time periods that do not contribute to measurement accuracy are generally the time periods at both ends, near the start and the end of the interferogram (see the interferogram waveform shown in FIG. 2, described later). The details of bit reduction will become clear from the explanation given with FIG. 2.
  • the data creation unit 7 is a component for reconstructing three-dimensional data from the data sent from the data reduction unit 6.
  • the three-dimensional data reconstructed in the data creation unit 7 may be an interferogram for each pixel or an interference image for each sampling period.
  • When the data sent from the data reduction unit 6 is an optical spectrum, that is, frequency-domain information, the data creation unit 7 may reconstruct an interferogram for each pixel by performing an inverse Fourier transform on the frequency-domain information.
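The round trip just described, from frequency-domain data back to a per-pixel interferogram, is a plain inverse Fourier transform. The signal length and frequency below are illustrative.

```python
import numpy as np

t = np.arange(512)
interferogram = np.cos(2.0 * np.pi * 8.0 * t / t.size)   # toy pixel signal

spectrum = np.fft.rfft(interferogram)             # frequency-domain form
reconstructed = np.fft.irfft(spectrum, n=t.size)  # back to an interferogram

print(bool(np.allclose(reconstructed, interferogram)))   # True
```

Without data reduction the round trip is lossless; after band or bit reduction the reconstruction is an approximation of the original interferogram.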
  • the three-dimensional data reconstructed by the data creation section 7 is sent to the memory section 8.
  • FIG. 1 shows that, in addition to the route from the data reduction unit 6, there is a route from the interferogram storage unit 4 as a route for the data creation unit 7 to acquire information.
  • the data creation unit 7 may perform a process of sending the raw interferogram data sent from the interferogram storage unit 4 to the memory unit 8.
  • the raw interferogram data can be used as training data for learning performed by the machine learning unit 15, which will be described later.
  • the raw interferogram data to be used as teacher data may be transmitted intermittently depending on the required amount, that is, depending on the purpose of use of the sensor system and the data processing system.
  • the memory unit 8 is a component for storing the three-dimensional data sent from the data creation unit 7.
  • the three-dimensional data stored in the memory section 8 is sent to a data processing system.
  • the memory unit 8 also stores raw interferogram data sent from the data creation unit 7 as appropriate.
  • the raw interferogram data stored in memory portion 8 is also sent to the data processing system.
  • the interferogram holding unit 9 is a component for temporarily holding interferogram data.
  • the interferogram data held by the interferogram holding unit 9 can be classified into two types, focusing on their roles.
  • One type is used in the learning phase of artificial intelligence, and is interferogram data as training data of a learning data set.
  • The interferogram data serving as teacher data may be, for example, actual observation data recorded by the sensor system in the past. If sufficient past observation data from the sensor system cannot be obtained, teacher data may be created using observation data from another system or by performing data augmentation.
  • a part of the learning data set may be used as a verification data set for verification performed in the final stage of learning.
  • the other type is interferogram data, which is used in the inference phase of artificial intelligence and is observation data of the sensor system transmitted from the sensor system.
  • the interferogram data used in the learning phase is sent to the spectrum restoration section 10 and feature amount extraction section 13 of the pre-learning processing section.
  • Interferogram data used in the inference phase is sent to the analysis section 16.
  • the spectrum restoration section 10 is a component for performing spectrum restoration on the interferogram data sent from the interferogram holding section 9.
  • Spectral restoration refers to the problem or process of estimating and restoring the signal spectrum that a signal originally has.
  • the waveform of the signal spectrum obtained by spectrum restoration has, for example, the appearance shown on the right side of the words "spectrum restoration" in FIG. 3, which will be described later.
  • For spectrum restoration, a noise suppression method such as the MMSE-STSA (Minimum Mean-Square Error Short-Time Spectral Amplitude estimator) method is known.
  • the imaging processing unit 11 is a component for generating image data based on the signal spectrum sent from the spectrum restoration unit 10.
  • the image data generated by the imaging processing unit 11 will be referred to as "spectral image data" in this specification.
  • the spectral image data can be expressed, for example, in the appearance shown above the characters "imaging processing" in FIG. 3, which will be described later.
  • Spectral image data may be thought of as a data cube of three or more dimensions.
  • the imaging processing unit 11 may correct the data by processing such as interpolation for locations where data has been reduced (see FIG. 3, which will be described later).
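The disclosure does not specify the interpolation method. As one hedged sketch, data-reduced pixels in a single spectral band (marked here with NaN, an assumed convention) could be filled from their valid 4-neighbours:

```python
import numpy as np

def fill_reduced(band):
    """Fill data-reduced pixels (NaN) in one spectral band by averaging
    their valid 4-neighbours; a simple stand-in for the interpolation the
    imaging processing unit 11 might apply."""
    out = band.copy()
    ys, xs = np.where(np.isnan(band))
    for y, x in zip(ys, xs):
        neigh = [band[j, i]
                 for j, i in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                 if 0 <= j < band.shape[0] and 0 <= i < band.shape[1]
                 and not np.isnan(band[j, i])]
        if neigh:
            out[y, x] = np.mean(neigh)
    return out

band = np.array([[1.0, 2.0, 3.0],
                 [4.0, np.nan, 6.0],
                 [7.0, 8.0, 9.0]])
print(fill_reduced(band)[1, 1])   # 5.0 (mean of 2, 4, 6, 8)
```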
  • the spectral image data generated by the imaging processing section 11 is sent to the label creation section 12.
  • the label creation unit 12 is a component for supporting the user of the sensor system and the data processing system to create correct labels constituting the learning data set.
  • the label creation unit 12 uses, for example, artificial intelligence that is in the process of learning to display correct label candidates on the display screen, overlapping them with the spectral image data.
  • the user may create a correct label by modifying the correct label candidates displayed by the label creation unit 12 as necessary.
  • the feature extraction unit 13, the teacher data creation unit 14, and the machine learning unit 15 may be considered to represent the learning process of artificial intelligence as a whole.
  • The feature extraction unit 13 represents the learning process in which the artificial intelligence in the learning phase extracts feature amounts, according to the purpose of learning, from the interferogram data input from the interferogram holding unit 9.
  • A feature amount is a term used in the field of machine learning, and is a variable in the data to be analyzed that serves as a clue for prediction. Feature extraction is the conversion of the data to be analyzed into feature amounts.
  • The feature amounts extracted from each input interferogram are multidimensional quantities (vectors), and can be represented as plots on the feature map shown below the text "feature extraction" in FIG. 3, described later. A feature map is sometimes referred to as a feature space.
  • The feature amounts may include, for example, a characteristic of the envelope describing the attenuation of the interferogram, the time width of one oscillation of the interferogram, and other quantities correlated with the optical spectrum.
  • The teacher data creation unit 14 represents the preparation process of assembling enough learning data sets to perform artificial intelligence learning. Once a sufficient number of learning data sets has been prepared, a plot distribution appears for each label on the feature map.
  • The machine learning unit 15 represents the core process in the learning of artificial intelligence. For example, when the artificial intelligence advances its learning to solve a classification problem, it constructs a classification surface that separates the classes, as shown in the feature map at the bottom right of FIG. 3. The artificial intelligence may apply, for example, a support vector machine algorithm when constructing the classification surface.
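As a self-contained sketch of such a classification surface, a toy linear SVM can be trained by sub-gradient descent on the hinge loss. The two feature clusters, labels, and hyperparameters below are all invented for illustration and do not come from the disclosure.

```python
import numpy as np

# Synthetic learning data: two feature amounts per sample, two labels.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([2.0, 2.0], 0.4, size=(40, 2)),     # "object 1"
               rng.normal([-2.0, -2.0], 0.4, size=(40, 2))])  # "object 2"
y = np.array([1.0] * 40 + [-1.0] * 40)

# Linear SVM via hinge-loss sub-gradient steps (Pegasos-style).
w, b, lam, lr = np.zeros(2), 0.0, 0.01, 0.01
for epoch in range(200):
    for i in rng.permutation(len(X)):
        if y[i] * (X[i] @ w + b) < 1:        # margin violated: push
            w += lr * (y[i] * X[i] - lam * w)
            b += lr * y[i]
        else:                                # only regularisation decay
            w -= lr * lam * w

# The learned hyperplane w.x + b = 0 is the classification surface.
pred = np.sign(X @ w + b)
accuracy = float((pred == y).mean())
print(accuracy)   # 1.0 on this well-separated toy set
```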
  • the analysis unit 16 is a component for analyzing interferogram data acquired by the data processing system via the interferogram holding unit 9.
  • the user of the data processing system can utilize the results analyzed by the analysis unit 16.
  • FIG. 2 is a diagram illustrating the operation or processing of the sensor system according to the first embodiment.
  • the upper left part of Figure 2 shows the sensor system mounted on the flying object observing the observation target.
  • the flying object is, for example, a drone, an aircraft, or an artificial satellite.
  • The optical measurement sensor unit 1 is a Fourier spectroscopy type optical interferometer as described above, and the signals detected here are associated with the detection time (t) and the two-dimensional coordinates (x, y) of the observation point.
  • the two-dimensional coordinates (x, y) of the observation point are determined comprehensively, taking into consideration processing details such as the position of the flying object, the direction of light irradiation, TOF (Time of Flight), and beam forming.
  • the two-dimensional coordinates (x, y) of the observation point may be expressed, for example, in a geographic coordinate system.
  • Since the variables x, y, and t are discrete variables, they are expressed with subscripts, such as x n , y n , and t n .
  • the subscript for sampling time (t) starts at 0.
  • the two-dimensional coordinates (x n , y n ) mean the coordinates for the n-th observation point. Subscripts used for two-dimensional coordinates for observation points start at 1.
  • n used as a subscript is simply a variable and has a meaning as a counter variable in a for statement. n does not mean the total number of something.
  • the n used for the two-dimensional coordinates (x n , y n ) and the n used for the time t n do not need to be the same value. Although the same n is used in this specification, different variables may be used for two-dimensional coordinates and time in a program that implements the disclosed technology.
  • the upper center part of FIG. 2 shows how the interference image acquisition unit 3 acquires an interference image.
  • the interference image acquisition section 3 acquires a digital signal from the AD conversion section 2.
  • the digital signal is represented by the symbol F(), as shown in FIG.
  • the digital signal (F(x, y, t)) can also be said to be interferogram array data.
  • the upper right part of FIG. 2 shows how the interferogram storage unit 4 acquires interferogram data for pixels corresponding to a certain observation point from the interference image sent from the interference image acquisition unit 3.
  • The graph labeled "F(x 1 , y 1 , t)" in the upper right part of FIG. 2 shows the interferogram data for the pixel corresponding to the first observation point (two-dimensional coordinates (x 1 , y 1 )).
  • The graph labeled "F(x 2 , y 2 , t)" in the upper right part of FIG. 2 shows the interferogram data for the pixel corresponding to the second observation point (two-dimensional coordinates (x 2 , y 2 )).
  • a certain observation point on the ground corresponds to a pixel at a specific position in the interference image.
  • the interferogram data (F(x n , y n , t)) has a data length of observation time necessary to obtain the wave number resolution (also referred to as spectral resolution) required by the specifications.
  • The sensor system according to the present disclosure acquires interferogram data (F(x n , y n , t)) for each pixel corresponding to all observation points.
  • FIG. 2 also shows how the signal analysis unit 5 performs signal analysis, namely object determination and importance analysis, on the interferogram data (F(x n , y n , t)) sent from the interferogram storage unit 4.
  • The spectrum analysis unit 5a of the signal analysis unit 5 performs a Fourier transform in the time direction on the interferogram data (F(x n , y n , t)) sent from the interferogram storage unit 4.
  • Optical spectrum data obtained by Fourier transforming interferogram data (F(x n , y n , t)) in the time direction is an array, and is expressed as B(x n , y n ).
  • the optical spectrum data (B(x n , y n )) obtained by the spectrum analysis section 5a is sent to the object determination section 5b.
  • The object determination unit 5b of the signal analysis unit 5 uses the interferogram data (F(x n , y n , t)), the optical spectrum data (B(x n , y n )), or both to determine the state of the object at all observation points (the observation points for every n).
  • the object determination unit 5b specifically determines whether there are any clouds between the sensor system and the observation point.
  • the object determination unit 5b prepares a reference interferogram for the case where clouds are present, and compares it with the interferogram data (F(x n , y n , t)) of the observation target.
  • a correlation coefficient with a reference interferogram may be determined and compared with a preset threshold.
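A minimal version of that comparison could look as follows; the reference waveform, the test signals, and the 0.8 threshold are invented placeholders, not values from the disclosure.

```python
import numpy as np

def cloud_present(obs, ref_cloud, threshold=0.8):
    """Compare an observed interferogram against a prepared cloud
    reference via the correlation coefficient, as the object
    determination unit 5b may do; the threshold is an assumption."""
    r = np.corrcoef(obs, ref_cloud)[0, 1]
    return r > threshold

t = np.linspace(-1.0, 1.0, 400)
ref = np.exp(-(t / 0.3) ** 2) * np.cos(60 * t)    # hypothetical cloud reference
cloudy = ref + 0.05 * np.sin(5 * t)               # close to the reference
clear = np.exp(-(t / 0.1) ** 2) * np.cos(25 * t)  # different signature
print(cloud_present(cloudy, ref), cloud_present(clear, ref))   # True False
```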
  • the target determination performed by the target determination unit 5b is realized by trained artificial intelligence, as described above.
  • The importance analysis unit 5c of the signal analysis unit 5 analyzes the importance of each observation point based on the interferogram data (F(x n , y n , t)), the optical spectrum data (B(x n , y n )), and the determination results of the object determination unit 5b. In analyzing the importance of each observation point, the importance analysis unit 5c considers the area of the pixels corresponding to the region of the observation point, the unique frequency characteristics (absorption lines appearing in the continuous spectrum), and the amount of data in the received signal.
  • the lower center portion of FIG. 2 shows how the data reduction section 6 includes at least one of an area reduction section 6a, a band reduction section 6b, and a bit reduction section 6c.
  • the bit reduction carried out by the bit reduction unit 6c can be carried out in two ways: a method of reducing the upper bits and a method of reducing the lower bits.
  • The time periods that do not contribute to measurement accuracy are generally the time periods at both ends, near the start and the end of the acquisition. Since the interferogram signal is small in these time periods, the upper bits of the signal data contain only zeros. Therefore, the bit reduction unit 6c can reduce the upper bits of the signal data in the time periods at both ends of the interferogram.
  • In the middle time period between the start and the end, the signal amplitude is large and the S/N ratio is high.
  • the bit reduction unit 6c can also reduce the lower bits of the signal data in a time period intermediate between the start and end of the interferogram.
  • the bit reduction unit 6c may perform upper bit reduction, lower bit reduction, or a combination of upper bit reduction and lower bit reduction.
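The lower-bit variant can be sketched as quantising integer samples to multiples of a power of two; the bit count and sample values below are invented, and floor division keeps the behaviour well-defined for negative samples.

```python
import numpy as np

def reduce_lower_bits(samples, n_bits):
    """Zero the n_bits least-significant bits of integer interferogram
    samples, i.e. the lower-bit reduction unit 6c may apply in the
    high-amplitude, high-S/N middle of the interferogram. (At the
    low-amplitude ends the text instead drops upper bits, which there
    contain only zeros.)"""
    step = 1 << n_bits
    return (samples // step) * step   # floor-quantise to multiples of 2**n_bits

samples = np.array([1023, -517, 12, 255], dtype=np.int16)
print(reduce_lower_bits(samples, 4).tolist())   # [1008, -528, 0, 240]
```

Dropping 4 of 16 bits this way trades a small quantisation error, masked by the high S/N ratio in the middle of the interferogram, for a 25% reduction in payload.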
  • The data reduction unit 6 performs data reduction on the interferogram data (F(x n , y n , t)) and generates data-reduced interferogram data.
  • The interferogram data subjected to data reduction is expressed as F'(x n , y n , t).
  • Since F'(x n , y n , t) is similar to the original signal F(x n , y n , t), it is denoted with a prime (also called a dash in British English), which is used in mathematical notation to express similarity.
  • The lower right part of FIG. 2 shows how the interference image at time t n is affected by data reduction.
  • The interference image affected by data reduction is denoted F'(x, y, t n ), as shown in FIG. 2.
  • An interference image affected by data reduction (F'(x, y, t n )) can be constructed from the data-reduced interferogram data (F'(x n , y n , t)).
  • FIG. 3 is a diagram illustrating the operation or processing of the data creation unit 7, memory unit 8, and data processing system of the sensor system according to the first embodiment.
  • FIG. 3 shows how the three-dimensional data reconstructed by the data creation section 7 is stored in the memory section 8.
  • the middle part of FIG. 3 shows how the three-dimensional data stored in the memory section 8 is transferred to the data processing system.
  • the lower portion of FIG. 3 generally describes the operation or processing of the data processing system.
  • A signal waveform of F''(x n , y n , t) is shown as a diagram for explaining the operation or processing of the interferogram holding unit 9.
  • F''(x n , y n , t) is distinguished from the F'(x n , y n , t) stored in the memory unit 8 on the sensor system side by one additional prime.
  • As diagrams for explaining the operation or processing of the spectrum restoration unit 10, two spectral waveforms, B''(x 1 , y 1 , λ) and B''(x 2 , y 2 , λ), are shown.
  • B''(x 1 , y 1 , λ) is spectrum data whose spectrum is restored from F''(x 1 , y 1 , t).
  • Specifically, B''(x n , y n , λ) is obtained by Fourier transforming F''(x n , y n , t) in the time direction.
  • λ in the argument of B''(x n , y n , λ) is the wavelength.
  • the spectral data is a quantity expressed in the wavelength domain (or frequency domain) (frequency is the speed of light divided by the wavelength).
  • interferogram data F(x n , y n , t) is a quantity that includes t as an argument and is expressed in the time domain.
  • spectral image data for a group of B''(x, y, ⁇ ) is shown as a diagram for explaining the operation or processing of the imaging processing unit 11.
  • the spectral image data created by the imaging processing unit 11 can be said to be a three-dimensional data cube consisting of a two-dimensional coordinate plane and a spectrum at the coordinates.
  • the lower part of FIG. 3 shows how the imaging processing unit 11 corrects data by processing such as interpolation for the portions where data has been reduced when generating image data.
  • label creation represents the operation or processing of the label creation section 12.
  • feature extraction represents the operation or processing of the feature extraction unit 13.
  • the feature map above the text "Teacher data creation” represents the operation or processing of the teacher data creation unit 14.
  • the horizontal axis represents the first feature amount (feature amount 1)
  • the vertical axis represents the second feature amount (feature amount 2).
  • a sample with a first label (label 1) and a sample with a second label (label 2) are plotted on the feature map.
  • machine learning model creation represents the operation or processing of the machine learning unit 15.
  • the feature map with the symbol “15” represents the operation or processing of the machine learning unit 15.
  • the classification plane classifies the samples into "object 1" with the first label and "object 2" with the second label.
  • the feature amount map shown in FIG. 3 shows an example in which "object 1" and "object 2" are classified into classes based on differences in importance. In FIG. 3, the importance of "object 1" is denoted ⁇_1, and the importance of "object 2" is denoted ⁇_2.
  • the sensor system according to the disclosed technology has industrial applicability because it can be mounted on a drone and applied to sensing the vegetation state of a field, for example.


Abstract

A sensor system according to the present disclosure is to be mounted on a flying body, and comprises: an optical measurement sensor unit (1) that obtains an interferogram of an observed object; an object determination unit (5b) that includes trained artificial intelligence, and determines the observed object on the basis of the interferogram; and a data reduction unit (6) that reduces data of the interferogram on the basis of the determination result of the object determination unit (5b).

Description

Sensor systems mounted on flying objects and data processing systems for sensor systems

The disclosed technology relates to a sensor system mounted on a flying object and a data processing system for the sensor system.

Technology is known in which a sensor system such as an imaging device is mounted on a flying object such as a drone to observe an observation target on the ground. A multispectral camera is also known as such an imaging device.

For example, Patent Document 1 discloses mounting an imaging device on a drone to sense the vegetation state of a field. More specifically, Patent Document 1 relates to a technique for easily selecting from among a plurality of images and efficiently generating a mapping image from the images selected for use.

JP 2020-21437 A

When conducting observations using a sensor system mounted on a flying object, there are situations in which it is desirable to transfer data to the ground and process the information there. In particular, when the flying object is an unmanned drone and observations are made in real time, a huge amount of information must be transferred to the ground. Data reduction can be considered to reduce the amount of data transferred to the ground. Mechanical data reduction, such as coarser sampling or coarser quantization by lowering the number of bits, has the problem of also removing useful information that is originally necessary for observation.

The disclosed technology aims to solve the above problem and provide a sensor system that performs data reduction while retaining the useful information originally required for observation.

The sensor system according to the disclosed technology includes an optical measurement sensor unit that is mounted on a flying object and acquires an interferogram of an observation target, an object determination unit that determines the observation target based on the interferogram, and a data reduction unit that performs data reduction on the interferogram based on the determination result of the object determination unit.

Since the sensor system according to the disclosed technology has the above configuration, it can reduce data while retaining the useful information originally required for observation.

FIG. 1 is a block diagram showing the functional configurations of a sensor system and a data processing system according to the first embodiment.
FIG. 2 is a diagram illustrating the operation or processing of the sensor system according to the first embodiment.
FIG. 3 is a diagram illustrating the operation or processing of the data creation unit 7 and memory unit 8 of the sensor system according to the first embodiment, and of the data processing system.

Embodiment 1.
FIG. 1 is a block diagram showing the functional configurations of a sensor system and a data processing system according to the first embodiment. In this specification, the sensor system and the data processing system according to the disclosed technology are collectively referred to as a signal processing system.
The sensor system according to the disclosed technology is mounted on a flying object such as a drone, an aircraft, or an artificial satellite, and is intended to observe the ground. A flying object equipped with the sensor system may have the purpose of observing the ground, which is the observation target, from a plurality of observation directions and obtaining observation results as two-dimensional images.
The data processing system according to the disclosed technology is assumed not to be mounted on the flying object; it is installed in a data center on the ground and receives and processes data transmitted from the sensor system mounted on the flying object.
As shown in FIG. 1, the sensor system includes an optical measurement sensor unit 1, an AD conversion unit 2, an interference image acquisition unit 3, an interferogram storage unit 4, a signal analysis unit 5, a data reduction unit 6, a data creation unit 7, and a memory unit 8. The components of the sensor system are connected as shown in FIG. 1.
As shown in FIG. 1, the data processing system includes an interferogram holding unit 9, a spectrum restoration unit 10, an imaging processing unit 11, a label creation unit 12, a feature extraction unit 13, a teacher data creation unit 14, a machine learning unit 15, and a main analysis unit 16. The interferogram holding unit 9, spectrum restoration unit 10, imaging processing unit 11, label creation unit 12, feature extraction unit 13, teacher data creation unit 14, and machine learning unit 15 constitute a group of functional blocks related to pre-learning processing (hereinafter referred to as the "pre-learning processing unit"). The components of the data processing system are connected as shown in FIG. 1.

<<Optical measurement sensor section 1 in sensor system>>
The optical measurement sensor section 1 is a component for acquiring an interferogram. Specifically, the optical measurement sensor section 1 is a Fourier spectroscopy type optical interferometer. An interferogram is an interference wave obtained from an interferometer. The waveform of the interferogram has, for example, the appearance shown above the text "interferogram acquisition" in FIG. 2, which will be described later.

<<AD conversion unit 2 in sensor system>>
The AD conversion section 2 is a component for converting an analog signal sent from the optical measurement sensor section 1 into a digital signal.

<<Interference image acquisition unit 3 in sensor system>>
The interference image acquisition unit 3 is a component for generating and acquiring an interference image from the digital signal sent from the AD conversion unit 2. The term "interference image" as used in this specification refers to an image in which, for each observation point, the absolute value of the difference between the maximum and minimum amplitudes of the interferogram is expressed as contrast. Simply put, an interference image is an image representing interference fringes. The interference image has, for example, the appearance shown above the words "interference pattern image acquisition" in FIG. 2, which will be described later.
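The contrast computation described above can be sketched numerically. The following is an illustrative NumPy sketch, not part of the original disclosure; the small data cube and its values are assumptions chosen only for demonstration:

```python
import numpy as np

# Illustrative data cube of shape (y, x, t): one short interferogram per pixel.
# The values are arbitrary assumptions for demonstration.
cube = np.array([
    [[0.0, 1.0, -1.0], [0.5, 0.5, 0.5]],
    [[2.0, -2.0, 0.0], [0.1, 0.0, -0.1]],
])

# Per pixel: |max amplitude - min amplitude|, expressed as contrast.
interference_image = np.abs(cube.max(axis=-1) - cube.min(axis=-1))
```

A pixel whose interferogram barely varies (e.g. the constant 0.5 trace) yields near-zero contrast, while a strongly interfering pixel yields high contrast.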

<<Interferogram storage unit 4 in sensor system>>
The interferogram storage unit 4 is a component for storing, from the interference image sent from the interference image acquisition unit 3, the digital data of the interferogram (hereinafter referred to as "interferogram data") for the pixel corresponding to a given observation point.

<<Signal analysis section 5 in sensor system>>
The signal analysis unit 5 is a component for performing signal analysis on the interferogram data sent from the interferogram storage unit 4. As shown in FIG. 1, the signal analysis section 5 includes a spectrum analysis section 5a, an object determination section 5b, and an importance analysis section 5c. The spectrum analysis section 5a, the object determination section 5b, and the importance analysis section 5c are connected as shown in FIG.

<<Spectrum analysis section 5a in signal analysis section 5>>
The spectrum analysis unit 5a is a component for performing spectrum analysis on the interferogram data sent from the interferogram storage unit 4. Spectral analysis of the interferogram data yields the optical spectrum of the optical signal. Specifically, the optical spectrum is obtained by Fourier transforming the interferogram. The optical spectrum is represented by a complex number at each frequency.
The interferogram data and optical spectrum data are sent by the spectrum analysis section 5a to the object determination section 5b.
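The Fourier-transform step performed by the spectrum analysis unit 5a can be sketched as follows. This is an illustrative NumPy example, not part of the original disclosure; the synthetic two-component interferogram and its bin positions are assumptions:

```python
import numpy as np

n = 1024
t = np.arange(n)  # samples along the optical path difference (time direction)

# Synthetic interferogram: two monochromatic components appear as two cosines.
f1, f2 = 50, 120  # assumed spectral bin positions
interferogram = np.cos(2 * np.pi * f1 * t / n) + 0.5 * np.cos(2 * np.pi * f2 * t / n)

# Fourier transforming in the time direction yields the complex-valued
# optical spectrum; each frequency bin holds a complex number.
spectrum = np.fft.rfft(interferogram)
magnitude = np.abs(spectrum)

# The two strongest bins coincide with the injected components.
peaks = sorted(int(p) for p in np.argsort(magnitude)[-2:])
```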

<<Object determination unit 5b in signal analysis unit 5>>
The object determination unit 5b is a component for determining an object from the interference image based on the interferogram data, the optical spectrum data, or both.
The object of determination may be one-dimensional data corresponding to an individual pixel of the interference image (i.e., interferogram data), three-dimensional data corresponding to a specific region of the interference image (i.e., spectral image data), or both.
The target determination performed by the target determining unit 5b may be, for example, determining the presence of clouds in the observation direction of each pixel of the interference image. Clouds pose a nuisance to sensor systems that observe the ground.
The target determination performed by the target determination unit 5b is realized by having trained artificial intelligence. Artificial intelligence is realized by a mathematical model called a learning model. The learning model is, for example, an artificial neural network. Artificial intelligence learning is performed in a machine learning unit 15 of the data processing system, which will be described later. The broken line arrow extending from the machine learning section 15 in FIG. 1 represents that the object determination section 5b is equipped with artificial intelligence that has been trained in the machine learning section 15. When the object determination unit 5b performs determination for each pixel, artificial intelligence may be realized using a semantic segmentation algorithm.

<<Importance analysis unit 5c in the signal analysis unit 5>>
The importance analysis section 5c is a component for analyzing the importance of each observation point based on the determination result sent from the object determination section 5b.
The definition of importance analyzed by the importance analysis unit 5c may be set as appropriate depending on the purpose of use of the sensor system and data processing system.
The importance analysis result obtained by the importance analysis section 5c is sent to the data reduction section 6.

<<Data reduction unit 6 in sensor system>>
The data reduction unit 6 is a component for reducing the data of locations judged to be unnecessary, based on the target determination and importance analysis information sent from the target determination unit 5b and the importance analysis unit 5c. Specifically, the data reduction unit 6 includes at least one of an area reduction unit 6a, a band reduction unit 6b, and a bit reduction unit 6c.
The criteria by which the data reduction unit 6 judges data to be unnecessary may be determined as appropriate depending on the purpose of use of the sensor system and the data processing system.
The data reduced by the data reduction unit 6 is sent to the data creation unit 7.

<<Area reduction unit 6a in data reduction unit 6>>
When the data reduction unit 6 includes the area reduction unit 6a, the area reduction unit 6a performs data reduction on observation areas of low importance, based on the target determination and importance analysis information sent from the target determination unit 5b and the importance analysis unit 5c.
As mentioned above, clouds are a nuisance for sensor systems that observe the ground. When clouds exist between the sensor system and the observation point, scattered light generated by sunlight reflecting off the clouds becomes an interference signal for the sensor system. The observation area in which the area reduction unit 6a performs data reduction corresponds to an area where clouds exist between the sensor system and the observation area.
It is known that the light spectrum of the scattered light reflected from clouds is white. Therefore, if the optical spectrum obtained by the spectrum analysis unit 5a contains many white components, it can be determined that clouds are present between the sensor system and the observation point observed by the sensor system.
Data reduction for areas performed by the area reduction unit 6a is realized by providing trained artificial intelligence. Artificial intelligence learning is performed in a machine learning unit 15 of the data processing system, which will be described later. The broken line arrow extending from the machine learning unit 15 in FIG. 1 represents that the area reduction unit 6a is equipped with artificial intelligence that has been trained in the machine learning unit 15.
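As a non-AI illustration of the whiteness criterion above, a flat (white) spectrum can be detected with a spectral-flatness score. This is only a hedged stand-in for the trained artificial intelligence named in the text; the flatness measure and the threshold are assumptions:

```python
import numpy as np

def spectral_flatness(spectrum_mag):
    """Geometric mean / arithmetic mean of the magnitude spectrum.
    Approaches 1.0 for a perfectly flat (white) spectrum."""
    s = np.asarray(spectrum_mag, dtype=float) + 1e-12  # avoid log(0)
    return float(np.exp(np.mean(np.log(s))) / np.mean(s))

def is_cloud_candidate(spectrum_mag, threshold=0.9):
    """Flag a pixel as a cloud (area-reduction) candidate when its
    spectrum is close to white."""
    return spectral_flatness(spectrum_mag) >= threshold

white = np.ones(128)                     # flat spectrum: cloud-like
peaked = np.zeros(128)
peaked[40] = 1.0                         # single spectral line: ground-like
```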

<<Band reduction unit 6b in data reduction unit 6>>
When the data reduction unit 6 includes the band reduction unit 6b, the band reduction unit 6b is a component for deleting data in frequency bands estimated to be of low importance, based on the target determination and importance analysis information sent from the target determination unit 5b and the importance analysis unit 5c. The term "band" in the name of the band reduction unit 6b derives from the English term "frequency band".
Specifically, the band reduction unit 6b is a component intended for the case where the observation target of the sensor system is a specific gas present at the earth's surface. A gas consists of gaseous atoms, and absorption lines at wavelengths unique to those atoms appear in a continuous spectrum. Therefore, if the characteristics of the absorption lines do not appear in the optical spectrum obtained by the spectrum analysis unit 5a, it can be judged that the specific gas does not exist in the environment of the observation point, and the corresponding data can be considered a candidate for reduction.
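The absorption-line check described above can be sketched as follows. This is an illustrative example, not part of the original disclosure; the line positions, the median continuum estimate, and the dip criterion are assumptions:

```python
import numpy as np

def has_absorption_line(spectrum, line_bins, dip_ratio=0.8):
    """True if the spectrum drops below dip_ratio times the continuum
    (here estimated by the median) at any known line position."""
    spectrum = np.asarray(spectrum, dtype=float)
    continuum_level = np.median(spectrum)
    return bool(np.any(spectrum[line_bins] < dip_ratio * continuum_level))

flat = np.full(256, 100.0)               # continuum with no absorption
with_line = flat.copy()
with_line[80] = 40.0                     # dip at an assumed characteristic bin

# A spectrum with no line detected at the gas's characteristic bins would be
# a candidate for deletion by the band reduction unit 6b.
```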

<<Bit reduction unit 6c in data reduction unit 6>>
When the data reduction unit 6 includes the bit reduction unit 6c, the bit reduction unit 6c is a component for deleting, bit by bit or by reducing the number of bits, time-domain or frequency-domain interferogram data estimated to be of low importance based on the target determination and importance analysis information sent from the target determination unit 5b and the importance analysis unit 5c. In this specification, deleting data in units of bits is referred to as "bit reduction".
The bit reduction unit 6c may, for example, apply bit reduction to data in wavelength regions where the signal level is excessively high in the frequency-domain data of the interferogram, that is, in the optical spectrum data. The bit reduction unit 6c may also, for example, apply bit reduction to data in time periods that do not contribute to measurement accuracy in the time-domain data of the interferogram. In the time-domain data of an interferogram, the time periods that do not contribute to measurement accuracy are generally those at both ends, near the start and the end (see the interferogram waveform shown in FIG. 2, described later).
The details of bit reduction will become clear from the explanation given with reference to FIG. 2 below.
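The idea of dropping bits can be sketched as a requantization. This is an illustrative example, not part of the original disclosure; the 16-bit source depth and 8-bit target depth are assumptions:

```python
import numpy as np

def reduce_bits(samples, bits_from=16, bits_to=8):
    """Clear the least significant bits, keeping only the coarse shape of
    the signal and thereby reducing the information to be transferred."""
    shift = bits_from - bits_to
    return (np.asarray(samples, dtype=np.int64) >> shift) << shift

raw = np.array([1000, 30000, 65535, 7])  # assumed 16-bit interferogram samples
coarse = reduce_bits(raw)
```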

<<Data creation unit 7 in sensor system>>
The data creation unit 7 is a component for reconstructing three-dimensional data from the data sent from the data reduction unit 6. The three-dimensional data reconstructed in the data creation unit 7 may take the form of an interferogram for each pixel, or of an interference image for each sampling period. For example, when an optical spectrum, that is, frequency-domain information, is sent from the data reduction unit 6 as the reduced data, the data creation unit 7 may inverse Fourier transform the frequency-domain information to reconstruct an interferogram for each pixel.
The three-dimensional data reconstructed by the data creation section 7 is sent to the memory section 8.
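The reconstruction step mentioned above, an inverse Fourier transform back to a per-pixel interferogram, can be sketched as follows; the synthetic single-component signal is an assumption used only for illustration:

```python
import numpy as np

n = 512
t = np.arange(n)
original = np.cos(2 * np.pi * 10 * t / n)   # per-pixel interferogram (assumed)

# Forward transform: the frequency-domain form the data reduction unit may emit.
spectrum = np.fft.rfft(original)

# Inverse Fourier transform in the data creation unit 7 reconstructs the
# time-domain interferogram for the pixel.
reconstructed = np.fft.irfft(spectrum, n)
```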

 図1は、データ作成部7が情報を取得するルートとして、データ削減部6からのルートとは別に、インターフェログラム保存部4からのルートがあることを示している。データ作成部7は、インターフェログラム保存部4から送られた未加工のインターフェログラムデータを、メモリ部8へ送る処理を実施してもよい。未加工のインターフェログラムデータは、後述する機械学習部15が行う学習の教師データとして用いることができる。なお、教師データとして使用する未加工のインターフェログラムデータの送信は、必要な量に応じて、すなわちセンサシステム及びデータ処理システムの使用目的に応じて、適宜、間欠的に実施すればよい。 FIG. 1 shows that, in addition to the route from the data reduction unit 6, there is a route from the interferogram storage unit 4 as a route for the data creation unit 7 to acquire information. The data creation unit 7 may perform a process of sending the raw interferogram data sent from the interferogram storage unit 4 to the memory unit 8. The raw interferogram data can be used as training data for learning performed by the machine learning unit 15, which will be described later. Note that the raw interferogram data to be used as teacher data may be transmitted intermittently depending on the required amount, that is, depending on the purpose of use of the sensor system and the data processing system.

<<Memory unit 8 in sensor system>>
The memory unit 8 is a component for storing the three-dimensional data sent from the data creation unit 7. The three-dimensional data stored in the memory section 8 is sent to a data processing system.
The memory unit 8 also stores, as appropriate, the raw interferogram data sent from the data creation unit 7. The raw interferogram data stored in the memory unit 8 is also sent to the data processing system.

<<Interferogram holding unit 9 in data processing system>>
The interferogram holding unit 9 is a component for temporarily holding interferogram data. The interferogram data held by the interferogram holding unit 9 can be classified into two types, focusing on their roles.
One type is used in the learning phase of artificial intelligence, and is interferogram data as training data of a learning data set. The interferogram data as the teacher data may be, for example, actual observation data actually observed by a sensor system in the past. If sufficient past actual observation data by the sensor system cannot be obtained, teacher data may be created using actual observation data observed by another system or by performing data expansion. A part of the learning data set may be used as a verification data set for verification performed in the final stage of learning.
The other type is interferogram data, which is used in the inference phase of artificial intelligence and is observation data of the sensor system transmitted from the sensor system.
The interferogram data used in the learning phase is sent to the spectrum restoration section 10 and feature amount extraction section 13 of the pre-learning processing section. Interferogram data used in the inference phase is sent to the main analysis section 16.

<<Spectral restoration unit 10 in data processing system>>
The spectrum restoration unit 10 is a component for performing spectrum restoration on the interferogram data sent from the interferogram holding unit 9. Spectrum restoration refers to the problem, or process, of estimating and restoring the signal spectrum that the signal originally has. The waveform of a signal spectrum obtained by spectrum restoration has, for example, the appearance shown to the right of the words "spectrum restoration" in FIG. 3, described later. As techniques applying spectrum restoration, noise suppression methods such as the MMSE-STSA method (Minimum Mean-Square-Error Short-Time Spectral Amplitude estimator) are known.
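As a much simpler stand-in for the MMSE-STSA method named above, the following shows plain spectral subtraction, presented only to illustrate the idea of restoring a signal spectrum from a noisy one; the flat noise floor and its level are assumptions:

```python
import numpy as np

def spectral_subtraction(noisy_mag, noise_level, floor=0.0):
    """Subtract an estimated noise level from a magnitude spectrum and
    clamp at a floor so magnitudes stay non-negative."""
    restored = np.asarray(noisy_mag, dtype=float) - noise_level
    return np.maximum(restored, floor)

clean = np.zeros(64)
clean[20] = 5.0                  # one spectral line (assumed signal)
noisy = clean + 0.5              # flat additive noise floor of 0.5
restored = spectral_subtraction(noisy, 0.5)
```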

<<Imaging processing unit 11 in the data processing system>>
The imaging processing unit 11 is a component for generating image data based on the signal spectrum sent from the spectrum restoration unit 10. The image data generated by the imaging processing unit 11 will be referred to as "spectral image data" in this specification. The spectral image data can be expressed, for example, in the appearance shown above the characters "imaging processing" in FIG. 3, which will be described later. Spectral image data may be thought of as a data cube of three or more dimensions.
When generating image data, the imaging processing unit 11 may correct the data by processing such as interpolation for locations where data has been reduced (see FIG. 3, which will be described later).
The spectral image data generated by the imaging processing section 11 is sent to the label creation section 12.
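The interpolation-based correction mentioned above can be sketched as follows. This is an illustrative example, not part of the original disclosure; linear interpolation via `np.interp` and the sample values are assumptions:

```python
import numpy as np

def interpolate_reduced(values, valid_mask):
    """Fill entries removed by data reduction (valid_mask == False) by
    linear interpolation from the neighbouring valid samples."""
    values = np.asarray(values, dtype=float)
    x = np.arange(values.size)
    return np.interp(x, x[valid_mask], values[valid_mask])

row = np.array([1.0, 0.0, 3.0, 0.0, 5.0])          # 0.0 marks reduced data
mask = np.array([True, False, True, False, True])  # which samples survived
filled = interpolate_reduced(row, mask)
```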

<<Label creation unit 12 in data processing system>>
The label creation unit 12 is a component for supporting the user of the sensor system and the data processing system to create correct labels constituting the learning data set. The label creation unit 12 uses, for example, artificial intelligence that is in the process of learning to display correct label candidates on the display screen, overlapping them with the spectral image data. The user may create a correct label by modifying the correct label candidates displayed by the label creation unit 12 as necessary.

<<Feature extraction unit 13, teacher data creation unit 14, and machine learning unit 15 in the data processing system>>
The feature extraction unit 13, the teacher data creation unit 14, and the machine learning unit 15 may be considered to represent the learning process of artificial intelligence as a whole.

The feature amount extraction unit 13 represents a learning process in which the artificial intelligence in the learning phase extracts feature amounts from interferogram data sent and input from the interferogram holding unit 9 according to the purpose of learning. ing. A feature is a term used in the field of learning, and is a variable in data to be analyzed that serves as a clue for prediction. Further, feature extraction is converting data to be analyzed into a feature amount.
The feature quantities extracted for each input interferogram data are multidimensional quantities (vectors), and are shown on the feature quantity map shown below the text "Feature quantity extraction" in Figure 3, which will be described later. can be represented as a plot of A feature map is sometimes referred to as a feature space. Note that as learning progresses, the definition of the feature may change, and the dimensions of the feature may increase or decrease.
The feature amount may artificially include a feature of the envelope for the attenuation of the interferogram, a time width of one vibration in the interferogram, and other amounts having a correlation with the optical spectrum.
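As an illustration of the hand-crafted features just mentioned, the time width of one oscillation and the decay of the envelope could be estimated from a 1-D interferogram roughly as follows. The function name, the zero-crossing and peak-picking heuristics, and the test signal are all assumptions made for this sketch, not part of the disclosed system:

```python
import numpy as np

def extract_features(interferogram, dt):
    """Two illustrative hand-crafted features of a 1-D interferogram:
    the mean time width of one oscillation (from zero crossings) and
    the envelope decay rate (log-linear fit to the local maxima of |x|).
    These heuristics are the author's assumptions, not the patent's method."""
    x = interferogram - interferogram.mean()
    # Zero crossings -> average half-period -> width of one oscillation.
    crossings = np.where(np.diff(np.signbit(x)))[0]
    half_periods = np.diff(crossings) * dt
    oscillation_width = 2.0 * half_periods.mean()
    # Local maxima of |x| trace the envelope; fit log-amplitude vs. time.
    mag = np.abs(x)
    peaks = np.where((mag[1:-1] > mag[:-2]) & (mag[1:-1] > mag[2:]))[0] + 1
    t_peaks = peaks * dt
    decay_rate = np.polyfit(t_peaks, np.log(mag[peaks] + 1e-12), 1)[0]
    return np.array([oscillation_width, decay_rate])

# Example: a decaying cosine sampled at 1 kHz (25 Hz carrier, decay 3 /s).
dt = 1e-3
t = np.arange(0.0, 1.0, dt)
sig = np.exp(-3.0 * t) * np.cos(2 * np.pi * 25 * t)
features = extract_features(sig, dt)
```

For this test signal the first feature comes out near the 25 Hz period (0.04 s) and the second near the decay rate (-3), showing how such features correlate with the underlying spectrum.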

The teacher data creation unit 14 represents the preparation process of gathering enough learning data sets to train the artificial intelligence. Once a sufficient number of learning data sets has been collected, a plot distribution appears on the feature map for each label.

The machine learning unit 15 represents the core process of the learning of the artificial intelligence. For example, when the artificial intelligence is trained to solve a classification problem, it constructs a classification surface that separates the classes, as shown in the feature map at the bottom right of FIG. 3. In constructing the classification surface, the artificial intelligence may apply, for example, a support vector machine algorithm.

<<Main analysis unit 16 in the data processing system>>
The main analysis unit 16 is a component for analyzing the interferogram data that the data processing system acquires via the interferogram holding unit 9. The user of the data processing system can make use of the results analyzed by the main analysis unit 16.

<<Operation of the sensor system according to Embodiment 1>>
FIG. 2 is a diagram illustrating the operation or processing of the sensor system according to Embodiment 1.

The upper left part of FIG. 2 shows the sensor system, mounted on a flying body, observing an observation target. As described above, the flying body is, for example, a drone, an aircraft, or an artificial satellite.
The optical measurement sensor unit 1 is, as described above, a Fourier-spectroscopy optical interferometer, and each signal detected here is associated with a detection time (t) and the two-dimensional coordinates (x, y) of the observation point. The two-dimensional coordinates (x, y) of the observation point are determined comprehensively, taking into consideration the position of the flying body, the light irradiation direction, the time of flight (TOF), and processing such as beam forming. The two-dimensional coordinates (x, y) of the observation point may be expressed, for example, in a geographic coordinate system.
When emphasizing that the variables x, y, and t are discrete, they are written with subscripts, as in x_n, y_n, and t_n.
The time t_n represents the sampling time at which the n-th sampling was performed. Sampling generally starts at time t_0 = 0; that is, the sampling time is measured from the time at which sampling started. The subscript for the sampling time (t) starts at 0.
The two-dimensional coordinates (x_n, y_n) are the coordinates of the n-th observation point. The subscript used for the two-dimensional coordinates of observation points starts at 1. Here, the subscript n is simply a variable, in the sense of a counter variable in a for statement; n does not denote the total number of anything. The n used for the two-dimensional coordinates (x_n, y_n) and the n used for the time t_n need not have the same value. Although the same n is used in this specification, a program implementing the disclosed technology may use different variables for the two-dimensional coordinates and the time.

The upper center part of FIG. 2 shows the interference image acquisition unit 3 acquiring an interference image. As a first step, the interference image acquisition unit 3 acquires a digital signal from the AD conversion unit 2. The digital signal is denoted by F(), as shown in FIG. 2. The digital signal (F(x, y, t)) can also be regarded as the array data of interferograms.

The upper right part of FIG. 2 shows the interferogram storage unit 4 acquiring, from the interference image sent by the interference image acquisition unit 3, the interferogram data for the pixel corresponding to a certain observation point. For example, the graph labeled "F(x_1, y_1, t)" in the upper right part of FIG. 2 is a graph of the interferogram data for the pixel corresponding to the first observation point (with two-dimensional coordinates (x_1, y_1)). Similarly, the graph labeled "F(x_2, y_2, t)" is a graph of the interferogram data for the pixel corresponding to the second observation point (with two-dimensional coordinates (x_2, y_2)).
In this way, a given observation point on the ground corresponds to a pixel at a specific position in the interference image.
The interferogram data (F(x_n, y_n, t)) is assumed to have a data length, in observation time, sufficient to obtain the wavenumber resolution (also called the spectral resolution) required by the specifications. The sensor system according to the disclosed technology acquires interferogram data (F(x_n, y_n, t)) for each pixel corresponding to every observation point.

The lower left part of FIG. 2 shows the signal analysis unit 5 performing signal analysis, namely object determination and importance analysis, on the interferogram data (F(x_n, y_n, t)) sent from the interferogram storage unit 4.

The spectrum analysis unit 5a of the signal analysis unit 5 applies a Fourier transform in the time direction to the interferogram data (F(x_n, y_n, t)) sent from the interferogram storage unit 4. The optical spectrum data obtained by Fourier transforming the interferogram data (F(x_n, y_n, t)) in the time direction is an array, denoted B(x_n, y_n). The optical spectrum data (B(x_n, y_n)) obtained by the spectrum analysis unit 5a is sent to the object determination unit 5b.
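The time-direction Fourier transform performed by the spectrum analysis unit 5a can be sketched as follows. The array shapes, the use of a real FFT, and the test signals are assumptions for this sketch; the source only states that F(x_n, y_n, t) is Fourier transformed in the time direction to obtain B(x_n, y_n):

```python
import numpy as np

def spectrum_from_interferogram(F, dt):
    """F: array of shape (num_points, num_samples), one interferogram per
    observation point; dt: sampling interval. Returns the magnitude
    spectrum of each point and the corresponding frequency axis."""
    B = np.abs(np.fft.rfft(F, axis=-1))          # time axis -> spectral axis
    freqs = np.fft.rfftfreq(F.shape[-1], d=dt)   # frequency of each FFT bin
    return B, freqs

# Example: two observation points, each a pure cosine "interferogram".
dt = 1e-3
t = np.arange(0.0, 1.0, dt)
F = np.stack([np.cos(2 * np.pi * 50 * t), np.cos(2 * np.pi * 120 * t)])
B, freqs = spectrum_from_interferogram(F, dt)
# The spectral peak of each point sits at its oscillation frequency.
```

In a real Fourier-transform spectrometer the frequency axis would be rescaled into optical wavenumbers using the interferometer geometry; that scaling is omitted here.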

The object determination unit 5b of the signal analysis unit 5 uses the interferogram data (F(x_n, y_n, t)), the optical spectrum data (B(x_n, y_n)), or both to determine, for every observation point (the observation points for all n), what the target was like. In particular, the object determination unit 5b determines whether a cloud was present between the sensor system and the observation point. To check for the presence of a cloud, the object determination unit 5b may, for example, prepare a reference interferogram for the case where a cloud is present, compute the correlation coefficient between the interferogram data (F(x_n, y_n, t)) of the observation target and the reference interferogram, and compare it with a preset threshold.
The object determination performed by the object determination unit 5b is realized, as described above, by trained artificial intelligence.
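The correlation-and-threshold test described above might look like the following. The reference waveform, the noise model, and the threshold value of 0.8 are illustrative assumptions, not values from the source:

```python
import numpy as np

def cloud_present(observed, reference, threshold=0.8):
    """Return True if the observed interferogram correlates with the
    cloudy-case reference interferogram above a preset threshold.
    The threshold default is an assumption for this sketch."""
    r = np.corrcoef(observed, reference)[0, 1]
    return bool(r > threshold)

t = np.linspace(0.0, 1.0, 1000)
# Hypothetical cloudy-case reference: a decaying 40 Hz oscillation.
reference = np.exp(-5.0 * t) * np.cos(2 * np.pi * 40 * t)
# Observation resembling the cloudy case (reference plus small noise).
cloudy = reference + 0.05 * np.random.default_rng(0).normal(size=t.size)
# Observation unrelated to the cloudy case.
clear = np.cos(2 * np.pi * 90 * t)
```

The noisy copy of the reference correlates well above the threshold, while the unrelated signal does not, so only the former is flagged as cloudy.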

The importance analysis unit 5c of the signal analysis unit 5 analyzes the importance of each observation point on the basis of the interferogram data (F(x_n, y_n, t)), the optical spectrum data (B(x_n, y_n)), and the determination result of the object determination unit 5b.
In analyzing the importance of each observation point, the importance analysis unit 5c refers to the pixel region corresponding to the area of the observation point, the frequency characteristics inherent to the observation target (for example, a gas) according to its properties (absorption lines appearing in a continuous spectrum), and the data volume of the received signal.

The lower center part of FIG. 2 shows that the data reduction unit 6 includes at least one of an area reduction unit 6a, a band reduction unit 6b, and a bit reduction unit 6c.

The bit reduction performed by the bit reduction unit 6c can take two forms: reducing the upper bits and reducing the lower bits.
As described above, in the time-domain data of an interferogram, the time periods that do not contribute to measurement accuracy are generally those at both ends, near the start and the end. In these periods the interferogram signal is small, so the upper bits of the signal data contain only zeros. The bit reduction unit 6c can therefore remove the upper bits of the signal data in the time periods at both ends of the interferogram.
Conversely, in the middle period between the start and the end of the time-domain interferogram data, the signal amplitude is large and the SN ratio is high. In this period, the information in the lower bits of the interferogram signal data is not important. The bit reduction unit 6c can therefore also remove the lower bits of the signal data in the middle period of the interferogram.
The bit reduction unit 6c may perform upper-bit reduction, lower-bit reduction, or a combination of the two.
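A minimal sketch of the two bit-reduction forms, applied to 16-bit interferogram samples, is shown below. The fixed split of the record into edge and middle periods, the bit counts, and the synthetic signal are all assumptions for illustration:

```python
import numpy as np

def reduce_bits(samples, edge_fraction=0.25, drop_low=4):
    """samples: int16 interferogram. In the edge periods the signal is small
    and the upper bits are all zero, so those samples are narrowed to int8
    (upper-bit reduction); in the middle period the lowest `drop_low` bits
    are discarded (lower-bit reduction). The split point is an assumption."""
    n = samples.size
    k = int(n * edge_fraction)
    edges = np.concatenate([samples[:k], samples[n - k:]])
    middle = samples[k:n - k]
    edges_int8 = edges.astype(np.int8)   # safe only while |edge sample| < 128
    middle_shifted = middle >> drop_low  # discard low-order bits
    return edges_int8, middle_shifted

# Synthetic interferogram: small amplitude at the edges, large in the middle.
t = np.arange(1000)
env = np.where((t < 250) | (t >= 750), 50, 20000)
samples = (env * np.cos(2 * np.pi * t / 50)).astype(np.int16)
edges, middle = reduce_bits(samples)
```

Shifting the middle samples back up by `drop_low` bits recovers the original values to within 2**drop_low, which is the precision loss traded for the smaller data volume.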

In this way, the data reduction unit 6 performs data reduction on the interferogram data (F(x_n, y_n, t)) and generates reduced interferogram data, denoted F'(x_n, y_n, t). Because F'(x_n, y_n, t) is similar to the original signal F(x_n, y_n, t), the prime symbol ("'", usually called a dash in British English), which is used in mathematical notation to denote similarity, is employed.

The lower right part of FIG. 2 shows what becomes of the interference image at time t_n under the influence of data reduction. The interference image affected by data reduction is denoted F'(x, y, t_n), as shown in FIG. 2. The interference image affected by data reduction (F'(x, y, t_n)) can be constructed from the reduced interferogram data (F'(x_n, y_n, t)).

FIG. 3 is a diagram illustrating the operation or processing of the data creation unit 7 and the memory unit 8 of the sensor system according to Embodiment 1, and of the data processing system.

The upper part of FIG. 3 shows the three-dimensional data reconstructed by the data creation unit 7 being stored in the memory unit 8.
The middle part of FIG. 3 shows the three-dimensional data stored in the memory unit 8 being transferred to the data processing system.
The lower part of FIG. 3, as a whole, describes the operation or processing of the data processing system.

In the lower part of FIG. 3, the signal waveform F''(x_n, y_n, t) is shown as an illustration of the operation or processing of the interferogram holding unit 9. To account for the influence of data transfer, F''(x_n, y_n, t) is written with one more prime than the F'(x_n, y_n, t) stored in the memory unit 8 on the sensor system side.

In the lower part of FIG. 3, two spectral waveforms, B''(x_1, y_1, λ) and B''(x_2, y_2, λ), are shown as illustrations of the operation or processing of the spectrum restoration unit 10. B''(x_1, y_1, λ) is the spectral data restored from F''(x_1, y_1, t). For n = 1, 2, ..., B''(x_n, y_n, λ) is obtained, specifically, by Fourier transforming F''(x_n, y_n, t) in the time direction. The argument λ of B''(x_n, y_n, λ) is the wavelength; that is, including λ in the argument indicates that the spectral data is a quantity expressed in the wavelength domain (or the frequency domain; the frequency is the speed of light divided by the wavelength). This contrasts with the interferogram data (F(x_n, y_n, t)), which includes t in its argument and is a quantity expressed in the time domain.
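The restoration step, transforming the received interferogram in the time direction and re-expressing the spectral axis as wavelength via frequency = c / wavelength, could be sketched as below. The linear scale factor mapping electrical frequency to optical frequency is a placeholder assumption; in a real Fourier-transform spectrometer that mapping depends on the interferometer's scan parameters:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def restore_spectrum(interferogram, dt, freq_scale):
    """Fourier transform the interferogram in the time direction and
    return (wavelengths, magnitudes). `freq_scale` converts the FFT's
    electrical frequency axis into optical frequency (assumed linear)."""
    mag = np.abs(np.fft.rfft(interferogram))
    f_elec = np.fft.rfftfreq(interferogram.size, d=dt)
    f_opt = freq_scale * f_elec
    with np.errstate(divide="ignore"):
        wavelengths = C / f_opt   # lambda = c / nu; DC bin maps to infinity
    return wavelengths, mag

# Example: 100 Hz electrical oscillation, hypothetical scale 1e12.
dt = 1e-3
t = np.arange(0.0, 1.0, dt)
ifg = np.cos(2 * np.pi * 100 * t)
wl, mag = restore_spectrum(ifg, dt, freq_scale=1e12)
```

With these assumed numbers the spectral peak lands at an optical frequency of 1e14 Hz, i.e. a wavelength of about 3 micrometres.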

In the lower part of FIG. 3, spectral image data for a group of B''(x, y, λ) is shown as an illustration of the operation or processing of the imaging processing unit 11. The spectral image data created by the imaging processing unit 11 can be described as a three-dimensional data cube consisting of a two-dimensional coordinate plane and the spectrum at each coordinate.
The lower part of FIG. 3 also shows that, when generating the image data, the imaging processing unit 11 corrects the portions where data reduction was applied, by processing such as interpolation.
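The interpolation-based correction mentioned above can be sketched as follows; representing data-reduced entries as NaN gaps and using linear interpolation are assumptions for this illustration:

```python
import numpy as np

def fill_reduced(values):
    """values: 1-D array in which data-reduced entries are NaN.
    Returns a copy with the gaps filled by linear interpolation
    over the surviving samples."""
    out = values.copy()
    gaps = np.isnan(out)
    idx = np.arange(out.size)
    out[gaps] = np.interp(idx[gaps], idx[~gaps], out[~gaps])
    return out

# Example row of a data cube with three reduced (missing) samples.
row = np.array([1.0, np.nan, 3.0, np.nan, np.nan, 9.0])
filled = fill_reduced(row)
# -> [1., 2., 3., 5., 7., 9.]
```

The same idea extends to two dimensions (interpolating over neighbouring pixels) when whole areas, rather than single samples, were reduced.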

In the lower part of FIG. 3, the white arrow above the text "label creation" represents the operation or processing of the label creation unit 12.

In the lower part of FIG. 3, the white arrow to the left of the text "feature extraction" represents the operation or processing of the feature extraction unit 13.

In the lower part of FIG. 3, the feature map above the text "teacher data creation" represents the operation or processing of the teacher data creation unit 14. In the feature map shown in FIG. 3, the horizontal axis is the first feature (feature 1) and the vertical axis is the second feature (feature 2). Samples with the first label (label 1) and samples with the second label (label 2) are plotted on the feature map.

In the lower part of FIG. 3, the white arrow above the text "machine learning model creation" represents the operation or processing of the teacher data creation unit 14.

In the lower part of FIG. 3, the feature map denoted by reference numeral 15 represents the operation or processing of the machine learning unit 15. In that feature map, the samples are divided by a classification surface into "object 1," with the first label, and "object 2," with the second label. The feature map shown in FIG. 3 illustrates an example in which "object 1" and "object 2" are classified into classes according to their difference in importance. In FIG. 3, the importance of "object 1" is α_1, and the importance of "object 2" is α_2.

The sensor system according to the disclosed technology has industrial applicability because it can, for example, be mounted on a drone and applied to sensing the vegetation state of a field.

1 optical measurement sensor unit, 2 AD conversion unit, 3 interference image acquisition unit, 4 interferogram storage unit, 5 signal analysis unit, 5a spectrum analysis unit, 5b object determination unit, 5c importance analysis unit, 6 data reduction unit, 6a area reduction unit, 6b band reduction unit, 6c bit reduction unit, 7 data creation unit, 8 memory unit, 9 interferogram holding unit, 10 spectrum restoration unit, 11 imaging processing unit, 12 label creation unit, 13 feature extraction unit, 14 teacher data creation unit, 15 machine learning unit, 16 main analysis unit.

Claims (7)

A sensor system comprising:
an optical measurement sensor unit that is mounted on a flying body and acquires an interferogram of an observation target;
an object determination unit that determines the observation target on the basis of the interferogram; and
a data reduction unit that performs data reduction on the interferogram on the basis of a determination result of the object determination unit.
The sensor system according to claim 1, wherein the object determination unit
determines the observation target by means of a learning model on which machine learning has been performed in advance, and
calculates a correlation coefficient between the interferogram and a reference interferogram for the case where a cloud is present, to determine the presence or absence of a cloud.
The sensor system according to claim 2, wherein the learning model in the object determination unit uses a semantic segmentation algorithm.
The sensor system according to claim 1, further comprising an importance analysis unit that analyzes the importance of the observation target on the basis of the determination result of the object determination unit,
wherein the data reduction unit performs data reduction on the interferogram with reference to an analysis result of the importance analysis unit.
The sensor system according to claim 4, wherein the data reduction unit includes any one of an area reduction unit, a band reduction unit, and a bit reduction unit,
the area reduction unit performs data reduction in units of areas on the basis of the determination result and the analysis result,
the band reduction unit performs data reduction in units of frequency bands on the basis of the determination result and the analysis result, and
the bit reduction unit performs data reduction in units of bits on the basis of the determination result and the analysis result.
The sensor system according to claim 1, wherein the optical measurement sensor unit is a Fourier-spectroscopy optical interferometer.
A data processing system for the sensor system according to claim 2, comprising a pre-learning processing unit that performs the machine learning on the learning model used by the object determination unit.
PCT/JP2022/022242 2022-06-01 2022-06-01 Sensor system to be mounted on flying body, and data-processing system for sensor system Ceased WO2023233563A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/022242 WO2023233563A1 (en) 2022-06-01 2022-06-01 Sensor system to be mounted on flying body, and data-processing system for sensor system

Publications (1)

Publication Number Publication Date
WO2023233563A1 2023-12-07

Family

ID=89025981

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/022242 Ceased WO2023233563A1 (en) 2022-06-01 2022-06-01 Sensor system to be mounted on flying body, and data-processing system for sensor system

Country Status (1)

Country Link
WO (1) WO2023233563A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014510913A (en) * 2011-03-10 2014-05-01 アストリアム リミテッド SAR data processing
JP2019519784A (en) * 2016-06-21 2019-07-11 タレス・アレーニア・スペース・イタリア・エッセ・ピ・ア・コン・ウニコ・ソシオ SAR imaging method for interference analysis


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Thesis", 1 June 2021, KAGAWA UNIVERSITY, JP, article NOGO, KOSUKE: "Research on microplastic discrimination techniques by passive infrared spectroscopic imaging", pages: 1 - 106, XP009551520 *
IMASU, RYOICHI: "Greenhouse gas measurement from satellite using Fourier transform infrared spectrometer", KOGAKU - JAPANESE JOURNAL OF OPTICS, OYO BUTSURI GAKKAI. KOGAKU KONWAKAI, TOKYO, JP, vol. 39, no. 12, 1 January 2010 (2010-01-01), JP , pages 589 - 594, XP009551388, ISSN: 0389-6625 *


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22944850

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22944850

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP