WO2017105177A1 - System for processing images for multispectral and hyperspectral analysis in precision agriculture - Google Patents
System for processing images for multispectral and hyperspectral analysis in precision agriculture
- Publication number
- WO2017105177A1 WO2017105177A1 PCT/MX2015/000182 MX2015000182W WO2017105177A1 WO 2017105177 A1 WO2017105177 A1 WO 2017105177A1 MX 2015000182 W MX2015000182 W MX 2015000182W WO 2017105177 A1 WO2017105177 A1 WO 2017105177A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- multispectral
- hyperspectral
- images
- georeferenced
- map
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J3/00—Spectrometry; Spectrophotometry; Monochromators; Measuring colours
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A40/00—Adaptation technologies in agriculture, forestry, livestock or agroalimentary production
- Y02A40/10—Adaptation technologies in agriculture, forestry, livestock or agroalimentary production in agriculture
Definitions
- The present invention relates to a method of image processing for generating image maps of agricultural fields, and more particularly to a method for determining the health of plants by aerial photographic analysis.
- US7058197 uses visible light reflectance to generate NDVI images.
- This patent is based on the light reflected by the sun and therefore teaches that the optimal time for image acquisition using the procedure described is the two-hour "solar noon" period and on clear days. This makes it very impractical for a commercial application.
- This patent discloses aerial images collected four times during the growing season. The image dates are correlated with the cultivation stages of bare soil, V12, VT, and R4. The aerial images were taken from an aircraft with digital cameras having an array size of approximately 1,500 pixels across and 1,000 pixels in the along-track dimension.
- The digital systems were 8-bit systems, and the images were collected and stored on an on-board computer in tagged image file (TIF) format.
- TIF tagged image file format
- Four bands were collected, representing the blue, green, red, and near-infrared parts of the electromagnetic spectrum.
- The cameras were aligned in a two-by-two matrix and rigidly mounted with the lenses focused at infinity.
- The images were taken from an aircraft at about 5,000 feet above ground level to produce a spatial resolution of approximately one meter by one meter.
- The digital cameras have square pixels and are not interlaced during image acquisition.
- the optimal time for image acquisition was two hours before or two hours after solar noon.
- the images were not acquired in times of bad weather conditions (fog, rain, clouds). There are no appreciable cloud shadows in the images.
- The methodology of US7058197 is only able to indicate that there is a problem after a plant has actually changed its structure, as indicated by its color. In many cases this is too late to take corrective action.
- US Patent 6597991 uses thermal images to detect the water content of leaves for irrigation purposes. This patent depends on obtaining true temperatures and uses ground references for calibration. Arguably, a significant disadvantage of US 6597991 is its dependence on extremely precise temperature measurements to determine the need for irrigation; this requirement adds a calibration step and its associated costs.
- Figure 1 represents a general diagram of the image processing stages.
- Figure 2 illustrates a table of image processing algorithms.
- Figure 3 shows a flow chart of the multispectral image processing process.
- Figure 4 shows a display screen of the image processing system.
- Figure 5 illustrates a diagram of the organization of elements of the web user interface and mobile application.
- The first stage of the process is mission planning, which begins with the delimitation of the region of interest (RDI) by drawing a polygon on a map in the system platform.
- RDI region of interest
- These maps are taken from Google Maps, so they are free to use and already georeferenced, saving time and increasing the practicality of the procedure.
- the result is a KML file with a series of geodetic coordinates that describe the polygon.
- The next step in the process is the multispectral or hyperspectral aerial sweep or scan, which consists of determining the routes that the aircraft will follow so that the RDI is photographed completely.
- The routes determined by the system generally take the form of a sweep over the region. Each scan line is called a flight line.
- The process begins by converting the geodetic coordinates that specify the polygon into Cartesian coordinates in a local navigation reference frame (NED). This conversion is necessary in order to use planning algorithms in Euclidean spaces. From that point on it is assumed that the surface to be surveyed has no curvature. This assumption is reasonable given how small the plot is compared with the Earth's surface.
- The system's inputs are the RAW images captured and the flight log generated during execution of the mission.
- The image processing is carried out for the geographical analysis of the data generated in a mission, producing the respective georeferenced images and a mosaic that integrates these images into a single map of the plot covered (see Fig. 1).
- Preprocessing: processing of the RAW images to generate rectified .PNG images, which are used in the following blocks.
- Georeferencing: location of the pixels of each image in the GPS coordinate system.
- Mosaic generation: integration of the images generated during the flight into a single map of the region covered.
- the next part of image processing uses the generated mosaic to calculate and analyze various vegetation indices for the deployment of useful applications in the field.
- Index extraction consists of computing, pixel by pixel over the mosaic, various vegetation indices that have been developed to detect certain characteristics related to crop properties. As a result, for each index a discretized grayscale image with values from 0 to 255 is obtained.
- Figure 2 shows a table of indices for precision agriculture used in the image processing algorithms.
- the process is described in the diagram of figure 3.
- At the start of processing, the unified image mosaic, previously rectified and georeferenced, is loaded; then the map type is selected, that is, the visual interpretation given to the mosaic according to a captured spectrum.
- Next, the indices to be analyzed are identified, such as visible light, green, red, or near infrared, among others; finally, the type of crop under analysis and the specific application to be analyzed, such as the amount of moisture or the amount of nitrogen in the plant, are selected to generate an output and conclude the processing.
- A screen of the image processing system is presented in Figure 4.
- The output of the image analysis is a compressed file, which contains: the RAW images from the flight, the mosaic of the plot covered, the vegetation indices, and thematic maps which can be used to present applications.
- The output of this stage of the analysis process is the input to the system that displays results for the user, where it can be used as:
- Layers of information for visualization of thematic maps by crop. The output obtained from the image processing system is stored on a server in the cloud and displayed on a web interface, as illustrated in the block diagram of Figure 5.
Landscapes
- Physics & Mathematics (AREA)
- Spectroscopy & Molecular Physics (AREA)
- General Physics & Mathematics (AREA)
- Image Processing (AREA)
Abstract
Description
IMAGE PROCESSING SYSTEM FOR MULTISPECTRAL AND HYPERSPECTRAL ANALYSIS IN PRECISION AGRICULTURE
TECHNICAL FIELD OF THE INVENTION
The present invention relates to a method of image processing for generating image maps of agricultural fields, and more particularly to a method for determining the health of plants by aerial photographic analysis.
BACKGROUND
Several technologies have been used in the past to measure the temperature of plant leaves. For example, US7058197 uses visible-light reflectance to generate NDVI images. This patent relies on sunlight reflected from the crop and therefore teaches that the optimal time for image acquisition using the described procedure is the two-hour "solar noon" period, on clear days. This makes it very impractical for a commercial application. In particular, this patent discloses aerial images collected four times during the growing season. The image dates are correlated with the cultivation stages of bare soil, V12, VT, and R4. The aerial images were taken from an aircraft with digital cameras having an array size of approximately 1,500 pixels across and 1,000 pixels in the along-track dimension. The digital systems were 8-bit systems, and the images were collected and stored on an on-board computer in tagged image file (TIF) format. Four bands were collected, representing the blue, green, red, and near-infrared parts of the electromagnetic spectrum. The cameras were aligned in a two-by-two matrix and rigidly mounted with the lenses focused at infinity. The images were taken from an aircraft at about 5,000 feet above ground level to produce a spatial resolution of approximately one meter by one meter. The digital cameras have square pixels and are not interlaced during image acquisition. The optimal time for image acquisition was two hours before or two hours after solar noon. The images were not acquired under poor weather conditions (fog, rain, clouds), and there are no appreciable cloud shadows in the images. In addition, the methodology disclosed by US7058197 appears to be able to indicate that there is a problem only after a plant has actually changed its structure, as indicated by its color. In many cases this is too late to take corrective action.
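For context, two standard relations underlie the background just described; they are given here only for reference, and the lens and pixel-pitch values in the numeric example are illustrative assumptions, not figures disclosed by US7058197. The normalized difference vegetation index is computed per pixel from near-infrared and red reflectance, and the ground sample distance (GSD) of a nadir frame camera follows from flight height H, pixel pitch p, and focal length f:

$$\mathrm{NDVI}=\frac{\rho_{\mathrm{NIR}}-\rho_{\mathrm{Red}}}{\rho_{\mathrm{NIR}}+\rho_{\mathrm{Red}}}\in[-1,\,1],\qquad \mathrm{GSD}=\frac{H\,p}{f}.$$

For instance, a flight height of H ≈ 1524 m (5,000 ft) with an assumed pixel pitch p = 10 µm and focal length f = 15 mm gives GSD ≈ 1.0 m, consistent with the roughly one-meter resolution reported above.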
As another example, US Patent 6597991 uses thermal images to detect the water content of leaves for irrigation purposes. This patent depends on obtaining true temperatures and uses ground references for calibration. Arguably, a significant disadvantage of US 6597991 is its dependence on extremely precise temperature measurements to determine the need for irrigation; this requirement adds a calibration step and its associated costs.
BRIEF DESCRIPTION OF THE FIGURES
Figure 1 represents a general diagram of the image processing stages.
Figure 2 illustrates a table of image processing algorithms.
Figure 3 shows a flow chart of the multispectral image processing.
Figure 4 shows a display screen of the image processing system.
Figure 5 illustrates a diagram of the organization of elements of the web user interface and mobile application.
DETAILED DESCRIPTION OF THE INVENTION
The first stage of the process is mission planning, which begins with the delimitation of the region of interest (RDI) by drawing a polygon on a map in the system platform. These maps are taken from Google Maps, so they are free to use and already georeferenced, saving time and increasing the practicality of the procedure. The result is a KML file with a series of geodetic coordinates that describe the polygon.
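As an illustration of this step, the sketch below writes an RDI polygon to a KML file using only the Python standard library; the coordinate values and helper names are hypothetical and do not reflect the platform's actual implementation.

```python
# Minimal sketch (assumed workflow): serialize an RDI polygon drawn on the map
# as a KML file with geodetic (lon, lat) coordinates. Values are hypothetical.

RDI_VERTICES = [  # (longitude, latitude) in decimal degrees, WGS-84
    (-99.1350, 19.4320),
    (-99.1280, 19.4320),
    (-99.1280, 19.4385),
    (-99.1350, 19.4385),
]

def polygon_to_kml(vertices, name="RDI"):
    # KML expects "lon,lat[,alt]" tuples and a closed ring (first vertex repeated).
    ring = list(vertices) + [vertices[0]]
    coords = " ".join(f"{lon},{lat},0" for lon, lat in ring)
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>{name}</name>
    <Polygon>
      <outerBoundaryIs><LinearRing>
        <coordinates>{coords}</coordinates>
      </LinearRing></outerBoundaryIs>
    </Polygon>
  </Placemark>
</kml>"""

with open("rdi.kml", "w", encoding="utf-8") as f:
    f.write(polygon_to_kml(RDI_VERTICES))
```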
The next step in the process is the multispectral or hyperspectral aerial sweep or scan, which consists of determining the routes that the aircraft will follow so that the RDI is photographed completely.
The routes determined by the system generally take the form of a sweep over the region; each scan line is called a flight line. The process begins by converting the geodetic coordinates that specify the polygon into Cartesian coordinates in a local navigation reference frame (NED). This conversion is necessary in order to use planning algorithms in Euclidean spaces. From that point on it is assumed that the surface to be surveyed has no curvature; this assumption is reasonable given how small the plot is compared with the Earth's surface.
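A minimal sketch of this conversion and of a simple sweep planner is shown below. It uses a spherical flat-earth approximation, consistent with the no-curvature assumption above; the swath width and function names are illustrative assumptions, not the system's actual planning algorithm.

```python
import math

R_EARTH = 6_371_000.0  # mean Earth radius in metres (spherical approximation)

def geodetic_to_ned(lat_deg, lon_deg, lat0_deg, lon0_deg):
    """Flat-earth (no curvature) conversion of a geodetic point to local
    North-East coordinates in metres, relative to a reference origin.
    Adequate for field-sized regions, as assumed in the text."""
    dlat = math.radians(lat_deg - lat0_deg)
    dlon = math.radians(lon_deg - lon0_deg)
    north = dlat * R_EARTH
    east = dlon * R_EARTH * math.cos(math.radians(lat0_deg))
    return north, east

def flight_lines(polygon_ned, swath_m):
    """Very simplified sweep planner: parallel south-north flight lines spaced
    by the camera swath width, clipped to the polygon's bounding box only."""
    norths = [p[0] for p in polygon_ned]
    easts = [p[1] for p in polygon_ned]
    n_min, n_max = min(norths), max(norths)
    e_max = max(easts)
    e = min(easts)
    lines = []
    while e <= e_max:
        lines.append(((n_min, e), (n_max, e)))  # one flight line at easting e
        e += swath_m
    return lines
```

For example, converting each RDI vertex with `geodetic_to_ned` relative to the first vertex and calling `flight_lines` with a 40 m swath would yield the set of parallel lines the aircraft must fly to cover the bounding box of the polygon.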
The system's inputs are the RAW images captured and the flight log generated during execution of the mission.
In the first part of the processing, the images are processed for geographical analysis of the data generated in a mission, producing their respective georeferenced images and a mosaic that integrates these images into a single map of the plot covered (see Fig. 1).
Preprocessing: processing of the RAW images to generate rectified .PNG images, which are used in the following blocks.
Georeferencing: location of the pixels of each image in the GPS coordinate system (a simplified sketch of this mapping is given after the mosaic generation step below).
Mosaic generation: integration of the images generated during the flight into a single map of the region covered.
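The georeferencing step can be illustrated with a minimal sketch under simplifying assumptions: a level, nadir-pointing camera, flat terrain, and a known ground sample distance and heading taken from the flight log. The parameter names are hypothetical and lens distortion is ignored, so this is not the system's actual camera model.

```python
import math

R_EARTH = 6_371_000.0  # mean Earth radius in metres (spherical approximation)

def pixel_to_latlon(row, col, img_h, img_w, cam_lat, cam_lon, heading_deg, gsd_m):
    """Simplified nadir-camera georeferencing: map an image pixel to latitude
    and longitude given the camera's GPS position, flight heading (degrees
    clockwise from north) and ground sample distance (metres per pixel)."""
    # Offsets of the pixel from the image centre, in metres on the ground,
    # expressed in the camera frame (x to the right, y forward along heading).
    x_m = (col - img_w / 2.0) * gsd_m
    y_m = (img_h / 2.0 - row) * gsd_m
    # Rotate camera-frame offsets into North-East offsets using the heading.
    h = math.radians(heading_deg)
    north = y_m * math.cos(h) - x_m * math.sin(h)
    east = y_m * math.sin(h) + x_m * math.cos(h)
    # Convert metric offsets back to geodetic coordinates (flat-earth).
    lat = cam_lat + math.degrees(north / R_EARTH)
    lon = cam_lon + math.degrees(east / (R_EARTH * math.cos(math.radians(cam_lat))))
    return lat, lon
```

Applying this mapping to the four corners of each rectified image gives the georeferenced footprints that the mosaic generation step then merges into a single map.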
The next part of the image processing uses the generated mosaic to calculate and analyze various vegetation indices for the deployment of useful applications in the field.
Index extraction: this consists of computing, pixel by pixel over the mosaic, various vegetation indices that have been developed to detect certain characteristics related to crop properties. As a result, for each index a discretized grayscale image with values from 0 to 255 is obtained. Figure 2 shows a table of indices for precision agriculture used in the image processing algorithms.
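As a concrete example of this step for one index, the sketch below computes NDVI pixel by pixel from the mosaic's near-infrared and red bands and discretizes the result to the 0-255 grayscale range. The band names and the linear scaling from [-1, 1] to [0, 255] are assumptions; other indices in the table of Figure 2 would be handled analogously from their corresponding bands.

```python
import numpy as np

def ndvi_to_grayscale(nir, red):
    """Sketch of index extraction for NDVI: compute the index pixel by pixel
    over the mosaic bands and discretize it to an 8-bit grayscale image.
    Band arrays are assumed to be reflectance mosaics of equal shape."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    # Guard against division by zero on empty (zero-reflectance) pixels.
    ndvi = np.where(denom > 0, (nir - red) / np.where(denom == 0, 1, denom), 0.0)
    # NDVI lies in [-1, 1]; map it linearly onto the 0-255 grayscale range.
    gray = np.clip((ndvi + 1.0) / 2.0 * 255.0, 0, 255).astype(np.uint8)
    return gray
```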
In the image processing interface, the process is as described in the diagram of Figure 3. At the start of processing, the unified image mosaic, previously rectified and georeferenced, is loaded; then the map type is selected, that is, the visual interpretation given to the mosaic according to a captured spectrum. Next, the indices to be analyzed are identified, such as visible light, green, red, or near infrared, among others; finally, the type of crop under analysis and the specific application to be analyzed, such as the amount of moisture or the amount of nitrogen in the plant, are selected to generate an output and conclude the processing. A screen of the image processing system is presented in Figure 4.
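The selection flow just described can be summarized as a small configuration object plus a dispatch over index formulas. The following is a hypothetical sketch of that flow; the field names, the index subset, and the synthetic data are assumptions, not the platform's actual interface code.

```python
from dataclasses import dataclass
from typing import Dict, List
import numpy as np

@dataclass
class AnalysisRequest:
    """Hypothetical container for the user's selections: map type, indices,
    crop, and the specific application to analyze."""
    map_type: str          # e.g. "visible", "NIR"
    indices: List[str]     # e.g. ["NDVI", "GNDVI"]
    crop: str              # e.g. "maize"
    application: str       # e.g. "moisture", "nitrogen"

# Pixel-wise formulas over named reflectance bands (illustrative subset only).
INDEX_FORMULAS = {
    "NDVI":  lambda b: (b["nir"] - b["red"])   / (b["nir"] + b["red"]   + 1e-9),
    "GNDVI": lambda b: (b["nir"] - b["green"]) / (b["nir"] + b["green"] + 1e-9),
}

def run_analysis(bands: Dict[str, np.ndarray], request: AnalysisRequest):
    """Evaluate each selected index over the mosaic bands; discretization to
    the 0-255 range would follow as in the previous sketch."""
    return {name: INDEX_FORMULAS[name](bands) for name in request.indices}

# Tiny synthetic example (arbitrary placeholder reflectance values).
bands = {k: np.random.rand(4, 4) for k in ("nir", "red", "green")}
request = AnalysisRequest("NIR", ["NDVI", "GNDVI"], "maize", "nitrogen")
layers = run_analysis(bands, request)
```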
The output of the image analysis is a compressed file, which contains: the RAW images from the flight, the mosaic of the plot covered, the vegetation indices, and thematic maps which can be used to present applications. The output of this stage of the analysis process is the input to the system that displays results for the user, where it can be used as:
Images with marked regions to indicate areas of concentration.
Semi-automated reports in PDF format.
Layers of information for visualization of thematic maps by crop. The output obtained from the image processing system is stored on a server in the cloud and displayed on a web interface, as illustrated in the block diagram of Figure 5.
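A minimal sketch of packaging these products into a single compressed file before upload is shown below; the directory layout and file naming are assumptions, and the cloud upload step itself depends on the provider and is not shown.

```python
import zipfile
from pathlib import Path

def package_mission_output(output_dir, archive_path="mission_output.zip"):
    """Sketch of packaging the analysis products (RAW frames, mosaic, index
    layers, thematic maps) into a single compressed file for upload to the
    cloud server. Subdirectory names are assumptions."""
    output_dir = Path(output_dir)
    with zipfile.ZipFile(archive_path, "w", compression=zipfile.ZIP_DEFLATED) as zf:
        for subdir in ("raw", "mosaic", "indices", "thematic_maps"):
            subpath = output_dir / subdir
            if not subpath.is_dir():
                continue  # skip products that this mission did not generate
            for path in sorted(subpath.rglob("*")):
                if path.is_file():
                    zf.write(path, arcname=path.relative_to(output_dir))
    return archive_path
```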
Claims
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/MX2015/000182 WO2017105177A1 (en) | 2015-12-14 | 2015-12-14 | System for processing images for multispectral and hyperspectral analysis in precision agriculture |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/MX2015/000182 WO2017105177A1 (en) | 2015-12-14 | 2015-12-14 | System for processing images for multispectral and hyperspectral analysis in precision agriculture |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2017105177A1 true WO2017105177A1 (en) | 2017-06-22 |
Family
ID=59056941
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/MX2015/000182 Ceased WO2017105177A1 (en) | 2015-12-14 | 2015-12-14 | System for processing images for multispectral and hyperspectral analysis in precision agriculture |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2017105177A1 (en) |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112097679A (en) * | 2020-09-10 | 2020-12-18 | 厦门海铂特生物科技有限公司 | Three-dimensional space measuring method based on optical information |
| EP3839804A1 (en) | 2019-12-20 | 2021-06-23 | KWS SAAT SE & Co. KGaA | Method and system for automated plant image labeling |
| TWI740224B (en) * | 2019-10-01 | 2021-09-21 | 台灣海博特股份有限公司 | Optical information three-dimensional space measurement method |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7149366B1 (en) * | 2001-09-12 | 2006-12-12 | Flight Landata, Inc. | High-definition hyperspectral imaging system |
| US20150022656A1 (en) * | 2013-07-17 | 2015-01-22 | James L. Carr | System for collecting & processing aerial imagery with enhanced 3d & nir imaging capability |
-
2015
- 2015-12-14 WO PCT/MX2015/000182 patent/WO2017105177A1/en not_active Ceased
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7149366B1 (en) * | 2001-09-12 | 2006-12-12 | Flight Landata, Inc. | High-definition hyperspectral imaging system |
| US20150022656A1 (en) * | 2013-07-17 | 2015-01-22 | James L. Carr | System for collecting & processing aerial imagery with enhanced 3d & nir imaging capability |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| TWI740224B (en) * | 2019-10-01 | 2021-09-21 | 台灣海博特股份有限公司 | Optical information three-dimensional space measurement method |
| EP3839804A1 (en) | 2019-12-20 | 2021-06-23 | KWS SAAT SE & Co. KGaA | Method and system for automated plant image labeling |
| WO2021122446A1 (en) | 2019-12-20 | 2021-06-24 | KWS SAAT SE & Co. KGaA | Method and system for automated plant image labeling |
| CN112097679A (en) * | 2020-09-10 | 2020-12-18 | 厦门海铂特生物科技有限公司 | Three-dimensional space measuring method based on optical information |
Similar Documents
| Publication | Title |
|---|---|
| Jorge et al. | Detection of irrigation inhomogeneities in an olive grove using the NDRE vegetation index obtained from UAV images | |
| Kattenborn et al. | UAV data as alternative to field sampling to map woody invasive species based on combined Sentinel-1 and Sentinel-2 data | |
| WO2017099570A1 (en) | System and method for precision agriculture by means of multispectral and hyperspectral aerial image analysis using unmanned aerial vehicles | |
| Webster et al. | Three-dimensional thermal characterization of forest canopies using UAV photogrammetry | |
| Duan et al. | Comparison of ground cover estimates from experiment plots in cotton, sorghum and sugarcane based on images and ortho-mosaics captured by UAV | |
| Zarco-Tejada et al. | Tree height quantification using very high resolution imagery acquired from an unmanned aerial vehicle (UAV) and automatic 3D photo-reconstruction methods | |
| Brenner et al. | Estimation of evapotranspiration of temperate grassland based on high-resolution thermal and visible range imagery from unmanned aerial systems | |
| Ćwiąkała et al. | Assessment of the possibility of using unmanned aerial vehicles (UAVs) for the documentation of hiking trails in alpine areas | |
| Suziedelyte Visockiene et al. | Unmanned aerial vehicles for photogrammetry: analysis of orthophoto images over the territory of Lithuania | |
| Šedina et al. | Using RPAS for the detection of archaeological objects using multispectral and thermal imaging | |
| WO2022239006A1 (en) | Accurate geolocation in remote-sensing imaging | |
| Cermakova et al. | Calculation of visible spectral indices from UAV-based data: small water bodies monitoring | |
| Lopes Bento et al. | Overlap influence in images obtained by an unmanned aerial vehicle on a digital terrain model of altimetric precision | |
| Gonzalez Musso et al. | Applying unmanned aerial vehicles (UAVs) to map shrubland structural attributes in northern Patagonia, Argentina | |
| WO2017105177A1 (en) | System for processing images for multispectral and hyperspectral analysis in precision agriculture | |
| Tian et al. | Urban tree carbon storage estimation using unmanned aerial vehicles remote sensing | |
| Zhang et al. | Data on three-year flowering intensity monitoring in an apple orchard: A collection of RGB images acquired from unmanned aerial vehicles | |
| Kubiniec et al. | Street-Scale Urban Air Temperatures Predicted by Simple High-Resolution Cover-and Shade-Weighted Surface Temperature Mosaics in a Variety of Residential Neighborhoods | |
| Borisov et al. | Estimating Cloud Base Height From All-Sky Imagery Using Artificial Neural Networks | |
| Jensen et al. | A new method to correct pushbroom hyperspectral data using linear features and ground control points | |
| Młynarczyk et al. | Experience Gained When Using the Yuneec E10T Thermal Camera in Environmental Research | |
| Wulfsohn et al. | The use of a multirotor and high-resolution imaging for precision horticulture in Chile: An industry perspective | |
| Singh et al. | Applications of Unmanned Aerial Systems in Agricultural Operation Management: Part III: Best Practices for Efficient Aerial Surveying: AE553/AE553, 02/2021 | |
| Huang et al. | Multisource remote sensing field monitoring for improving crop production management | |
| Sharma | Comparison of low-cost methods for vegetation mapping using object based analysis of UAV imagery: a case study for the greater Côa Valley, Portugal |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15910823 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 15910823 Country of ref document: EP Kind code of ref document: A1 |