WO2023018387A1 - Crop classification method using deep neural networks - Google Patents
Crop classification method using deep neural networks
- Publication number
- WO2023018387A1 (application PCT/TR2021/050792)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- time series
- crop
- classification method
- process step
- classification
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/0464—Convolutional networks [CNN, ConvNet]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/09—Supervised learning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/13—Satellite images
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
Definitions
- the present invention relates to a crop classification method that can operate with a high hit rate on a global scale.
- the present invention particularly relates to a crop detection and classification method that allows for classifying cultivated areas in the observed regions of the satellite imagery, detecting the boundaries of the clusters that exhibit similar development characteristics in the same time period within those regions, and collecting data related to economic activities, such as cultivated crop type and yield estimation, in the designated agricultural activity group areas.
- CNN-based solutions can be divided into two groups based on how the data is used: FCN (Fully Convolutional Network) based solutions are generally used for crop classification with a single image as input.
- FCN: Fully Convolutional Networks
- This group of methods uses single-date images for classification, and the images are classified pixel by pixel using architectures such as Mask R-CNN.
- Phenology-based solutions are based on detecting distinguishing features of the development of plant groups. Data sets of these feature groups are obtained from multiple satellite images collected progressively over time and subjected to classification. These methods are generally based on product- and location-specific heuristic approaches.
- RNN-based solutions have been developed to address the problem that CNN-based solutions generally cannot capture temporal variation.
- LSTM: Long Short-Term Memory
- the invention that is the subject of the application numbered "CN107121681" relates to a residential land extraction system based on high-resolution satellite remote sensing data.
- that invention provides automatic extraction of residential areas by using remote sensing data, based on the difference in characteristics between residential and non-residential areas. Handcrafted indices are utilized to classify residential areas, and deep learning techniques are not utilized to obtain distinguishing features for crop types.
- the invention that is the subject of the application numbered "CN109214287" relates to the technical field of remote sensing image processing, particularly to a method and system for crop interpretation based on RapidEye satellite remote sensing images.
- the invention that is the subject of the application numbered "CN110020635" relates to the technical field of crop classification, particularly to a method for finely classifying crops in a planting area based on drone and satellite images.
- the invention performs crop classification by using a convolutional neural network.
- Phenology based systems in the state of the art are based on finding the temporal patterns of changes that are formed in the reflection values during the development stage of plants.
- the change in reflection values differs according to the aforementioned local variables and sowing time. This situation increases the intraclass variance, as in the previous method group, and causes products in different classes to be confused with each other.
- the present invention relates to a crop classification method that can operate with a high hit rate on a global scale.
- the most important object of the present invention is to enable estimating the amount of the crop to be produced. Thus, early indicators related to crop yield and cultivated area are detected.
- Another important object of the present invention is to enable calculation of an agricultural feasibility score by evaluating fields and the agricultural feasibility of farmers worldwide. Thus, financial risks can be anticipated.
- Yet another important object of the present invention is to provide frequently updated results and substantial analyses with the obtained data. Thus, it gains an important place in the decision-making process of all large-scale crop buyers, including commodity, food, and retail businesses.
- Yet another important object of the present invention is to identify distinguishing and globally repetitive features that serve to separate plant/product groups from each other, instead of directly using the reflection values for classification.
- Figure 1 illustrates a view of schematic flow diagram of the method according to the present invention.
- the present invention particularly relates to a crop classification method that allows for classifying the areas of the region whose satellite images are examined according to use, detecting the boundaries of the clusters that show similar development characteristics in the same time period within the regions determined as agricultural activity areas, and collecting data related to economic activities such as crop type classification and harvest estimation in the designated agricultural activity group areas.
- the crop classification method (100) enables finding the distinguishing and globally repetitive features that serve to separate plant/product groups from each other for classification. Deep learning methods are used to extract said features. These features are then searched for, and the plants are classified as independently of local variables as possible.
- the crop classification method (100) divides the classification problem into three smaller problems, unlike the methods in use. Thus, when classification targets/classes or application areas change, adaptation is provided much more easily than with the methods in the literature.
- the crop classification method (100) provides classification by classifying the areas of the region whose satellite images are examined according to use, detecting the boundaries of the clusters that show similar development characteristics in the same time period within the regions determined as agricultural activity areas, finding the distinguishing and globally repetitive features that serve to separate plant/product groups from each other, and training the neural network with said data.
- the crop classification method (100) comprises the following process steps. Said process steps are executed on a server.
- in the step of determining (102) the areas that may be used for classification in the region where satellite images are examined, cloud, cloud shadow, and water areas are filtered from all images.
- Fmask 4.0 algorithm is used in order to carry out said process.
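The filtering step above can be sketched as a per-pixel masking operation. The following is a minimal illustration, not the patented implementation: it assumes a label raster following the common Fmask output convention (0 = clear land, 1 = water, 2 = cloud shadow, 3 = snow, 4 = cloud) and marks unusable pixels as NaN so later time-series steps can treat them as gaps.

```python
import numpy as np

# Fmask-style label codes (conventional values; an assumption for this sketch).
CLEAR_LAND = 0

def usable_mask(fmask_labels: np.ndarray) -> np.ndarray:
    """Boolean mask of pixels that are neither cloud, cloud shadow, nor water."""
    return fmask_labels == CLEAR_LAND

def filter_image(reflectance: np.ndarray, fmask_labels: np.ndarray) -> np.ndarray:
    """Set unusable pixels to NaN so they appear as gaps in the time series."""
    out = reflectance.astype(float)
    out[~usable_mask(fmask_labels)] = np.nan
    return out

# A 2x2 toy image: clear land, cloud / water, cloud shadow.
labels = np.array([[0, 4], [1, 2]])
img = np.array([[0.3, 0.9], [0.1, 0.5]])
filtered = filter_image(img, labels)   # only the clear-land pixel survives
```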
- a time series representation of pixel values and selected indices (NDVI, EVI) is generated for each band during the agricultural season.
- the indices selected therein are the normalized difference vegetation index (NDVI) and the enhanced vegetation index (EVI).
- NDVI normalized difference vegetation index
- EVI enhanced vegetation index
- a separate time series vector is generated for each pixel.
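The per-pixel index time series described above can be computed with the textbook NDVI and EVI formulas (standard band combinations, not taken from the patent text; the EVI coefficients below are the commonly used defaults):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

def evi(nir, red, blue, G=2.5, C1=6.0, C2=7.5, L=1.0):
    """Enhanced vegetation index with the usual default coefficients."""
    return G * (nir - red) / (nir + C1 * red - C2 * blue + L)

# One pixel observed at three dates in the season -> short time-series vectors.
nir = np.array([0.40, 0.55, 0.60])
red = np.array([0.10, 0.08, 0.05])
blue = np.array([0.05, 0.05, 0.04])
ndvi_series = ndvi(nir, red)
evi_series = evi(nir, red, blue)
```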
- missing points are obtained by fitting one of the predefined curves to the available data.
- the predefined curves are based on generalized cultivated land behavior and are sigmoid functions.
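The gap-filling idea can be sketched as follows, under assumptions: a single sigmoid "green-up" curve with amplitude, midpoint, and slope parameters stands in for the predefined curve family (the patent does not give its exact parameterization), and a coarse grid search replaces a proper optimizer to keep the sketch dependency-free.

```python
import numpy as np

def sigmoid(t, amp, mid, slope):
    """A generalized cultivated-land growth curve (illustrative form)."""
    return amp / (1.0 + np.exp(-slope * (t - mid)))

def fit_sigmoid(t, y):
    """Coarse grid-search least-squares fit over (amp, mid, slope)."""
    best, best_err = None, np.inf
    for amp in np.linspace(0.2, 1.0, 9):
        for mid in np.linspace(t.min(), t.max(), 21):
            for slope in np.linspace(0.05, 1.0, 20):
                err = np.sum((sigmoid(t, amp, mid, slope) - y) ** 2)
                if err < best_err:
                    best, best_err = (amp, mid, slope), err
    return best

t = np.arange(0, 100, 10, dtype=float)
truth = sigmoid(t, 0.8, 50.0, 0.2)        # synthetic true NDVI-like series
observed = truth.copy()
observed[4] = np.nan                       # one cloudy date -> missing point
valid = ~np.isnan(observed)
params = fit_sigmoid(t[valid], observed[valid])
filled = sigmoid(t[4], *params)            # estimate for the missing date
```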
- in the process step of extracting (106) features, the network is based on the VGG-16 architecture and modified to use multiple time series from different bands as inputs. Classification is executed for each pixel independently.
- a detector head with additional convolutional feature layers in multiple scales is utilized in the final stage same as in SSD.
- instead of two-dimensional prior boxes as in SSD, only one-dimensional prior regions are used on the time axis, since the desired results do not include decomposition along the band axis of the two-dimensional (bands-time) data. These regions cover the entire time interval (1 year) and include a wide combination of starting points and lengths.
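Generating such one-dimensional prior regions can be sketched as below. The grid of starting points and lengths is hypothetical; the patent only states that the regions cover the full season with a wide combination of starts and lengths.

```python
# Illustrative 1-D prior-region generation on a 365-day season axis.
# (start, length) grids are assumptions for this sketch.
def make_prior_regions(season_length=365, starts=range(0, 365, 30),
                       lengths=(60, 90, 120, 180, 270)):
    regions = []
    for s in starts:
        for ln in lengths:
            end = s + ln
            if end <= season_length:      # keep regions inside the season
                regions.append((s, end))
    return regions

priors = make_prior_regions()             # intervals (start_day, end_day)
```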
- Crop index, localization, and classification loss functions are defined for each region in order to train the detector head.
- the crop index loss function measures the suitability of the relevant time period to the generalized crop behavior.
- the localization loss function measures the accuracy of the temporal position and length of the relevant time period.
- the classification loss function measures the accuracy of matching the relevant time period with the specific crop type.
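The three per-region training signals can be sketched on 1-D intervals (start, end) on the time axis. Since the patent does not give the exact loss forms, the sketch assumes standard SSD-style choices: IoU-based suitability, smooth-L1 localization, and cross-entropy classification.

```python
import numpy as np

def interval_iou(a, b):
    """Intersection over union of two 1-D time intervals (start, end)."""
    inter = max(0.0, min(a[1], b[1]) - max(a[0], b[0]))
    union = (a[1] - a[0]) + (b[1] - b[0]) - inter
    return inter / union if union > 0 else 0.0

def smooth_l1(pred, target):
    """Smooth-L1 error over interval endpoints (temporal position and length)."""
    d = np.abs(np.asarray(pred, float) - np.asarray(target, float))
    return float(np.sum(np.where(d < 1.0, 0.5 * d * d, d - 0.5)))

def cross_entropy(scores, true_idx):
    """Softmax cross-entropy for matching a region to a crop type."""
    p = np.exp(scores - np.max(scores))
    p /= p.sum()
    return float(-np.log(p[true_idx]))

# A prior region vs. an annotated growth period for one pixel:
prior, gt = (90.0, 210.0), (100.0, 200.0)
crop_index_target = interval_iou(prior, gt)   # suitability to crop behavior
loc_loss = smooth_l1(prior, gt)               # position/length error
cls_loss = cross_entropy(np.array([2.0, 0.5, -1.0]), 0)
```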
- a vector of class scores is generated for the present classes.
- Said classification is independent of the start, end (time of year), or duration of the time series. This allows classification of similar crops in multiple climates/geographic regions having differences in growth parameters. Scores are generated for each pixel showing how similar the growth model of said pixel is to the reference data for different crop types. For initial scores, binary regression loss is used to provide independent scoring for each class.
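The "independent scoring for each class" point can be sketched as follows: each class gets its own sigmoid score and its own binary regression (log) loss, so scores do not compete as they would under softmax. The class examples and logit values are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def binary_losses(logits, targets):
    """Per-class binary cross-entropy; targets are 0/1 per class."""
    p = sigmoid(np.asarray(logits, float))
    t = np.asarray(targets, float)
    return -(t * np.log(p) + (1 - t) * np.log(1 - p))

logits = np.array([3.0, -2.0, 0.0])   # e.g. wheat, corn, cotton (hypothetical)
scores = sigmoid(logits)              # independent [0, 1] score per class
losses = binary_losses(logits, [1, 0, 0])
```

Because each class is scored independently, a pixel can plausibly score high for two similar crops, which is what makes the later hierarchical refinement useful.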
- the CNN model that performs the classification process uses a hierarchical tree. First, the crop index confidence score is generated independently for each determined sub-category and parent category in each prior region defined on the time axis.
- the classification process is carried out by means of matching with the classes that are determined within the regions having high crop index.
- the processes of finding and classifying the agricultural activity in the time series are separated from each other by means of this change in the detector.
- the final classification is made by multiplying each child score by the parental confidence score while navigating the tree. The changes made to the detector architecture in this step allow the detector to be precisely trained on distinguishing and globally repetitive features for each plant and region.
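The multiply-and-navigate step above can be sketched with a small tree. The node names and confidence values are hypothetical; only the scoring rule (a leaf's final score is the product of confidences along its path) comes from the description.

```python
# Each node: name -> (confidence, children). Values are illustrative only.
tree = {
    "cultivated land": (0.9, {
        "wheat": (0.8, {"soft wheat": (0.7, {}), "durum wheat": (0.2, {})}),
        "corn":  (0.3, {"grain": (0.6, {}), "silage": (0.4, {})}),
    }),
    "pasture": (0.1, {}),
}

def leaf_scores(node, prefix=1.0, path=""):
    """Multiply each child by the parental confidence while navigating the tree."""
    out = {}
    for name, (conf, children) in node.items():
        score = prefix * conf
        label = f"{path}/{name}" if path else name
        if children:
            out.update(leaf_scores(children, score, label))
        else:
            out[label] = score
    return out

scores = leaf_scores(tree)
best = max(scores, key=scores.get)   # leaf class with the highest final score
```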
- Agricultural areas are divided into two groups as cultivated lands and pastures.
- Cultivated lands create four different sub-branches: corn, cotton, soybean, and wheat.
- Corn has grain and silage sub-branches, cotton has a fiber sub-branch, soybean has a bean sub-branch, and wheat has soft wheat and durum wheat sub-branches.
- Permanent vegetation is classified as permanent crop, forest, and shrubs. Permanent crops are classified as olives, vineyards, and tea plantations.
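The class hierarchy stated above can be written out as a nested mapping for the hierarchical classifier to navigate. This is only a representation sketch of the taxonomy as described, not code from the patent.

```python
# Nested taxonomy exactly as stated in the description; leaves are final classes.
CROP_TAXONOMY = {
    "agricultural": {
        "cultivated land": {
            "corn": {"grain": {}, "silage": {}},
            "cotton": {"fiber": {}},
            "soybean": {"bean": {}},
            "wheat": {"soft wheat": {}, "durum wheat": {}},
        },
        "pasture": {},
    },
    "permanent vegetation": {
        "permanent crop": {"olives": {}, "vineyards": {}, "tea plantations": {}},
        "forest": {},
        "shrubs": {},
    },
}

def leaves(node):
    """All leaf class names of the taxonomy, depth-first."""
    out = []
    for name, children in node.items():
        out.extend(leaves(children) if children else [name])
    return out
```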
Abstract
The present invention relates to a crop classification method that can be implemented with high accuracy on a global scale.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/TR2021/050792 WO2023018387A1 (fr) | 2021-08-11 | 2021-08-11 | Crop classification method using deep neural networks |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/TR2021/050792 WO2023018387A1 (fr) | 2021-08-11 | 2021-08-11 | Crop classification method using deep neural networks |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2023018387A1 true WO2023018387A1 (fr) | 2023-02-16 |
Family
ID=85200976
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/TR2021/050792 Ceased WO2023018387A1 (fr) | 2021-08-11 | 2021-08-11 | Crop classification method using deep neural networks |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2023018387A1 (fr) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20180211156A1 (en) * | 2017-01-26 | 2018-07-26 | The Climate Corporation | Crop yield estimation using agronomic neural network |
| CN109977802A (zh) * | 2019-03-08 | 2019-07-05 | Wuhan University | Crop classification and recognition method under strong-noise background |
| CN112115983A (zh) * | 2020-08-28 | 2020-12-22 | Zhejiang University City College | Deep-learning-based crop fruit sorting algorithm |
Worldwide Applications (1)
- 2021-08-11: WO PCT/TR2021/050792, not active (Ceased)
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN116563720A (zh) * | 2023-07-12 | 2023-08-08 | Central China Normal University | Automatic generation method for single- and double-season rice samples combining optical and microwave phenological features |
| CN116563720B (zh) * | 2023-07-12 | 2023-10-03 | Central China Normal University | Automatic generation method for single- and double-season rice samples combining optical and microwave phenological features |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21953582; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 2024/001571; Country of ref document: TR |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 21953582; Country of ref document: EP; Kind code of ref document: A1 |