WO2020000043A1 - Plant growth feature monitoring
- Publication number
- WO2020000043A1 (PCT/AU2019/050670)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- plot
- method includes
- plant
- images
- Prior art date
- Legal status: Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01G—HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
- A01G7/00—Botany in general
- A01G7/06—Treatment of growing trees or plants, e.g. for preventing decay of wood, for tingeing flowers or wood, for prolonging the life of plants
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01B—SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
- A01B79/00—Methods for working soil
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
- G06T7/0014—Biomedical image inspection using an image reference approach
- G06T7/0016—Biomedical image inspection using an image reference approach involving temporal comparison
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/254—Analysis of motion involving subtraction of images
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/17—Terrestrial scenes taken from planes or by drones
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/188—Vegetation
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01C—PLANTING; SOWING; FERTILISING
- A01C21/00—Methods of fertilising, sowing or planting
- A01C21/007—Determining fertilization requirements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/251—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving models
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/292—Multi-camera tracking
Definitions
- the present invention relates to a method and apparatus for monitoring plant growth features, and in one particular example to a method and apparatus for monitoring features relating to plant growth of crops in a plot.
- a further alternative approach is analysis of a shadow length in images, as described in Liasis, G and Stavrou, S (2016) “Satellite image analysis for shadow detection and building height detection”, ISPRS Journal of Photogrammetry and Remote Sensing 119:437-450 and McCarthy, A and Tscharke, M (2014) (above).
- measuring the length of shadows requires line of sight to the ground, and is therefore not suitable for use in closed canopy scenarios.
- Flowering may be detected through both optical filtering and image analysis techniques.
- Image analysis techniques that have reported use for segmenting flowers include Local Binary Patterns, used to determine the strength of textural differences within the image: Guo, Z, Zhang, L and Zhang, D (2010) “Rotation invariant texture classification using LBP variance (LBPV) with global matching”, Pattern Recognition 43(3):706-719 and Ojala, T, Pietikainen, M and Maenpaa, T (2002) “Multiresolution gray-scale and rotation invariant texture classification with local binary patterns”, IEEE Transactions on Pattern Analysis and Machine Intelligence 24(7):971-987.
- Spectral reflectance and colour segmentation techniques have been used to automatically determine when crops are flowering (Thorp and Dierig 2011 (above) and Thorp, KR, Dierig, DA, French, AN and Hunsaker, DJ (2011) “Analysis of hyperspectral reflectance data for monitoring growth and development of lesquerella”, Industrial Crops and Products 33(2):524-531).
- these techniques only allow specific analysis to be performed and cannot be used to monitor plant growth more broadly.
- an aspect of the present invention seeks to provide a method of monitoring plant growth features, the method including, in one or more electronic processing devices: on each of a plurality of different time periods: acquiring a plurality of images of the plot captured by an imaging device; determining image properties of the acquired images; and, selecting at least one image for analysis; analysing selected images for the plurality of time periods; and, using results of the analysis to monitor plant growth features.
- the method includes performing at least one of: a height analysis to determine a plant height; a flowering analysis to determine a flowering status; an emergence analysis to determine an emergence date; a canopy analysis to determine a canopy extent; a disease/pest analysis to determine a presence of disease/pests; a status analysis to determine a plant status; and, a condition analysis to determine growing conditions.
- the method includes: selecting a first image that is used to determine at least one of: a plant height; and, a flowering status; and, selecting a second image that is used to determine at least one of: an emergence rate; a canopy cover; a presence of disease/pests; and, growing conditions.
- the image properties include: for the first image, at least one of: a shutter speed closest to about 31.25ms; a shutter speed of less than or equal to about 31.25ms; an ISO of between 80 and 250; and, an ISO of about 64; and, for the second image, at least one of: has a short exposure time; has a low gain; a shutter speed closest to about 1.25ms; a shutter speed of less than or equal to about 1.25ms; an ISO closest to about 50; an image captured closest to about 12pm; and, an image having a largest file size.
- the images are monoscopic images and the method includes performing a geometric analysis of the image to correct for perspective distortion.
- the method includes: performing a geometric analysis of image features in the selected image to determine a perspective projection model from measured plot parameters; and, analysing the image using the perspective projection model.
- the measured plot parameters include: a number of rows and columns; and, a plot length and width.
- the image features include at least one of: site vertices; site edges; plot vertices; plot edges; and, plot rows.
- the method includes determining measurements of plant height from the image using the perspective projection model and the measured plot parameters.
- the method includes: analysing the selected image to identify: plant locations on a ground plane; and, canopy features; and, measuring the plant height based on the plant locations, the canopy features and the perspective projection model.
- the method includes: using the canopy features to determine a top of each plant; and, measuring the plant height by measuring along a perspective line upwardly from the plant locations to a top of each plant using the perspective projection model.
- the method includes identifying canopy features by tracking movement between successive selected images for at least part of the plot.
- the method includes: analysing the selected image to identify ground points; and, using the ground points to determine the ground plane.
- the method includes determining the perspective projection model using vanishing points, including: a first vanishing point based on lines parallel to crop rows; a second vanishing point based on lines perpendicular to crop rows; and, a third vanishing point based on lines parallel to crop stems.
- the method includes using a perspective correction model to identify a location of the plot in the selected image.
- the method includes: analysing the selected image to identify ground points; using the ground points to identify movement of the camera relative to the plot between different images; and, using movement of the camera to analyse selected images.
- the method includes determining a flowering status by: extracting a saturation channel for the selected image; enhancing a contrast in the saturation channel; and, identifying flowers using the contrast enhanced saturation channel by performing a connected component analysis.
- the method includes determining a flowering status by: extracting a saturation channel from HSB (hue, saturation and brightness) space to create a saturation channel image; inverting the saturation channel image to create an inverted image; applying a local equalisation to the inverted image to create a contrast enhanced image; and, applying a colour threshold to the contrast enhanced image to create a binary image.
- the method includes determining a flowering status by: identifying connected components in the binary image for a plot; counting a number of large connected components in the plot, the large connected components having a number of pixels equal to or greater than 5% of pixels in the binary image; and, determining the plot is flowering if there is more than one additional large connected component detected in the plot than in a previous time period.
- the method includes: applying colour segmentation to the selected image to identify green pixels; and, using a number of detected green pixels to determine at least one of: an emergence status; and, a canopy cover status.
- in one embodiment the method includes identifying green pixels when the green channel value, as a proportion of the sum of the red, green and blue channels, is greater than at least one of: about 0.34; and, about 0.31.
- the method includes determining emergence if any green pixels are identified in the plot.
- the method includes calculating a canopy cover based on a proportion of green pixels to total pixels for the plot.
- the method includes: applying colour segmentation to the selected image to identify pixels having a defined colour; and, using a number of identified pixels to determine at least one of: a presence of a disease or pests; and, growing conditions.
- the method includes: determining a plant profile indicative of expected plant growth features; and, comparing monitored plant growth features to the plant profile to determine a plant status.
- the method includes generating a notification if the monitored plant growth features fall outside expected plant growth feature ranges.
- the method includes acquiring the images using a client device, and transferring the images to a remote server for analysis.
- an aspect of the present invention seeks to provide a method of monitoring plant growth features in a plot, the method including, in one or more electronic processing devices: acquiring an image of the plot captured by an imaging device that generates monoscopic images; performing a geometric analysis of image features to determine a perspective projection model from measured plot parameters; and, analysing the image using the perspective projection model to monitor plant growth features.
- the measured plot parameters include: a number of rows and columns; and, a plot length and width.
- the image features include at least one of: site vertices; site edges; plot vertices; plot edges; and, plot rows.
- the method includes determining measurements of plant height from the image using the perspective projection model and the measured plot parameters.
- the method includes: analysing the selected image to identify: plant locations on a ground plane; and, canopy features; and, measuring the plant height based on the plant locations, the canopy features and the perspective projection model.
- the method includes: using the canopy features to determine a top of each plant; and, measuring the plant height by measuring along a perspective line upwardly from the plant locations to a top of each plant using the perspective projection model.
- the method includes identifying canopy features by tracking movement between successive selected images for at least part of the plot.
- the method includes: analysing the selected image to identify ground points; and, using the ground points to determine the ground plane.
- the method includes determining the perspective projection model using vanishing points, including: a first vanishing point based on lines parallel to crop rows; a second vanishing point based on lines perpendicular to crop rows; and, a third vanishing point based on lines parallel to crop stems.
- the method includes using a perspective correction model to identify a location of the plot in the selected image.
- an aspect of the present invention seeks to provide a system for monitoring plant growth features in a plot, the system including one or more electronic processing devices that: on each of a plurality of different time periods: acquire a plurality of images of the plot captured by an imaging device; determine image properties of the acquired images; and, select at least one image for analysis; analyse selected images for the plurality of time periods; and, use results of the analysis to monitor plant growth features.
- an aspect of the present invention seeks to provide a system for monitoring plant growth features in a plot, the system including one or more electronic processing devices that: acquire an image of the plot captured by an imaging device that generates monoscopic images; perform a geometric analysis of image features to determine a perspective projection model from measured plot parameters; and, analyse the image using the perspective projection model to monitor plant growth features.
- Figure 1A is a flow chart of an example of a method for use in monitoring plant growth features
- Figure 1B is a flow chart of a second example of a method for use in monitoring plant growth features
- Figure 2 is a schematic diagram of an example of a system for monitoring plant growth features
- Figure 3 is a schematic diagram of an example of a processing system
- Figure 4 is a schematic diagram of an example of a client device
- Figure 5A is a schematic diagram of an example of an imaging system
- Figure 5B is a schematic diagram of an example of the physical configuration of the imaging system of Figure 5A;
- Figure 6 is a flow chart of a specific example of a process for acquiring images
- Figures 7A to 7H are example images of a plot of maize plants captured at different times of day;
- Figures 8A to 8D are example images captured under different lighting conditions
- Figure 9 is a graph of an example of camera settings for a subset of images with consistent lighting
- Figures 10A and 10B are example images showing changes in ground plane location
- Figure 10C is a graph showing examples of changes in ground point location
- Figure 11 is a flow chart of a specific example of a process of analysing images for monitoring plant growth features
- Figure 12 is a flow chart of a specific example of a height analysis process
- Figures 13A to 13K are schematic diagrams showing steps in developing a perspective correction model
- Figures 14A to 14H are example images showing canopy motion tracking for height estimation of maize plants
- Figures 15A to 15H are example images showing canopy motion tracking for height estimation of soybean plants
- Figure 16 is an example image showing the use of the perspective correction model and canopy position in plant height estimation
- Figure 17 is a flow chart of a specific example of a flowering analysis process
- Figures 18A to 18F are example images showing steps in the flowering analysis process
- Figure 19 is a flow chart of a specific example of an emergence and canopy analysis process;
- Figures 20A to 20H are example images showing steps in the emergence analysis process;
- Figures 21A and 21B are example images showing steps in the canopy analysis process
- Figures 22A to 22D are example graphs showing a comparison of measured and estimated height for soybean plants
- Figures 23A to 23H are example graphs showing a comparison of measured and estimated height for maize plants
- Figures 24A to 24D are example graphs showing height curves estimated for soybean plants
- Figures 25A to 25D are example graphs showing example height curves for maize plants.
- Figures 26A to 26D are example graphs showing proportion of flower pixels for a plot of maize plants
- Figures 27A to 27D are example graphs showing emergence of maize plants
- Figures 28A to 28D are example graphs showing canopy cover for maize plants
- Figures 29A to 29D are example graphs showing canopy cover for soybean plants.
- Figures 30A to 30D are example graphs showing example canopy cover for maize plants.
- Example processes for use in monitoring plant growth features will now be described in more detail with reference to Figures 1A and 1B.
- the processing devices can form part of one or more processing systems, such as computer systems, and are typically in communication with at least one imaging device.
- the imaging device is part of an imaging system, and the electronic processing device may be integrated into, distributed between, or in communication with the one or more imaging systems.
- the term “plant growth feature” is intended to refer to a feature that relates in some way to plant growth.
- the feature could be a feature of one or more plants, such as a plant height, emergence date, canopy cover, flowering status or the like.
- the plant growth feature could also be a feature that has an impact on the growth of the plant, such as environmental conditions, including the presence, absence or degree of moisture, including rainfall or fog, or the presence or absence of diseases or pests. It will therefore be appreciated that the term encompasses features that relate generally to the growth of plants, and include, but are not limited to features of plants themselves.
- the term “plot” is intended to refer to an area including one or more plants.
- the plants can be arranged in columns and/or rows and are typically planted at a similar time and treated in a similar manner.
- Plots may form part of a larger site, such as a field, which may include a number of plots, although this is not intended to be limiting, and it will be appreciated that the techniques could be applied to individual plants, or any grouping of plants, including fields, sites, plots, or the like.
- the process is performed in order to acquire images that are suitable for use in subsequent downstream analysis processes.
- a plurality of images of a plot are acquired over the course of a time period.
- the time period is typically one day, although this is not essential and could include shorter or longer time periods, such as a morning, or evening, a four hour window, or several days.
- the images of the plot are captured at different times during the time period so that the images are captured in a range of different lighting conditions.
- the images could be captured at set time intervals, such as every 20 minutes, hourly or the like, although this is not essential and alternatively, could be captured at set times, or when certain criteria are met, such as when defined lighting conditions are detected, or the like.
- the images are typically captured using an imaging device, such as a camera or similar, with this being performed using automated settings, so that capture of the image is optimised for the prevailing environmental conditions.
- image properties of the captured images are determined, typically by retrieving this information from metadata associated with the images, such as EXIF (Exchangeable Image File Format) metadata stored as part of the image data by the camera or other image capture device.
- image properties include, but are not limited to, a date and/or time of capture, a capture location, an image orientation (rotation), an imaging device aperture, a shutter speed, a focal length, a metering mode, and ISO speed information.
- the image properties are used to select an image for analysis.
- this process typically involves selecting one or more images having image properties that meet defined criteria, such as set shutter speeds, aperture settings, or the like. This allows image(s) to be selected which are best suited for subsequent analysis, with different images optionally being selected for different analysis processes.
- the process of acquiring and selecting images is typically repeated on each of a number of different, and optionally consecutive time periods, such as on consecutive days so that sequences of images can be used in performing analysis at step 115, with results of the analysis being used to monitor plant growth features at step 120.
- the nature of the analysis performed will vary depending upon the preferred implementation, but this could include any one or more of a height analysis to detect a plant height, a flowering analysis to determine a flowering status of the plant, an emergence analysis to determine a date of emergence, and a canopy analysis to ascertain a current extent of canopy cover.
- Other types of analysis that could be performed include a disease/pest analysis to identify the presence and/or absence of any diseases/pests, a status analysis to determine a plant status, such as an indication of plant health, or a condition analysis to determine features of growing conditions, such as levels of rainfall, sunlight, or the like.
- this process allows images to be captured utilising automatic settings on an imaging device, which optimises the image for the current lighting conditions. Images are captured throughout a time period, such as over the course of a day, with the properties of the captured images being examined to select an image which is most appropriate for performing a particular analysis process.
- different image analysis processes typically perform differently on images with different characteristics. For example canopy cover analysis might depend on detection of green colours within an image, and hence work best in bright images, and less well in dull images.
- the current approach collects multiple images throughout a day, using automatic camera settings, in order to optimise the captured image for the prevailing ambient conditions. The image properties are then examined, allowing the most appropriate image to be selected for use in subsequent analysis, thereby maximising the chance of a suitable image being captured on any given day.
- the above described process allows different images to be selected for use in different analysis processes, thereby optimising the image used for each analysis, whilst avoiding the need to reconfigure camera settings between capture of different images, other than through the use of existing automated setting processes.
- a first image is selected that is used to determine plant height and/or flowering status, whilst a second image is used to determine an emergence rate, a canopy cover, a presence of disease/pests or growing conditions.
- this is not essential and any combination could be used, including the use of additional images if appropriate.
- the first image is typically an image having a shutter speed closest to about 31.25 milliseconds, or less than or equal to about 31.25 milliseconds, with an ISO of between 80 and 250, and more typically an ISO of about 64.
- the second image typically has a short exposure time and a low gain, and more typically has a shutter speed closest to, or less than or equal to 1.25 milliseconds with an ISO of about 50.
- the second image is also captured closest to about midday and is preferably an image having a largest file size.
- the optimal images can be determined by inspecting images over multiple days and times of the day to determine the image properties that most enhance the features required for image analysis. This could be performed manually, and/or could be performed using image analysis techniques, optionally used in conjunction with machine learning, allowing the optimum image properties to be automatically determined.
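- as an illustrative sketch only, this selection step might be implemented along the following lines using the criteria described above; the Pillow library is assumed for EXIF access, and the directory layout and function names are hypothetical:

```python
from pathlib import Path
from PIL import Image
from PIL.ExifTags import TAGS

# Target criteria from the text: shutter speed ~31.25 ms (ISO 80-250) for
# the first image; shutter speed ~1.25 ms (low gain) for the second.
FIRST_TARGET_S = 0.03125
SECOND_TARGET_S = 0.00125

def read_exposure_iso(path):
    """Return (exposure_seconds, iso) from an image's EXIF metadata."""
    exif = Image.open(path).getexif()
    sub = exif.get_ifd(0x8769)  # Exif sub-IFD holds ExposureTime and ISO
    tags = {TAGS.get(k, k): v for k, v in sub.items()}
    iso = tags.get("ISOSpeedRatings", 0)
    iso = int(iso[0] if isinstance(iso, tuple) else iso)
    return float(tags.get("ExposureTime", 0.0)), iso

def select_images(paths):
    """Pick the day's first and second analysis images by EXIF criteria."""
    cands = [(p, *read_exposure_iso(p)) for p in paths]
    # First image: shutter speed closest to ~31.25 ms, ISO in 80-250.
    first = min((c for c in cands if 80 <= c[2] <= 250),
                key=lambda c: abs(c[1] - FIRST_TARGET_S), default=None)
    # Second image: shutter speed closest to ~1.25 ms (short exposure).
    second = min(cands, key=lambda c: abs(c[1] - SECOND_TARGET_S))
    return first, second

# Hypothetical layout: one folder of JPEGs per plot per day.
first, second = select_images(sorted(Path("plot_01/2018-01-15").glob("*.jpg")))
```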
- at step 150, an image of a plot is acquired, typically using the process described above with respect to Figure 1A.
- a geometric analysis of the image is performed at step 155.
- the geometric analysis is performed based on image features in the selected image, for example by identifying the location of defined points, such as vertices of the plot or site.
- Measured plot parameters such as the physical size of the plot are determined at step 160, either based on user inputs, or based on measurements of the plot detected using a physical sensor.
- This information can then be used to calculate a perspective projection model at step 165 which can correct for perspective distortions resulting from the pose of the imaging device relative to the plot. Once the perspective correction model has been determined, this can then be used in analysing the image to allow plant growth features to be determined, and hence used in monitoring plant growth features.
- plant locations are identified at step 170, for example by analysing the image to identify the location of columns and rows of plants and/or boundaries of the region of interest for analysis.
- Canopy features are identified at step 175, typically by examining movement between successive images, in order to detect a top of the canopy on the basis that the canopy will tend to move more than other parts of the image.
- the plant locations, canopy features and perspective projection model are used together with the measured plot parameters to calculate a plant height.
- this is achieved by measuring a distance vertically between the canopy and the base of each plant, with the vertical direction being ascertained using the perspective correction model. The resulting distance can then be scaled, using the measured plot parameters, enabling an actual plant height to be determined.
- the above described approach provides a mechanism for deriving a perspective correction model, which can then be used in analysing plant growth features, for example allowing plant height to be measured by analysing the image to make appropriate measurements of plant height.
- the current approach of using a perspective correction model derived from image features allows features such as the height, as well as the location and extent of the plot within the image, to be calculated from a single monoscopic image captured from any arbitrary orientation relative to the plot. This enables the height to be reliably detected using a basic hardware configuration in a wide variety of situations, making the approach particularly suitable for wide scale deployment.
- the height can be measured by projecting upwardly from a plant location on a ground plane to a canopy feature, using the perspective correction model, with the measured distance being utilised in order to calculate plant height.
- the perspective projection model can be used to project upwardly from the plant location on a ground plane to the top of the canopy, using this to determine the height of each plant.
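- the projection arithmetic is not prescribed here, but one standard formulation consistent with this description is single-view metrology in the style of Criminisi et al., sketched below; this is not necessarily the construction used in the detailed example that follows, the vanishing line, vertical vanishing point and a reference of known height are assumed to have already been recovered, and all names and values are illustrative:

```python
import numpy as np

def alpha_height(base, top, v, l):
    """Scaled height alpha*Z of a vertical segment (single-view metrology,
    after Criminisi et al.): alpha*Z = -|b x t| / ((l . b) |v x t|), with
    base/top the homogeneous image points of the plant's ground location
    and canopy top, v the vertical vanishing point, and l the ground-plane
    vanishing line."""
    return -np.linalg.norm(np.cross(base, top)) / (
        np.dot(l, base) * np.linalg.norm(np.cross(v, top)))

def plant_height(base, top, ref_base, ref_top, ref_height, v, l):
    """Metric plant height, using a reference segment of known height
    (e.g. a measured plot dimension standing in for a ruler)."""
    alpha = alpha_height(ref_base, ref_top, v, l) / ref_height
    return alpha_height(base, top, v, l) / alpha

# Illustrative values only: [x, y, 1] homogeneous image coordinates.
v = np.array([512.0, -3000.0, 1.0])   # vertical (third) vanishing point
l = np.array([0.001, 0.002, -1.0])    # horizon / ground-plane vanishing line
height_m = plant_height(np.array([300.0, 600.0, 1.0]),   # plant base
                        np.array([310.0, 420.0, 1.0]),   # canopy top
                        np.array([100.0, 620.0, 1.0]),   # reference base
                        np.array([105.0, 380.0, 1.0]),   # reference top
                        1.0, v, l)                        # reference is 1.0 m
```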
- the method includes analysing a selected image to identify ground points and then using the ground points to determine the ground plane, with this being used to determine the location of each plant on the ground plane.
- the perspective projection model can be calculated using any image features, and could for example be based on set markers, such as flags positioned in the plot and hence present within the image. In one preferred example however this is performed on the basis of site vertices, site edges, plot vertices, plot edges and/or plot rows as these will typically be present in each image of a plot and also be readily identifiable, for example using image analysis techniques, such as edge detection or the like.
- the method for determining the perspective projection model includes calculating vanishing points, including a first vanishing point based on lines parallel to crop rows, a second vanishing point based on lines perpendicular to crop rows, and a third vanishing point based on lines parallel to crop stems. The vanishing points are then utilised to correct for perspective within the image, for example allowing measurements to be made upwardly from a plant location to the top of the canopy, and allowing the top of the canopy to be accurately located.
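- in homogeneous coordinates, each such vanishing point can be computed as the intersection of two fitted lines via cross products; a minimal sketch, with illustrative point coordinates, follows:

```python
import numpy as np

def line_through(p, q):
    """Homogeneous line through two (x, y) image points."""
    return np.cross([p[0], p[1], 1.0], [q[0], q[1], 1.0])

def intersect(l1, l2):
    """Intersection of two homogeneous lines as an (x, y) point."""
    x, y, w = np.cross(l1, l2)
    return np.array([x / w, y / w])  # w near 0 means near-parallel lines

# Illustrative coordinates: two lines fitted along crop rows give the
# first vanishing point; lines across rows and along stems give the rest.
vp1 = intersect(line_through((120, 700), (480, 420)),
                line_through((300, 760), (560, 430)))
```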
- the above described methods may also involve analysing a selected image to identify ground points and using ground points to identify movement of the camera relative to the plot between different images. This can be used to correct for relative movement between the camera and the plot, for example in case the camera is moved between successive images. For example, this can allow a single perspective projection model to be calculated, with this being corrected on a daily basis based on the position of the camera relative to the plot.
- the process for determining flowering status is typically performed based on an analysis of colour in a captured image. Specifically, in one example, this involves extracting a saturation channel for the selected image, enhancing a contrast in the saturation channel and then identifying flowers using the contrast enhanced saturation channel by performing a connected component analysis.
- this typically involves extracting a saturation channel from HSB (hue, saturation and brightness) space to create a saturation channel image.
- the saturation channel image is then inverted to create an inverted image with local equalisation being applied to the inverted image to create a contrast enhanced image.
- a colour threshold can then be applied to the contrast enhanced image to create a binary image with this then being analysed to perform the connected component analysis.
- the use of a contrast enhanced saturation channel allows regions having a colour corresponding to the expected flower colour to be identified more accurately, and to be more easily distinguished from noise and other parts of the plant.
- the method of performing the connected component analysis typically involves identifying connected components in the binary image, counting the number of large connected components that contain 5% or more of the pixels in the binary image, and determining that the plot is flowering if there is more than one additional large connected component detected in the plot than on a previous day.
- this aims to identify rapid increases in large connected components, which will in turn correspond to a flowering event for the plants in the plot.
- the method can also include analysing a selected image to identify green pixels, and then using a number of detected green pixels to determine an emergence or canopy cover status.
- the number of green pixels is determined by identifying pixels for which the green channel value, as a proportion of the sum of the red, green and blue channels, is greater than about 0.34, with emergence being detected if any green pixels are identified in a plot, and with the canopy cover being based on a proportion of green pixels to total pixels for the plot.
- this green pixel detection threshold varies for image sensors that have different in-built colour filtering; for example, for the Britecell image sensor of the Samsung Galaxy S7, a more appropriate value is about 0.31.
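- a minimal sketch of this segmentation, assuming OpenCV for image loading and using the thresholds given above, might look as follows (the file name is illustrative):

```python
import numpy as np
import cv2

def green_mask(image_bgr, threshold=0.34):
    """Mask of green pixels: G as a proportion of R+G+B above a threshold.
    0.34 suits the sensors described above; ~0.31 is indicated for the
    Samsung Galaxy S7's Britecell sensor."""
    channels = image_bgr.astype(np.float64)
    total = channels.sum(axis=-1) + 1e-9        # avoid division by zero
    return (channels[..., 1] / total) > threshold  # index 1 is G in BGR

plot = cv2.imread("plot_roi.png")               # hypothetical plot region
mask = green_mask(plot)
emerged = bool(mask.any())                      # emergence: any green pixel
canopy_cover = float(mask.mean())               # cover: green / total pixels
```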
- the presence of disease and/or pests can be determined by identifying other defined colours, in which case the process typically involves applying colour segmentation to the selected image to identify pixels having a defined colour, and then using a number of identified pixels to determine a presence of a disease or pests. In this instance, it will be appreciated that different colours might be used to identify different pests or diseases. Additionally, a reduction in the amount of green pixels can also be used to identify the presence of pests and/or diseases, either through obstruction or consumption of plant material.
- analysing growing conditions can be performed in a similar manner, for example by examining soil colouring to determine an extent of soil moisture, identifying rain, fog, sunlight levels or the like.
- a plant profile is determined which is indicative of expected plant growth features. This can define for a given plant, the features that would be expected at a particular stage of growth, for example defining how plant height and canopy cover should vary over time after emergence. Monitored plant growth features can then be compared to the expected plant growth features defined in the plant profile to determine a plant status. For example, this can be used to identify circumstances in which plant growth is below expectations, which could in turn be indicative of an issue, such as poor growing conditions, insufficient fertiliser application or the like. In situations in which plant growth features fall outside expected plant growth feature ranges, a notification can be generated, for example to alert a crop manager, or farmer, that intervention might be required.
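- as an illustrative sketch, with hypothetical profile fields and threshold values, the comparison against a plant profile might be implemented as follows:

```python
from dataclasses import dataclass

@dataclass
class ProfileEntry:
    """Expected feature ranges at a given number of days after emergence."""
    days: int
    min_height_m: float
    max_height_m: float
    min_canopy_cover: float

def check_status(days_after_emergence, height_m, canopy_cover, profile):
    """Compare monitored features to the plant profile; a non-empty
    result would trigger a notification to the crop manager."""
    entry = min(profile, key=lambda e: abs(e.days - days_after_emergence))
    notices = []
    if not entry.min_height_m <= height_m <= entry.max_height_m:
        notices.append(f"height {height_m:.2f} m outside expected range")
    if canopy_cover < entry.min_canopy_cover:
        notices.append(f"canopy cover {canopy_cover:.0%} below expected")
    return notices

profile = [ProfileEntry(14, 0.05, 0.20, 0.02), ProfileEntry(42, 0.40, 0.90, 0.30)]
alerts = check_status(40, 0.25, 0.18, profile)  # both features lag the profile
```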
- the above described techniques allow images of plants to be analysed to monitor characteristics of the plants, including the plant height, emergence date, canopy cover and flowering status.
- the images can be captured using an imaging device, for example forming part of a client device, or custom imaging system, with the images optionally being transferred to a remote server for analysis.
- one or more processing systems 210 are provided coupled to one or more client devices 220 and one or more imaging systems 230, via one or more communications networks 240, such as the Internet, and/or a number of local area networks (LANs).
- whilst single processing systems 210, client devices 220 and imaging systems 230 are shown, any number could be provided, and the current representation is for the purpose of illustration only.
- the configuration of the networks 240 is also for the purpose of example only, and in practice the processing systems 210, client devices 220 and imaging systems 230 can communicate via any appropriate mechanism, such as via wired or wireless connections, including, but not limited to, mobile networks, private networks, such as 802.11 networks, the Internet, LANs, WANs, or the like, as well as via direct or point-to-point connections, such as Bluetooth, or the like.
- whilst the processing systems 210, client devices 220 and imaging systems 230 are identified as different entities, it will be appreciated from the following that there can be overlap in capabilities, so that for example a client device 220 could also function as an imaging system 230; hence the differentiation between devices is for the purpose of illustration and is not intended to be limiting.
- the processing systems 210 are adapted to receive and analyse images captured by the imaging systems 230, and provide access to resulting analysis via the client devices 220.
- the processing systems 210 are shown as single entities, it will be appreciated they could include a number of processing systems distributed over a number of geographically separate locations, for example as part of a cloud based environment. Thus, the above described arrangements are not essential and other suitable configurations could be used.
- the processing system 210 includes at least one microprocessor 300, a memory 301, an optional input/output device 302, such as a keyboard and/or display, and an external interface 303, interconnected via a bus 304 as shown.
- the external interface 303 can be utilised for connecting the processing system 210 to peripheral devices, such as the communications networks 240, databases, other storage devices, or the like.
- a single external interface 303 is shown, this is for the purpose of example only, and in practice multiple interfaces using various methods (e.g. Ethernet, serial, USB, wireless or the like) may be provided.
- the microprocessor 300 executes instructions in the form of applications software stored in the memory 301 to allow the required processes to be performed.
- the applications software may include one or more software modules, and may be executed in a suitable execution environment, such as an operating system environment, or the like.
- the processing systems 210 may be formed from any suitable processing system, such as a suitably programmed PC, web server, network server, or the like.
- the processing system 210 is a standard processing system such as an Intel Architecture based processing system, which executes software applications stored on non-volatile (e.g., hard disk) storage, although this is not essential.
- the processing system could be any electronic processing device such as a microprocessor, microchip processor, logic gate configuration, firmware optionally associated with implementing logic such as an FPGA (Field Programmable Gate Array), or any other electronic device, system or arrangement.
- the client device 220 includes at least one microprocessor 400, a memory 401, an input/output device 402, such as a keyboard and/or display and an external interface 403, interconnected via a bus 404 as shown.
- the external interface 403 can be utilised for connecting the client device 220 to peripheral devices, such as the communications networks 240, databases, other storage devices, or the like.
- a single external interface 403 is shown, this is for the purpose of example only, and in practice multiple interfaces using various methods (e.g. Ethernet, serial, USB, wireless or the like) may be provided.
- the client device 220 may also optionally include other interfaces, including an imaging device, such as a camera or the like.
- the microprocessor 400 executes instructions in the form of applications software stored in the memory 401 to allow communication with one of the processing systems 210.
- the client device 220 may be formed from any suitably programmed processing system and could include a suitably programmed PC, Internet terminal, laptop, hand-held PC, tablet, smart phone, or the like.
- the client device 220 can be any electronic processing device such as a microprocessor, microchip processor, logic gate configuration, firmware optionally associated with implementing logic such as an FPGA (Field Programmable Gate Array), or any other electronic device, system or arrangement.
- the imaging system includes at least one microprocessor 500, a memory 501, an optional input/output device 502, such as a keyboard and/or display, and an external interface 503, interconnected via a bus 504 as shown.
- the external interface 503 can be utilised for connecting the imaging system 230 to peripheral devices, such as the communications networks 240, databases, other storage devices, or the like.
- a single external interface 503 is shown, this is for the purpose of example only, and in practice multiple interfaces using various methods (e.g. Ethernet, serial, USB, wireless or the like) may be provided.
- the imaging system 230 also includes an imaging device 505, such as a camera or the like.
- the microprocessor 500 executes instructions in the form of applications software stored in the memory 501 to allow the required processes to be performed.
- the applications software may include one or more software modules, and may be executed in a suitable execution environment, such as an operating system environment, or the like.
- the imaging systems 230 may be formed from any suitable imaging system, such as network enabled cameras, smartphones or the like.
- alternatively, the imaging system 230 could be any electronic processing device such as a microprocessor, microchip processor, logic gate configuration, firmware optionally associated with implementing logic such as an FPGA (Field Programmable Gate Array), or any other electronic device, system or arrangement.
- the imaging device 505 can be provided in a housing 531, which is supported by a tower 532, extending upwardly from a base 533.
- the tower 532 may be connected to the base 533 via a drive unit 534 allowing an orientation of the housing 531 and hence a camera 505 to be adjusted, for example allowing a single imaging device to be used to capture images of multiple different plots.
- the processing system 210 is a server that operates to analyse images, with actions performed by the processing system 210 being performed by the processor 300 in accordance with instructions stored as applications software in the memory 301.
- the client device 220 is a user device to allow user interaction with the system, for example to review the results of image analysis, with actions performed by the client device 220 being performed by the processor 400 in accordance with instructions stored as applications software in the memory 401 and/or input commands received from a user via the I/O device 402.
- the imaging system is a smartphone that captures and uploads images to the server 210, with actions performed by the imaging system 230 being performed by the processor 500 in accordance with instructions stored as applications software in the memory 501.
- the imaging system consisted of a smartphone and solar power system, supported by a raised platform, with this being used for image capture of the whole trial site throughout the crop season at a high temporal resolution (sub-daily).
- the camera tower provides a large set of images at different times of day, giving a suitable dataset to enable development of a robust image analysis algorithm.
- whilst UAVs (Unmanned Aerial Vehicles) or ground vehicles could be used, these typically require human oversight, and hence are less desirable.
- a smartphone was selected as the imaging system as it integrates a GPS, processor, camera and internet connection. Sony Xperia Z2 and Samsung Galaxy S7 smartphones were selected. An App was developed to collect an image every 20 minutes and upload it to a specified server. The mobile application was developed for Android phones, and written in Java and Android application framework.
- the imaging system acquires an image from the camera 505, with the image being uploaded to the server 210 at step 605.
- the server then operates to save the image at step 610 with this process being repeated periodically throughout the day.
- Any number of appropriate images can be captured, but in one example images are captured every 20, 30 or 60 minutes during daylight hours, although other intervals could be used.
- leaves appear yellow early in the day due to forward scattering of light (i.e. sun in front of camera), as shown in Figures 7A to 7D.
- the colours in the image shown in Figure 7E, captured at 1:35 pm, are less saturated than at other times due to the camera performing automatic adjustments of exposure, gain and/or white balance to capture balanced images for the varying natural lighting.
- the flowers in the image captured at 5:35 pm shown in Figure 7G are brighter than the leaves and appear yellow.
- the image captured at 7:00 pm shown in Figure 7H has reduced sharpness because of low light.
- the colour is most intense in full sunlight around midday, shown in Figure 7D, which assists in enabling robust identification of canopy cover against a soil or stubble background.
- the optimal time for flower segmentation appears to be late afternoon at a time corresponding to sunset, in which there was back scattering of light (sun behind camera). Similarly, the optimal time for canopy segmentation was around midday.
- the server 210 retrieves all of the images for an entire day for a respective plot and examines the image properties at step 620, with the image properties being compared to predetermined image property criteria to select respective first and second images at step 625.
- this involves inspecting the EXIF (Exchangeable Image File Format) settings stored in the file properties of photos, including common settings such as exposure and gain.
- Automatic settings allow the camera to decide the optimal settings for aesthetically pleasing brightness and contrast for the given lighting situation.
- a first image for performing height and flowering analysis is typically selected based on the image closest to the following criteria: a shutter speed closest to, or less than or equal to, about 31.25 ms, with an ISO of between 80 and 250, and more typically about 64.
- a second image for performing emergence and canopy cover analysis is typically selected based on the image closest to the following criteria: a short exposure time (a shutter speed closest to, or less than or equal to, about 1.25 ms), a low gain (an ISO closest to about 50), captured closest to about 12 pm, and preferably having the largest file size.
- Figures 8A to 8D show a subset of images automatically selected using this process, which show a consistency in appearance despite being captured at different times with different settings. Additionally, the graph of Figure 9 shows shutter speed and ISO settings for images captured under a range of lighting situations, and demonstrates that there is a set of exposure and gain parameter values that results in selection of images with consistent appearance for image analysis.
- ground points within the image are identified, with this being used to perform movement correction at step 635.
- This enables the ground plane to be located if the camera moves because of wind or maintenance, or if the camera is moved in order to image different plots.
- This also enables the ground plane to be determined from imagery collected using ground or aerial vehicles.
- Figures 10A to 10C show the change in the coordinates for one of the ground plane points 1000 in daily images. This demonstrates that a tracking algorithm is required to update the ground plane in the three point perspective. The selection of control points could be implemented automatically using flags in the field.
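- the tracking algorithm itself is not specified here; one plausible implementation, sketched below under that assumption, tracks the control points between daily images using sparse Lucas-Kanade optical flow in OpenCV:

```python
import numpy as np
import cv2

def update_ground_points(prev_gray, curr_gray, ground_points):
    """Track ground control points between daily images using sparse
    Lucas-Kanade optical flow, so the ground plane of the perspective
    model can follow small camera movements.

    ground_points: Nx2 float32 array of (x, y) control point coordinates.
    """
    pts = ground_points.reshape(-1, 1, 2).astype(np.float32)
    new_pts, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, curr_gray, pts, None, winSize=(31, 31), maxLevel=3)
    tracked = status.ravel() == 1
    updated = ground_points.copy()
    updated[tracked] = new_pts.reshape(-1, 2)[tracked]  # keep lost points as-is
    return updated
```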
- each image is uploaded to the server 210. However, it will be appreciated that this is not essential, and alternatively images could be stored locally by the imaging system 230, and then analysed using the above described process, with only selected images being uploaded to the server 210 as required.
- the process is typically performed sequentially, with emergence and canopy cover analysis being performed initially to detect the early stages of growth, and the process only proceeding to performing height and/or flowering analysis in the event that the plants are sufficiently developed.
- this is not essential and it will be appreciated that alternatively each of the analysis processes could be performed in parallel.
- a selected second image for the current day is initially retrieved at step 1100.
- at step 1105, it is determined whether emergence has previously been recorded, and if not the process moves on to step 1110 to perform emergence analysis.
- at step 1115, it is determined whether emergence is detected, and if not the process returns to step 1100, allowing an image analysis to be performed on a subsequent day. Otherwise, if emergence is detected, the emergence date is recorded at step 1120.
- the second image is analysed to perform a canopy analysis at step 1125, with this being used to determine a level of canopy cover, which is then recorded. This can optionally be compared to a threshold to determine if the canopy cover has reached a certain level. If not, this suggests that the plants are at an early growth stage and hence additional analysis may not be required, allowing the process to return to step 1100 to analyse an image on a subsequent day.
- a first image for the day is then retrieved at step 1135.
- a height analysis is then performed at step 1140 with the resulting height being stored and optionally compared to a threshold at step 1145. This is performed to determine if the plants are sufficiently developed for flowering to potentially occur. If not, the process returns to step 1100 otherwise a flowering analysis is performed at step 1150, with results being recorded.
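- a minimal sketch of this sequential flow is shown below; the four detector callables and the threshold values are illustrative stand-ins for the analyses described in the following sections:

```python
def analyse_day(state, second_image, first_image, detect_emergence,
                canopy_cover, estimate_height, detect_flowering,
                canopy_threshold=0.2, height_threshold_m=0.3):
    """One day of the Figure 11 flow. The callables wrap the emergence,
    canopy, height and flowering analyses; thresholds are illustrative."""
    if state.get("emergence_date") is None:
        if not detect_emergence(second_image):           # steps 1110-1115
            return state                                 # retry next day
        state["emergence_date"] = state["date"]          # step 1120
    state["canopy_cover"] = canopy_cover(second_image)   # step 1125
    if state["canopy_cover"] < canopy_threshold:
        return state                  # plants too young for height/flowering
    state["height_m"] = estimate_height(first_image)     # step 1140
    if state["height_m"] >= height_threshold_m:          # step 1145
        state["flowering"] = detect_flowering(first_image)  # step 1150
    return state
```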
- a first image is selected, with this being used to identify certain image features at step 1205.
- the image features typically correspond to vertices of the plots, and examples of these are shown as A, B, C, D, E and F in Figure 13A.
- This can be performed using image analysis techniques. However, as this typically only needs to be completed a single time for each plot, for each camera and position, and can be corrected based on tracked relative movement between the camera and plot, this can alternatively be performed manually.
- a perspective projection model is calculated using a perspective transformation algorithm, which is required to identify the location of plots in a perspective image for the image analysis algorithm.
- a three point perspective model is required with: (i) all lines parallel to crop rows leading to the first vanishing point; (ii) all lines perpendicular to crop rows leading to the second vanishing point; and (iii) all lines parallel to crop stems leading to the third vanishing point.
- the perspective algorithm involves obtaining parameters of a plot as observed in the camera image, including a number of rows and columns (a minimum of one each), and a length and width of each plot in metres. From the image, features are analysed in order to establish the perspective vanishing lines on the ground plane of the camera image. From this, the perspective projection model is calculated, with this being used to project the trial layout grid onto the ground plane of the camera image and then calculate the detected height of each plot from pixels within the top of each plot where flowers would be located.
- lines 1311, 1312 are fitted through the crop rows, with a first vanishing point 1301 being identified where these lines intersect on the ground plane.
- lines 1313, 1314 are fitted through points A-D and B-C, with a second vanishing point 1302 being identified where these lines intersect, as shown in Figure 13C.
- a line 1315 is provided between the first and second vanishing points, as shown by the dotted line in Figure 13D, with this representing a horizon line.
- the horizon line is then offset so it passes through the line 1313 to form a measure line 1316, extending from an intersection between the measure line 1316 and line 1311 to point D.
- the measure line 1316 is then sub-divided and extended to the first vanishing point to get horizontal plot spacing, as shown by the lines 1317 in Figure 13E.
- Lines 1318, 1319 are fitted through the points A-E and C-F, with an intersection between the lines 1318, 1319 corresponding to a third vanishing point 1303, shown in Figure 13G.
- Lines 1320 at specified heights are projected from the third vanishing point 1303 to horizontal plot marker lines on the line 1313, with these then being projected to the first vanishing point 1301 as lines 1321.
- Lines 1322 are then projected from the third vanishing point 1303 to each plot along the left hand side of the site.
- the line 1322 passing through point A is projected to the first vanishing point 1301, to form line 1323 shown in Figure 13J, with the point of intersection of lines 1322 and 1323 being projected to the second vanishing point to form lines 1324, as shown in Figure 13K.
- at step 1215, plant locations are identified, with a current image being compared to previous images at step 1220 to identify canopy movement at step 1225.
- the tracking algorithm tracks visual canopy features in the centre of each plot daily from emergence.
- Figures 14A to 14H and 15A to 15H demonstrate the motion tracking for maize and soybean weekly from emergence.
- motion tracking is achieved using the OpenCV TrackerKCF framework, implemented with additional constraints where the tracked object only moves upward and within the plot boundaries.
- the distance that the tracked box moved was related to actual height using a perspective transformation, by projecting the boundary of the tracked plot to the second vanishing point to determine the actual height, with the detected location of the object being projected to the ground plane along lines to the third vanishing point, as shown in Figure 16.
- This identifies the top of each plant, which can then be related to the actual height, so that the geometry of the scene perspective converts the tracked movement in pixels to distance for height estimation.
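- a sketch of the tracking step using OpenCV's KCF tracker, with the upward-only and in-plot constraints applied outside the tracker as described above, is shown below; the box handling details are illustrative:

```python
import cv2

def track_canopy(frames, init_box, plot_box):
    """Track canopy features through daily frames with a KCF tracker.

    Boxes are (x, y, w, h). Requires opencv-contrib-python; on some
    OpenCV 4.5+ builds the factory is cv2.legacy.TrackerKCF_create.
    """
    tracker = cv2.TrackerKCF_create()
    tracker.init(frames[0], init_box)
    box = init_box
    tops = [box[1]]                        # y of the box's top edge, px
    for frame in frames[1:]:
        ok, candidate = tracker.update(frame)
        x, y = candidate[0], candidate[1]
        in_plot = plot_box[0] <= x <= plot_box[0] + plot_box[2]
        # Constraints from the text: the canopy only moves upward
        # (decreasing y in image coordinates) and stays inside the plot.
        if ok and in_plot and y <= box[1]:
            box = candidate
        tops.append(box[1])
    return tops   # converted to metres via the perspective model elsewhere
```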
- a saturation channel of HSB space is extracted at step 1700, with the saturation channel being inverted at step 1705, before contrast enhancement is performed at step 1710 by applying a local equalisation (200 x 200) to the image.
- a colour threshold is then applied to the image to convert the image to black and white, with pixels having a value less than 200 becoming black and pixels with a value greater than 200 becoming white, with example resulting images being shown in Figures 18E and 18F.
- at step 1720, connected components are identified in the image using known approaches, with the number of connected components having a number of pixels equal to or greater than 5% of the pixels in the image being counted at step 1725.
- the count is compared to the count from the previous day at step 1730, with the flowering status being determined based on the results of the comparison at step 1735.
- the plot is determined to be flowering if there is more than one additional large connected component detected in the plot than on the previous day.
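- a sketch of steps 1700 to 1735 using OpenCV is shown below; CLAHE is used here as a stand-in for the 200 x 200 local equalisation, the threshold details are illustrative, and OpenCV's HSV saturation channel corresponds to the HSB saturation described:

```python
import cv2

def count_flower_components(image_bgr, min_frac=0.05):
    """Steps 1700-1725: count large bright components in the inverted,
    locally equalised saturation channel of the image."""
    sat = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)[..., 1]   # step 1700
    inverted = 255 - sat                                       # step 1705
    # Step 1710: CLAHE over roughly 200 px tiles stands in for the
    # local equalisation described in the text.
    clahe = cv2.createCLAHE(clipLimit=2.0,
                            tileGridSize=(max(1, sat.shape[1] // 200),
                                          max(1, sat.shape[0] // 200)))
    enhanced = clahe.apply(inverted)
    _, binary = cv2.threshold(enhanced, 200, 255, cv2.THRESH_BINARY)  # 1715
    n, _labels, stats, _cents = cv2.connectedComponentsWithStats(binary)
    min_pixels = min_frac * binary.size
    # Label 0 is the background; count components covering >= 5% of pixels.
    return sum(1 for i in range(1, n)
               if stats[i, cv2.CC_STAT_AREA] >= min_pixels)

def is_flowering(count_today, count_yesterday):
    """Steps 1730-1735: flowering if more than one extra large component."""
    return (count_today - count_yesterday) > 1
```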
- a second image is selected which is not noisy (low gain) and bright (short exposure), to emphasise the green colour of plants.
- Colour segmentation is applied to the second image at step 1905, with this being used to identify green pixels at step 1910.
- this is achieved by identifying pixels for which the green value, as a proportion of the sum of the red, green and blue channels, is greater than 0.34, although other values might be used for other imaging devices.
- for the Samsung Galaxy S7, the threshold used was 0.31 instead of 0.34.
- at step 1915, it is determined whether these are the first green pixels, and if so, an emergence date is determined at step 1920. Otherwise, the green pixels are used to calculate a canopy cover at step 1925, based on a proportion of detected green pixels to total pixels in the plot.
- Example images illustrating progression of plant emergence over six days for a soybean crop are shown in Figures 20A to 20H, with Figures 20A, 20C, 20E, 20G showing original second images and Figures 20B, 20D, 20F, 20H showing processed images. Similarly, original and processed images for use in calculating canopy cover are shown in Figures 21A and 21B respectively.
- Results are presented for height, flowering, emergence rate and canopy cover. These results are presented for individual varieties of maize and soybean, and also according to bay number, where bays 1, 2, 3 and 4 are 20 m, 30 m, 40 m and 50 m from the camera tower, respectively. This allows the accuracy of the image analysis to be assessed as a function of the distance of the crop from the camera.
- Figures 22A to 22D and 23A to 23H compare height from the object tracking algorithm with ground truthing measurements for soybean and maize, for each of the four bays, respectively. These show a linear relationship between the measured and detected height. The 26 December maize height is underestimated for all bays because the manual measurements of the maize included the flower stalk, while the image analysis tracks the leaves throughout the season as the plants grow, rather than the flower stalk.
- Figures 24A to 24D and 25A to 25D show height curves for each variety of soybean and maize respectively.
- Tables 1, 2A and 2B compare the error in maize and soybean height for each camera tower and distance of plots from the camera tower. These show that the error in height for soybean was 3.9-16.4% (1.4-4.4 cm) and for maize was 3.8-15.8% (4.8-12.4 cm) in the 2016/17 trial (Table 2A), and for maize was 0.7-6.9% (5.9-15.6 cm) in the 2018 trial (Table 2B).
- the error in height detection generally increased as the camera was further from the plot.
- the error in height detected for soybean was higher for the N-S tower than the S-N tower.
- in the 2016/17 season, the average error was 6.5%, 5.9%, 7.7% and 8.5% for bays one, two, three and four from the camera, respectively. There was no significant difference in height detected between the uniform crop and variety trial. In the 2018 season, the average error was 2.1%, 3.9%, 5.0% and 5.7% for bays one, two, three and four from the camera, respectively.
- the duration of flowering was estimated using the image analysis, with results being shown in Table 3B for the 2018 season.
- the image analysis determined initial flowering date and maximum flowering date for the plots on average within one day in the 2018 season.
- Table 3B
[0193] The graphs shown in Figures 26A to 26D compare the detected flowering dates for different distances from the 6 m tower. This shows that the proportion of detected flower pixels decreased as the plots were further from the camera. However, flowering was still detected in plots four bays (50 metres) from the camera.
- Tables 4 and 5 compare the canopy cover and ground truthing measurements for each camera tower and bay number from the camera.
- the error in canopy cover estimation was larger for maize (1.9-11.8%) than for soybean (4.5-5.7%), primarily because the image analysis measured the percentage of green pixels, while the field measurements were of the percentage width of the canopy in the crop row.
- Maize leaves are also thinner than soybean leaves and may have had gaps between them, leading to lower canopy cover estimates from the image analysis than from the field measurements.
- the above described arrangements provide mechanisms to monitor plant growth features.
- the mechanisms can be used to allow plant growth in plots to be monitored using a monoscopic camera operating in a substantially automated fashion. This avoids the need for more complex sensing arrangements, whilst allowing measurements to be accurately captured over the entire growing period.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Radiology & Medical Imaging (AREA)
- Environmental Sciences (AREA)
- Quality & Reliability (AREA)
- Medical Informatics (AREA)
- Biodiversity & Conservation Biology (AREA)
- Ecology (AREA)
- Forests & Forestry (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Wood Science & Technology (AREA)
- Botany (AREA)
- Remote Sensing (AREA)
- Mechanical Engineering (AREA)
- Soil Sciences (AREA)
- Image Processing (AREA)
Abstract
A method of monitoring plant growth features, the method including, in one or more electronic processing devices: on each of a plurality of different time periods: acquiring a plurality of images of the plot captured by an imaging device; determining image properties of the acquired images; selecting at least one image for analysis; analysing selected images for the plurality of time periods; and, using results of the analysis to monitor plant growth features.
Description
PLANT GROWTH FEATURE MONITORING
Background of the Invention
[0001] The present invention relates to a method and apparatus for monitoring plant growth features, and in one particular example to a method and apparatus for monitoring features relating to plant growth of crops in a plot.
Description of the Prior Art
[0002] The reference in this specification to any prior publication (or information derived from it), or to any matter which is known, is not, and should not be taken as an acknowledgment or admission or any form of suggestion that the prior publication (or information derived from it) or known matter forms part of the common general knowledge in the field of endeavour to which this specification relates.
[0003] A number of different mechanisms have been proposed for monitoring plant growth, but each of these generally suffers from one or more drawbacks.
[0004] For example, it is known to analyse crop growth based on point clouds obtained from drone mounted sensors, as set out for example in Nie, S, Wang, C, Xi, X, Luo, S, Li, S and Tian, J (2018) "Estimating the height of wetland vegetation using airborne discrete-return LiDAR data", Optik - International Journal for Light and Electron Optics 154:267-274 and Sankey, T, Donager, J, McVay, J and Sankey, J (2017) "UAV lidar and hyperspectral fusion for forest monitoring in the southwestern USA", Remote Sensing of Environment 195:30-43. Similarly, the use of drone mounted stereoscopic imagery is also known from Malambo, L, Popescu, SC, Murray, SC, Putman, E, Pugh, NA, Horne, DW, Richardson, G, Sheridan, R, Rooney, WL, Avant, R, Vidrine, M, McCutchen, B, Baltensperger, D and Bishop, M (2018) “Multitemporal field-based plant height estimation using 3D point clouds generated from small unmanned aerial systems high-resolution imagery”, International Journal of Applied Earth Observation and Geoinformation 64:31-42 and Zarco-Tejada, PJ, Diaz-Varela, R, Angileri, V and Loudjani, P (2014) “Tree height quantification using very high resolution imagery acquired from an unmanned aerial vehicle (UAV) and automatic 3D photo reconstruction methods”, European Journal of Agronomy 55:89-99.
[0005] However, UAVs require manual data collection and this may not be practical for all scenarios, whilst the sensors are typically expensive, making wide scale deployment problematic.
[0006] As an alternative approach Sritarapipat, T, Rakwatin, P and Kasetkasem, T (2014) “Automated rice crop measurement using a field server and digital image processing”, Sensors 14(1):900-926 and McCarthy, A and Tscharke, M (2014) “Automated camera-based height and flower detection for wheat and chickpea” in: 5th International Workshop on Applications of Computer Image Analysis and Spectroscopy in Agriculture (ASABE 2014), 12-13 Jul 2014, Montreal, Canada, describe comparing plant height with a marker bar standing next to the plants using cameras and image analysis. However, such marker bars typically only provide very localised measurements immediately surrounding the marker bar, can be difficult to detect in images and require positioning of a marker bar in each plot, which is unsuitable for wide scale deployment.
[0007] A further alternative approach is analysis of a shadow length in images, as described in Liasis, G and Stabrou, S (2016) “Satellite image analysis for shadow detection and building height detection”, ISPRS Journal of Photogrammetry and Remote Sensing 119:437-450 and McCarthy, A and Tscharke, M (2014) (above). However, measuring the length of shadows requires line of sight to the ground and is not therefore suitable for use in closed canopy scenarios.
[0008] Flowering may be detected through both optical filtering and image analysis techniques. Image analysis techniques that have reported use for segmenting flowers include Local Binary Patterns to determine the strength of textural differences within the image (Guo, Z, Zhang, L and Zhang, D (2010) “Rotation invariant texture classification using LBP variance (LBPV) with global matching”, Pattern Recognition 43(3):706-719 and Ojala, T, Pietikainen, M and Maenpaa, T (2002) “Multiresolution gray-scale and rotation invariant texture classification with local binary patterns”, IEEE Transactions on Pattern Analysis and Machine Intelligence 24(7):971-987).
[0009] The use of colour segmentation is described in Thorp, KR and Dierig, DA (2011) “Color image segmentation approach to monitor flowering in lesquerella”, Industrial Crops
and Products 34(1):1150-1159, whilst template matching (shape, size and texture) is described in Lowe, DG (1999) “Object recognition from local scale-invariant features”, In: Proceedings of the 7th IEEE International Conference on Computer Vision, Vol. 2, 1150-1157. Spectral reflectance and colour segmentation techniques have been used to automatically determine when crops are flowering (Thorp and Dierig 2011 (above) and Thorp, KR, Dierig, DA, French, AN and Hunsaker, DJ (2011) “Analysis of hyperspectral reflectance data for monitoring growth and development of lesquerella”, Industrial Crops and Products 33(2):524-531). However, these techniques only allow specific analysis to be performed and cannot be used to monitor plant growth more broadly.
Summary of the Present Invention
[0010] In one broad form, an aspect of the present invention seeks to provide a method of monitoring plant growth features, the method including, in one or more electronic processing devices: on each of a plurality of different time periods: acquiring a plurality of images of the plot captured by an imaging device; determining image properties of the acquired images; and, selecting at least one image for analysis; analysing selected images for the plurality of time periods; and, using results of the analysis to monitor plant growth features.
[0011] In one embodiment the method includes performing at least one of: a height analysis to determine a plant height; a flowering analysis to determine a flowering status; an emergence analysis to determine an emergence date; a canopy analysis to determine a canopy extent; a disease/pest analysis to determine a presence of disease/pests; a status analysis to determine a plant status; and, a condition analysis to determine growing conditions.
[0012] In one embodiment the method includes: selecting a first image that is used to determine at least one of: a plant height; and, a flowering status; and, selecting a second image that is used to determine at least one of: an emergence rate; a canopy cover; a presence of disease/pests; and, growing conditions.
[0013] In one embodiment the image properties include: for the first image, at least one of: a shutter speed closest to about 31.25 ms; a shutter speed of less than or equal to about 31.25 ms; an ISO of between 80 and 250; and, an ISO of about 64; and, for the second image, at least one of: has a short exposure time; has a low gain; a shutter speed closest to about 1.25 ms; a shutter speed of less than or equal to about 1.25 ms; an ISO closest to about 50; an image captured closest to about 12 pm; and, an image having a largest file size.
[0014] In one embodiment the images are monoscopic images and the method includes performing a geometric analysis of the image to correct for perspective distortion.
[0015] In one embodiment the method includes: performing a geometric analysis of image features in the selected image to determine a perspective projection model from measured plot parameters; and, analysing the image using the perspective projection model.
[0016] In one embodiment the measured plot parameters include: a number of rows and columns; and, a plot length and width.
[0017] In one embodiment the image features include at least one of: site vertices; site edges; plot vertices; plot edges; and, plot rows.
[0018] In one embodiment the method includes determining measurements of plant height from the image using the perspective projection model and the measured plot parameters.
[0019] In one embodiment the method includes: analysing the selected image to identify: plant locations on a ground plane; and, canopy features; and, measuring the plant height based on the plant locations, the canopy features and the perspective projection model.
[0020] In one embodiment the method includes: using the canopy features to determine a top of each plant; and, measuring the plant height by measuring along a perspective line upwardly from the plant locations to a top of each plant using the perspective projection model.
[0021] In one embodiment the method includes identifying canopy features by tracking movement between successive selected images for at least part of the plot.
[0022] In one embodiment the method includes: analysing the selected image to identify ground points; and, using the ground points to determine the ground plane.
[0023] In one embodiment the method includes determining the perspective projection model using vanishing points, including: a first vanishing point based on lines parallel to crop rows;
a second vanishing point based on lines perpendicular to crop rows; and, a third vanishing point based on lines parallel to crop stems.
[0024] In one embodiment the method includes using a perspective correction model to identify a location of the plot in the selected image.
[0025] In one embodiment the method includes: analysing the selected image to identify ground points; using the ground points to identify movement of the camera relative to the plot between different images; and, using movement of the camera to analyse selected images.
[0026] In one embodiment the method includes determining a flowering status by: extracting a saturation channel for the selected image; enhancing a contrast in the saturation channel; and, identifying flowers using the contrast enhanced saturation channel by performing a connected component analysis.
[0027] In one embodiment the method includes determining a flowering status by: extracting a saturation channel from HSB (hue, saturation and brightness) space to create a saturation channel image; inverting the saturation channel image to create an inverted image; applying a local equalisation to the inverted image to create a contrast enhanced image; and, applying a colour threshold to the contrast enhanced image to create a binary image.
[0028] In one embodiment the method includes determining a flowering status by: identifying connected components in the binary image for a plot; counting a number of large connected components in the plot, the large connected components having a number of pixels equal to or greater than 5% of pixels in the binary image; and, determining the plot is flowering if there is more than one additional large connected component detected in the plot than in a previous time period.
[0029] In one embodiment the method includes: applying colour segmentation to the selected image to identify green pixels; and, using a number of detected green pixels to determine at least one of: an emergence status; and, a canopy cover status.
[0030] In one embodiment the method includes identifying green pixels when a green value, based on the sum of the red, green and blue channels, is greater than at least one of: about 0.34; and, about 0.31.
[0031] In one embodiment the method includes determining emergence if any green pixels are identified in the plot.
[0032] In one embodiment the method includes calculating a canopy cover based on a proportion of green pixels to total pixels for the plot.
[0033] In one embodiment the method includes: applying colour segmentation to the selected image to identify pixels having a defined colour; and, using a number of identified pixels to determine at least one of: a presence of a disease or pests; and, growing conditions.
[0034] In one embodiment the method includes: determining a plant profile indicative of expected plant growth features; and, comparing monitored plant growth features to the plant profile to determine a plant status.
[0035] In one embodiment the method includes generating a notification if the monitored plant growth features fall outside expected plant growth feature ranges.
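By way of a non-limiting sketch, the profile comparison and notification of the two preceding embodiments could be implemented as follows; the ProfileRange type and the alert wording are illustrative assumptions, and delivery of the notification (for example by email or SMS) is not shown.

```python
from dataclasses import dataclass

@dataclass
class ProfileRange:
    """Hypothetical expected range for a growth feature at a given growth stage."""
    low: float
    high: float

def check_against_profile(feature_name, measured, expected):
    """Return an alert message if a monitored feature falls outside the
    expected range defined in the plant profile, otherwise None."""
    if not (expected.low <= measured <= expected.high):
        return (f"ALERT: {feature_name} of {measured} is outside the expected "
                f"range {expected.low}-{expected.high}")
    return None

# For example, if height at 30 days after emergence is expected to be 40-60 cm:
# message = check_against_profile("height (cm)", 32.5, ProfileRange(40.0, 60.0))
```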
[0036] In one embodiment the method includes acquiring the images using a client device, and transferring the images to a remote server for analysis.
[0037] In one broad form, an aspect of the present invention seeks to provide a method of monitoring plant growth features in a plot, the method including, in one or more electronic processing devices: acquiring an image of the plot captured by an imaging device that generates monoscopic images; performing a geometric analysis of image features to determine a perspective projection model from measured plot parameters; and, analysing the image using the perspective projection model to monitor plant growth features.
[0038] In one embodiment the measured plot parameters include: a number of rows and columns; and, a plot length and width.
[0039] In one embodiment the image features include at least one of: site vertices; site edges; plot vertices; plot edges; and, plot rows.
[0040] In one embodiment the method includes determining measurements of plant height from the image using the perspective projection model and the measured plot parameters.
[0041] In one embodiment the method includes: analysing the selected image to identify: plant locations on a ground plane; and, canopy features; and, measuring the plant height based on the plant locations, the canopy features and the perspective projection model.
[0042] In one embodiment the method includes: using the canopy features to determine a top of each plant; and, measuring the plant height by measuring along a perspective line upwardly from the plant locations to a top of each plant using the perspective projection model.
[0043] In one embodiment the method includes identifying canopy features by tracking movement between successive selected images for at least part of the plot.
[0044] In one embodiment the method includes: analysing the selected image to identify ground points; and, using the ground points to determine the ground plane.
[0045] In one embodiment the method includes determining the perspective projection model using vanishing points, including: a first vanishing point based on lines parallel to crop rows; a second vanishing point based on lines perpendicular to crop rows; and, a third vanishing point based on lines parallel to crop stems.
[0046] In one embodiment the method includes using a perspective correction model to identify a location of the plot in the selected image.
[0047] In one broad form, an aspect of the present invention seeks to provide a system for monitoring plant growth features in a plot, the system including one or more electronic processing devices that: on each of a plurality of different time periods: acquire a plurality of images of the plot captured by an imaging device; determine image properties of the acquired images; and, select at least one image for analysis; analyse selected images for the plurality of time periods; and, use results of the analysis to monitor plant growth features.
[0048] In one broad form, an aspect of the present invention seeks to provide a system for monitoring plant growth features in a plot, the system including one or more electronic processing devices that: acquire an image of the plot captured by an imaging device that generates monoscopic images; perform a geometric analysis of image features to determine a perspective projection model from measured plot parameters; and, analyse the image using the perspective projection model to monitor plant growth features.
[0049] It will be appreciated that the broad forms of the invention and their respective features can be used in conjunction, interchangeably and/or independently, and reference to separate broad forms is not intended to be limiting.
Brief Description of the Drawings
[0050] Various examples and embodiments of the present invention will now be described with reference to the accompanying drawings, in which: -
[0051] Figure 1A is a flow chart of an example of a method for use in monitoring plant growth features;
[0052] Figure 1B is a flow chart of a second example of a method for use in monitoring plant growth features;
[0053] Figure 2 is a schematic diagram of an example of a system for monitoring plant growth features;
[0054] Figure 3 is a schematic diagram of an example of a processing system;
[0055] Figure 4 is a schematic diagram of an example of a client device;
[0056] Figure 5A is a schematic diagram of an example of an imaging system;
[0057] Figure 5B is a schematic diagram of an example of the physical configuration of the imaging system of Figure 5A;
[0058] Figure 6 is a flow chart of a specific example of a process for acquiring images;
[0059] Figures 7A to 7H are example images of a plot of maize plants captured at different times of day;
[0060] Figures 8A to 8D are example images captured under different lighting conditions;
[0061] Figure 9 is a graph of an example of camera settings for a subset of images with consistent lighting;
[0062] Figures 10A and 10B are example images showing changes in ground plane location;
[0063] Figure 10C is a graph showing examples of changes in ground point location;
[0064] Figure 11 is a flow chart of a specific example of a process of analysing images for monitoring plant growth features;
[0065] Figure 12 is a flow chart of a specific example of a height analysis process;
[0066] Figures 13A to 13K are schematic diagrams showing steps in developing a perspective correction model;
[0067] Figures 14A to 14H are example images showing canopy motion tracking for height estimation of maize plants;
[0068] Figures 15A to 15H are example images showing canopy motion tracking for height estimation of soybean plants;
[0069] Figure 16 is an example image showing the use of the perspective correction model and canopy position in plant height estimation;
[0070] Figure 17 is a flow chart of a specific example of a flowering analysis process;
[0071] Figures 18A to 18F are example images showing steps in the flowering analysis process;
[0072] Figure 19 is a flow chart of a specific example of an emergence and canopy analysis process;
[0073] Figures 20A to 20H are example images showing steps in the emergence analysis process;
[0074] Figures 21A and 21B are example images showing steps in the canopy analysis process;
[0075] Figures 22A to 22D are example graphs showing a comparison of measured and estimated height for soybean plants;
[0076] Figures 23A to 23H are example graphs showing a comparison of measured and estimated height for maize plants;
[0077] Figures 24A to 24D are example graphs showing height curves estimated for soybean plants;
[0078] Figures 25A to 25D are example graphs showing example height curves for maize plants;
[0079] Figures 26A to 26D are example graphs showing proportion of flower pixels for a plot of maize plants;
[0080] Figures 27A to 27D are example graphs showing emergence of maize plants;
[0081] Figures 28A to 28D are example graphs showing canopy cover for maize plants;
[0082] Figures 29A to 29D are example graphs showing canopy cover for soybean plants; and
[0083] Figures 30A to 30D are example graphs showing example canopy cover for maize plants.
Detailed Description of the Preferred Embodiments
[0084] Example processes for use in monitoring plant growth features will now be described in more detail with reference to Figures 1A and 1B.
[0085] For the purpose of these examples, it is assumed that the process is performed at least in part utilising one or more electronic processing devices. The processing devices can form part of one or more processing systems, such as computer systems, and are typically in communication with at least one imaging device. In one example, the imaging device is part of an imaging system, and the electronic processing device may be integrated into, distributed between, or in communication with the one or more imaging systems.
[0086] The term "plant growth feature" is intended to refer to a feature that relates in some way to plant growth. The feature could be a feature of one or more plants, such as a plant height, emergence date, canopy cover, flowering status or the like. The plant growth feature could also be a feature that has an impact on the growth of the plant, such as environmental conditions, including the presence, absence or degree of moisture, including rainfall or fog, or the presence or absence of diseases or pests. It will therefore be appreciated that the term encompasses features that relate generally to the growth of plants, and include, but are not limited to features of plants themselves.
[0087] The term "plot" is intended to refer to an area including one or more plants. The plants can be arranged in columns and/or rows and are typically planted at a similar time and treated in a similar manner. Typically there is a minimum of one row and one column for uniform commercial crops, although it will be appreciated that the techniques could be applied to plots having any suitable configuration. Plots may form part of a larger site, such as a field, which may include a number of plots, although this is not intended to be limiting, and it will be appreciated that the techniques could be applied to individual plants, or any grouping of plants, including fields, sites, plots, or the like.
[0088] In the example of Figure 1 A, the process is performed in order to acquire images that are suitable for use in subsequent downstream analysis processes.
[0089] In this example, at step 100 a plurality of images of a plot are acquired over the course of a time period. The time period is typically one day, although this is not essential and could include shorter or longer time periods, such as a morning, or evening, a four hour window, or several days. The images of the plot are captured at different times during the time period so that the images are captured in a range of different lighting conditions. The
images could be captured at set time intervals, such as every 20 minutes, hourly or the like, although this is not essential and alternatively, could be captured at set times, or when certain criteria are met, such as when defined lighting conditions are detected, or the like.
[0090] The images are typically captured using an imaging device, such as a camera or similar, with this being performed using automated settings, so that capture of the image is optimised for the prevailing environmental conditions.
[0091] At step 105 image properties of the captured images are determined, typically by retrieving this information from metadata associated with the images, such as EXIF (Exchangeable Image File Format) metadata stored as part of the image data by the camera or other image capture device. Examples of the image properties include, but are not limited to a date and/or time of capture, a capture location, an image orientation (rotation), an imaging device aperture, a shutter speed, a focal length, a metering mode, and ISO speed information.
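By way of illustration, the EXIF properties referred to above can be read as follows using the Pillow library in Python; the tag identifiers are standard EXIF tags, but tag availability varies between imaging devices and Pillow versions, so this is a sketch rather than a definitive implementation.

```python
from PIL import Image

EXIF_IFD = 0x8769          # pointer to the Exif sub-IFD
EXPOSURE_TIME = 33434      # shutter speed, as a rational number of seconds
ISO_SPEED = 34855          # ISO speed rating (gain)
DATETIME_ORIGINAL = 36867  # capture date and time

def read_capture_settings(path):
    """Return (shutter time in ms, ISO, capture time) from EXIF metadata.
    Missing tags yield None, as tag availability varies between devices."""
    with Image.open(path) as img:
        exif = img.getexif().get_ifd(EXIF_IFD)
    exposure = exif.get(EXPOSURE_TIME)
    shutter_ms = float(exposure) * 1000.0 if exposure is not None else None
    return shutter_ms, exif.get(ISO_SPEED), exif.get(DATETIME_ORIGINAL)
```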
[0092] At step 110 the image properties are used to select an image for analysis. In particular, this process typically involves selecting one or more images having image properties that meet defined criteria, such as set shutter speeds, aperture settings, or the like. This allows image(s) to be selected which are best suited for subsequent analysis, with different images optionally being selected for different analysis processes.
[0093] The process of acquiring and selecting images is typically repeated on each of a number of different, and optionally consecutive time periods, such as on consecutive days so that sequences of images can be used in performing analysis at step 115, with results of the analysis being used to monitor plant growth features at step 120.
[0094] The nature of the analysis performed will vary depending upon the preferred implementation, but this could include any one or more of height analysis to detect a plant height, flowering analysis to determine a flowering status of the plant, emergence analysis to determine a date of emergence, and a canopy analysis in order to ascertain a current extent of canopy cover. Other types of analysis that could be performed include a disease/pest analysis to identify the presence and/or absence of any diseases/pests, a status
analysis to determine a plant status, such as an indication of plant health, or a condition analysis to determine features of growing conditions, such as levels of rainfall, sunlight, or the like.
[0095] Accordingly, this process allows images to be captured utilising automatic settings on an imaging device, which optimises the image for the current lighting conditions. Images are captured throughout a time period, such as over the course of a day, with the properties of the captured images being examined to select an image which is most appropriate for performing a particular analysis process. In this regard, different image analysis processes typically perform differently on images with different characteristics. For example canopy cover analysis might depend on detection of green colours within an image, and hence work best in bright images, and less well in dull images.
[0096] Previous attempts to address such issues have focused on the use of fixed camera settings, to ensure optimal settings are used, and capturing of images at a fixed time, in order to try and ensure reasonably consistent illumination. However, this does not take into account changes in environmental conditions that occur on a day to day basis, meaning captured images are often unsuitable, for example due to under or over exposure due to differing levels of cloud cover or sunlight, the presence of rain, or the like. Additionally, the optimal settings may vary between different imaging devices, meaning significant work is required to establish optimal settings for each imaging device.
[0097] In contrast, the current approach collects multiple images throughout a day, using automatic camera settings, in order to optimise the captured image for the prevailing ambient conditions. The image properties are then examined, allowing the most appropriate image to be selected for use in subsequent analysis, thereby maximising the chance of a suitable image being captured on any given day.
[0098] A number of further features will now be described.
[0099] In one example, the above described process allows different images to be selected for use in different analysis processes, thereby optimising the image used for each analysis, whilst avoiding the need to reconfigure camera settings between capture of different images,
other than through the use of existing automated setting processes. In one example, a first image is selected that is used to determine plant height and/or flowering status, whilst a second image is used to determine an emergence rate, a canopy cover, a presence of disease/pests or growing conditions. However, it will be appreciated that this is not essential and any combination could be used, including the use of additional images if appropriate.
[0100] In one example, the first image is an image having a shutter speed closest to about 31.25 milliseconds, less than or equal to about 31.25 milliseconds, having an ISO of between 80 and 250 and more typically an ISO of about 64. Conversely, the second image typically has a short exposure time and a low gain, and more typically has a shutter speed closest to, or less than or equal to 1.25 milliseconds with an ISO of about 50. Typically the second image is also captured closest to about midday and is preferably an image having a largest file size. However, it will be appreciated that the particular settings used will vary depending on a range of different factors, such as the characteristics of the imaging device used, aspects of the plot environment, such as available amounts of natural light, the nature of the plants being analysed, or the like. Accordingly, in one example, the optimal images can be determined by inspecting images over multiple days and times of the day to determine the image properties that most enhance the features required for image analysis. This could be performed manually, and/or could be performed using image analysis techniques, optionally used in conjunction with machine learning, allowing the optimum image properties to be automatically determined.
[0101] In the example of Figure 1B, a specific approach for performing height analysis will now be described.
[0102] In this example, at step 150 an image of a plot is acquired, typically using the process described above with respect to Figure 1A.
[0103] A geometric analysis of the image is performed at step 155. The geometric analysis is performed based on image features in the selected image, for example by identifying the location of defined points, such as vertices of the plot or site.
[0104] Measured plot parameters, such as the physical size of the plot, are determined at step 160, either based on user inputs, or based on measurements of the plot detected using a physical sensor.
[0105] This information can then be used to calculate a perspective projection model at step 165 which can correct for perspective distortions resulting from the pose of the imaging device relative to the plot. Once the perspective correction model has been determined, this can then be used in analysing the image to allow plant growth features to be determined, and hence used in monitoring plant growth features.
[0106] Whilst this can be used in a range of different analysis processes, in one particular example, this is used to determine plant height. To perform this, plant locations are identified at step 170, for example by analysing the image to identify the location of columns and rows of plants and/or boundaries of the region of interest for analysis. Canopy features are identified at step 175, typically by examining movement between successive images, in order to detect a top of the canopy on the basis that the canopy will tend to move more than other parts of the image.
[0107] Following this, at step 180, the plant locations, canopy features and perspective projection model are used together with the measured plot parameters to calculate a plant height. In particular this is achieved by measuring a distance vertically between the canopy and the base of each plant, with the vertical direction being ascertained using the perspective correction model. The resulting distance can then be scaled, using the measured plot parameters, enabling an actual plant height to be determined.
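A minimal sketch of the motion-based canopy detection and height scaling described above is set out below; it assumes successive daily images from a static camera, detects the canopy as the topmost moving pixels, and uses a simple per-column scale factor, whereas the full approach measures along perspective lines using the perspective correction model.

```python
import cv2
import numpy as np

def canopy_top_rows(image_today, image_previous, motion_threshold=25):
    """Detect the canopy as the regions that move between successive daily
    images and return the topmost moving pixel row for each image column."""
    grey_a = cv2.cvtColor(image_today, cv2.COLOR_BGR2GRAY)
    grey_b = cv2.cvtColor(image_previous, cv2.COLOR_BGR2GRAY)
    moving = cv2.absdiff(grey_a, grey_b) > motion_threshold
    tops = np.full(moving.shape[1], -1, dtype=int)
    for col in range(moving.shape[1]):
        rows = np.flatnonzero(moving[:, col])
        if rows.size:
            tops[col] = rows[0]               # topmost moving pixel in this column
    return tops

def plant_height_cm(ground_row, canopy_row, cm_per_pixel):
    """Scale the pixel distance from the plant's ground location up to the
    canopy top into a physical height; in the full approach the scale is
    derived per plant from the perspective projection model and the
    measured plot dimensions, since it varies with depth in the image."""
    return (ground_row - canopy_row) * cm_per_pixel
```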
[0108] Accordingly, the above described approach provides a mechanism for deriving a perspective correction model, which can then be used in analysing plant growth features, for example allowing plant height to be measured by analysing the image to make appropriate measurements of plant height.
[0109] In this regard, analysing images, for example to measure plant height directly from images, is generally problematic due to issues in scaling and accurate detection of plant extent. Previous attempts to address this issue have included the use of specific hardware
such as stereoscopic cameras, or other range finding devices, in order to detect the relative location of the camera and plant, with this being used in height detection. However, the need for more specialised and expensive equipment makes this unsuitable for many scenarios. Alternative solutions have involved the use of height scales positioned proximate to the plants. However, this requires an initial set-up stage and accurate imaging of the scales can be problematic.
[0110] In contrast, the current approach of using a perspective correction model derived from image features allows features such as the height, as well as other features such as the location and extent of the plot within the image to be calculated from a single monoscopic image, captured from any arbitrary orientation relative to the plot, enabling the height to be reliably detected using a basic hardware configuration, in a wide variety of situations, making this particularly suitable for wide scale deployment.
[0111] A number of further features will now be described.
[0112] In one example, the height can be measured by projecting upwardly from a plant location on a ground plane to a canopy feature, using the perspective correction model, with the measured distance being utilised in order to calculate plant height. Thus the perspective projection model can be used to project upwardly from the plant location on a ground plane to the top of the canopy using this to determine the height of each plant.
[0113] In one example, the method includes analysing a selected image to identify ground points and then using the ground points to determine the ground plane, with this being used to determine the location of each plant on the ground plane.
[0114] The perspective projection model can be calculated using any image features, and could for example be based on set markers, such as flags positioned in the plot and hence present within the image. In one preferred example however this is performed on the basis of site vertices, site edges, plot vertices, plot edges and/or plot rows as these will typically be present in each image of a plot and also be readily identifiable, for example using image analysis techniques, such as edge detection or the like.
[0115] The method for determining the perspective projection model includes calculating vanishing points including a first vanishing point based on lines parallel to crop rows, a second vanishing point based on lines perpendicular to crop rows and a third vanishing point based on the lines parallel to crop stems. The vanishing points are then utilised in order to allow for perspective within the image to be corrected, for example, allowing measurements to be made upwardly from a plant location to the top of the canopy and to accurately measure a top of the canopy.
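By way of illustration, each vanishing point can be computed as the intersection of the image projections of two parallel scene lines using homogeneous coordinates; the point coordinates and line names below are illustrative.

```python
import numpy as np

def line_through(p, q):
    """Homogeneous representation of the image line through points p and q."""
    return np.cross([p[0], p[1], 1.0], [q[0], q[1], 1.0])

def vanishing_point(line_a, line_b):
    """Intersection of two homogeneous lines; for the image projections of
    parallel scene lines this intersection is their vanishing point.
    (If the image lines are themselves parallel, v[2] approaches zero and
    the vanishing point lies at infinity.)"""
    v = np.cross(line_a, line_b)
    return v[:2] / v[2]

# For example, the third vanishing point from two plant stem edges:
# stem_a = line_through((120.0, 540.0), (118.0, 210.0))  # illustrative coordinates
# stem_b = line_through((860.0, 560.0), (872.0, 225.0))
# v3 = vanishing_point(stem_a, stem_b)
```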
[0116] The above described methods may also involve analysing a selected image to identify ground points and using ground points to identify movement of the camera relative to the plot between different images. This can be used to correct for relative movement between the camera and the plot, for example in case the camera is moved between successive images. For example, this can allow a single perspective projection model to be calculated, with this being corrected on a daily basis based on the position of the camera relative to the plot.
[0117] The process for determining flowering status is typically performed based on an analysis of colour in a captured image. Specifically, in one example, this involves extracting a saturation channel for the selected image, enhancing a contrast in the saturation channel and then identifying flowers using the contrast enhanced saturation channel by performing a connected component analysis.
[0118] In particular, this typically involves extracting a saturation channel from HSB (hue, saturation and brightness) space to create a saturation channel image. The saturation channel image is then inverted to create an inverted image with local equalisation being applied to the inverted image to create a contrast enhanced image. A colour threshold can then be applied to the contrast enhanced image to create a binary image with this then being analysed to perform the connected component analysis. The use of a contrast enhanced saturation channel can more accurately identify regions having a colour corresponding to that of the expected flower colour, as well as to more easily distinguish from noise and other parts of the plant.
[0119] The method of performing the connected component analysis typically involves identifying connected components in the binary image, counting the number of large connected components which comprise 5% or more of the pixels in the binary image and determining that the plot is flowering if there is more than one additional large connected component detected in the plot than on a previous day. Thus, this aims to identify rapid increases in large connected components, which will in turn correspond to a flowering event for the plants in the plot.
[0120] The method can also include analysing a selected image to identify green pixels, and then using a number of detected green pixels to determine an emergence or canopy cover status. For the RGB colour sensor of the Sony Xperia Z2, green pixels are identified when the green value, as a proportion of the red, green and blue channels, is greater than about 0.34, with emergence being detected if any green pixels are identified in a plot and with the canopy cover being based on a proportion of green pixels to total pixels for the plot. This green pixel detection threshold would vary for image sensors that have different in-built colour filtering; for example, for the Britecell image sensor of the Samsung Galaxy S7, a more appropriate value is about 0.31.
[0121] A similar approach can also be used in performing disease and/or pest detection. In this example, the disease and/or pest can be determined by identifying other defined colours, in which case the process typically involves applying colour segmentation to the selected image to identify pixels having a defined colour, and then using a number of identified pixels to determine a presence of a disease or pests. In this instance, it will be appreciated that different colours might be used to identify different pests or diseases. Additionally, a reduction in amounts of green pixels can also be used to identify the presence of pests and/or diseases, either through obstruction of or consumption of plant material.
[0122] Similarly, analysing growing conditions can be performed in a similar manner, for example by examining soil colouring to determine an extent of soil moisture, identifying rain, fog, sunlight levels or the like.
[0123] In a further example, a plant profile is determined which is indicative of expected plant growth features. This can define for a given plant, the features that would be expected at a particular stage of growth, for example defining how plant height and canopy cover should vary over time after emergence. Monitored plant growth features can then be compared to the expected plant growth features defined in the plant profile to determine a
plant status. For example, this can be used to identify circumstances in which plant growth is below expectations, which could in turn be indicative of an issue, such as poor growing conditions, insufficient fertiliser application or the like. In situations in which plant growth features fall outside expected plant growth feature ranges, a notification can be generated, for example to alert a crop manager, or farmer, that intervention might be required.
[0124] Accordingly, the above described techniques allow images of plants to be analysed to monitor characteristics of the plants, including the plant height, emergence date, canopy cover and flowering status. In one particular example, the images can be captured using an imaging device, for example forming part of a client device, or custom imaging system, with the images optionally being transferred to a remote server for analysis.
[0125] An example of a suitable hardware configuration will now be described with reference to Figures 2 to 5.
[0126] In this example, one or more processing systems 210 are provided coupled to one or more client devices 220 and one or more imaging systems 230, via one or more communications networks 240, such as the Internet, and/or a number of local area networks (LANs).
[0127] Any number of processing systems 210, client devices 220 and imaging systems 230 could be provided, and the current representation is for the purpose of illustration only. The configuration of the networks 240 is also for the purpose of example only, and in practice the processing systems 210, client devices 220 and imaging systems 230 can communicate via any appropriate mechanism, such as via wired or wireless connections, including, but not limited to mobile networks, private networks, such as 802.11 networks, the Internet, LANs, WANs, or the like, as well as via direct or point-to-point connections, such as Bluetooth, or the like. Furthermore, whilst the processing systems 210, client devices 220 and imaging systems 230 are identified as different entities, as will be appreciated from the following, there can be overlap in capabilities, so that for example a client device 220 could also function as an imaging device 230, and hence the differentiation between devices is intended to be for the purpose of illustration and is not intended to be limiting.
[0128] In this example, the processing systems 210 are adapted to receive and analyse images captured by the imaging systems 230, and provide access to resulting analysis via the client devices 220. Whilst the processing systems 210 are shown as single entities, it will be appreciated they could include a number of processing systems distributed over a number of geographically separate locations, for example as part of a cloud based environment. Thus, the above described arrangements are not essential and other suitable configurations could be used.
[0129] An example of a suitable processing system 210 is shown in Figure 3. In this example, the processing system 210 includes at least one microprocessor 300, a memory 301, an optional input/output device 302, such as a keyboard and/or display, and an external interface 303, interconnected via a bus 304 as shown. In this example the external interface 303 can be utilised for connecting the processing system 210 to peripheral devices, such as the communications networks 240, databases, other storage devices, or the like. Although a single external interface 303 is shown, this is for the purpose of example only, and in practice multiple interfaces using various methods (e.g. Ethernet, serial, USB, wireless or the like) may be provided.
[0130] In use, the microprocessor 300 executes instructions in the form of applications software stored in the memory 301 to allow the required processes to be performed. The applications software may include one or more software modules, and may be executed in a suitable execution environment, such as an operating system environment, or the like.
[0131] Accordingly, it will be appreciated that the processing systems 210 may be formed from any suitable processing system, such as a suitably programmed PC, web server, network server, or the like. In one particular example, the processing system 210 is a standard processing system such as an Intel Architecture based processing system, which executes software applications stored on non-volatile (e.g., hard disk) storage, although this is not essential. However, it will also be understood that the processing system could be any electronic processing device such as a microprocessor, microchip processor, logic gate configuration, firmware optionally associated with implementing logic such as an FPGA (Field Programmable Gate Array), or any other electronic device, system or arrangement.
[0132] As shown in Figure 4, in one example, the client device 220 includes at least one microprocessor 400, a memory 401, an input/output device 402, such as a keyboard and/or display and an external interface 403, interconnected via a bus 404 as shown. In this example the external interface 403 can be utilised for connecting the client device 220 to peripheral devices, such as the communications networks 240, databases, other storage devices, or the like. Although a single external interface 403 is shown, this is for the purpose of example only, and in practice multiple interfaces using various methods (e.g. Ethernet, serial, USB, wireless or the like) may be provided. The client device 220 may also optionally include other interfaces, including an imaging device, such as a camera or the like.
[0133] In use, the microprocessor 400 executes instructions in the form of applications software stored in the memory 401 to allow communication with one of the processing systems 210.
[0134] Accordingly, it will be appreciated that the client device 220 may be formed from any suitably programmed processing system and could include suitably programmed PCs, Internet terminals, lap-tops, hand-held PCs, tablets, smart phones, or the like. However, it will also be understood that the client device 220 can be any electronic processing device such as a microprocessor, microchip processor, logic gate configuration, firmware optionally associated with implementing logic such as an FPGA (Field Programmable Gate Array), or any other electronic device, system or arrangement.
[0135] An example of an imaging system will now be described in more detail with reference to Figures 5A and 5B.
[0136] In this example, the imaging system includes at least one microprocessor 500, a memory 501, an optional input/output device 502, such as a keyboard and/or display, and an external interface 503, interconnected via a bus 504 as shown. In this example the external interface 503 can be utilised for connecting the imaging system 230 to peripheral devices, such as the communications networks 240, databases, other storage devices, or the like. Although a single external interface 503 is shown, this is for the purpose of example only, and in practice multiple interfaces using various methods (e.g. Ethernet, serial, USB, wireless
or the like) may be provided. The imaging system 230 also includes an imaging device 505, such as a camera or the like.
[0137] In use, the microprocessor 500 executes instructions in the form of applications software stored in the memory 501 to allow the required processes to be performed. The applications software may include one or more software modules, and may be executed in a suitable execution environment, such as an operating system environment, or the like.
[0138] Accordingly, it will be appreciated that the imaging systems 230 may be formed from any suitable imaging system, such as network enabled cameras, smartphones or the like. However, it will also be understood that the processing system could be any electronic processing device such as a microprocessor, microchip processor, logic gate configuration, firmware optionally associated with implementing logic such as an FPGA (Field Programmable Gate Array), or any other electronic device, system or arrangement.
[0139] A range of different physical configurations can be used for the imaging system. In one example, as shown in Figure 5B, the imaging device 505 can be provided in a housing 531, which is supported by a tower 532, extending upwardly from a base 533. The tower 532 may be connected to the base 533 via a drive unit 534 allowing an orientation of the housing 531 and hence a camera 505 to be adjusted, for example allowing a single imaging device to be used to capture images of multiple different plots.
[0140] For the purpose of the remaining description it is assumed that the processing system 210 is a server that operates to analyse images, with actions performed by the processing system 210 being performed by the processor 300 in accordance with instructions stored as applications software in the memory 301. It is further assumed that the client device 220 is a user device to allow user interaction with the system, for example to review the results of image analysis, with actions performed by the client device 220 being performed by the processor 400 in accordance with instructions stored as applications software in the memory 401 and/or input commands received from a user via the I/O device 402. Finally, it is assumed that the imaging system is a smartphone, that captures and uploads images to the server 210, with actions performed by the imaging system 230 being performed by the
processor 500 in accordance with instructions stored as applications software in the memory 501.
[0141] The subsequent description will further make reference to trials performed at a trial site. For the purpose of the trial, the imaging system consisted of a smartphone and solar power system, supported by a raised platform, with this being used for sub-daily image capture of the whole trial site throughout the crop season. The camera tower could provide a large data set of images at different times of day and provides a suitable dataset to enable development of a robust image analysis algorithm. Whilst other platforms, such as UAVs (Unmanned Aerial Vehicles) or ground vehicles, could be used, this typically requires human oversight, and hence is less desirable.
[0142] A smartphone was selected as the imaging system as it integrates a GPS, processor, camera and internet connection. Sony Xperia Z2 and Samsung Galaxy S7 smartphones were selected. An App was developed to collect an image every 20 minutes and upload it to a specified server. The mobile application was developed for Android phones, and written in Java and Android application framework.
[0143] An example of the process for acquiring images will now be described in more detail with reference to Figure 6.
[0144] In this example, at step 600 the imaging system acquires an image from the camera 505 with the image being uploaded to server 210 at step 605. The server then operates to save the image at step 610 with this process being repeated periodically throughout the day. Any number of appropriate images can be captured but in one example images are captured every 20, 30 or 60 minutes during daylight hours, although other times could be used.
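By way of a non-limiting sketch, the server-side saving of uploaded images (steps 605 and 610) could be implemented as a simple HTTP endpoint; the route, storage layout and use of the Flask framework here are illustrative assumptions rather than the actual server implementation.

```python
from datetime import datetime
from pathlib import Path
from flask import Flask, request

app = Flask(__name__)
IMAGE_ROOT = Path("/data/plot_images")  # hypothetical storage location

@app.route("/upload/<plot_id>", methods=["POST"])
def upload(plot_id):
    """Save an uploaded image under its plot, named by arrival time
    (steps 605 and 610); capture settings are read later from EXIF."""
    image = request.files["image"]
    target = IMAGE_ROOT / plot_id / f"{datetime.now():%Y%m%d_%H%M%S}.jpg"
    target.parent.mkdir(parents=True, exist_ok=True)
    image.save(str(target))
    return "", 204
```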
[0145] Examples of different images captured at different times of the day are shown in Figures 7A to 7H, with these corresponding to images captured every two hours between 6:15 am and 7:00 pm. A number of observations are apparent from the images.
[0146] For example, leaves appear yellow early in the day due to forward scattering of light (i.e. sun in front of camera), as shown in Figures 7A to 7D. The colours in the image shown in Figure 7E, captured at 1:35 pm, are less saturated than at other times due to the camera
performing automatic adjustments of exposure, gain and/or white balance to capture balanced images for the varying natural lighting. The flowers in the image captured at 5:35 pm shown in Figure 7G are brighter than the leaves and appear yellow. The image captured at 7:00 pm shown in Figure 7H has reduced sharpness because of low light. The colour is most intense in full sunlight around midday, shown in Figure 7D, which assists in enabling robust identification of canopy cover against a soil or stubble background.
[0147] Based on these observations, the optimal time for flower segmentation appears to be late afternoon around sunset, when there is back scattering of light (sun behind camera). Similarly, the optimal time for canopy segmentation was around midday.
[0148] At step 615, the server 210 retrieves all of the images for an entire day for a respective plot and examines the image properties at step 620, with the image properties being compared to predetermined image property criteria to select respective first and second images at step 625.
[0149] Specifically, in one example this involves inspecting the EXIF (Exchangeable Image File Format) settings stored in the file properties of photos, which include common settings (e.g. exposure and gain). This allows the camera to be set up to capture images using automatic settings for exposure and gain. Automatic settings allow the camera to decide the optimal settings for aesthetically pleasing brightness and contrast for the given lighting situation.
[0150] For height and flower detection, images are manually inspected and EXIF settings are obtained for images that clearly show features with shadows. For height detection this was found to be in lighting conditions of skylight (not direct sunlight, just ambient lighting in the sky), where the shutter speed was 31.25 ms and ISO (gain) was between 80 and 250. High ISO is undesirable because this is high gain and leads to noisy images. For canopy detection, images are manually inspected and EXIF settings are obtained for images that enhance the green pixels of the plants. This was found to be images that were not noisy and were bright, where the shutter speed was less than 1/800 s (1.25 ms) and the ISO speed was 50. These parameters would change for different camera image sensors. For example, a camera with a smaller image sensor would absorb less light, therefore longer shutter speeds may be required for brighter images.
[0151] Specifically, a first image for performing height and flowering analysis is typically selected based on the image closest to the following criteria:
• Shutter speed closest to 31.25 ms
• Shutter speed less than or equal to 31.25 ms
• ISO closest to 64
[0152] Specifically, a second image for performing emergence and canopy cover is typically selected based on the image closest to the following criteria:
• Shutter speed closest to 1.25 ms
• Shutter speed less than or equal to 1.25 ms
• ISO closest to 50
• Captured closest to 12 pm
• Largest file size
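By way of illustration, the two selection criteria above could be implemented as follows; the sketch assumes each image is represented by a dictionary of properties (shutter speed in milliseconds, ISO, capture hour and file size) already parsed from EXIF metadata, and the exact tie-breaking order is one interpretation of the listed criteria.

```python
def select_first_image(images):
    """First image (height and flowering): prefer shutter speeds at or
    below 31.25 ms, closest to 31.25 ms, then ISO closest to 64."""
    candidates = [i for i in images if i["shutter_ms"] <= 31.25] or images
    return min(candidates,
               key=lambda i: (abs(i["shutter_ms"] - 31.25), abs(i["iso"] - 64)))

def select_second_image(images):
    """Second image (emergence and canopy cover): shutter at or below
    1.25 ms and closest to it, ISO closest to 50, captured closest to
    12 pm, with the largest file size as the final tie-breaker."""
    candidates = [i for i in images if i["shutter_ms"] <= 1.25] or images
    return min(candidates,
               key=lambda i: (abs(i["shutter_ms"] - 1.25),
                              abs(i["iso"] - 50),
                              abs(i["hour"] - 12.0),
                              -i["file_size"]))
```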
[0153] Since the camera settings and appearance of individual images are being used to govern which images are analysed, the image analysis does not rely on image capture at a specified hard-coded time of day.
[0154] Figures 8A to 8D show a subset of images automatically selected using this process, which show a consistency in appearance, despite being captured at different times with different settings. Additionally, the graph of Figure 9 shows shutter speed and ISO settings for images captured under a range of lighting situations and demonstrates that there is a set of exposure and gain parameter values that results in selection of images with consistent appearance for image analysis.
[0155] At step 630, ground points within the image are identified, with this being used to perform movement correction at step 635. This enables the ground plane to be located if the camera moves because of wind or maintenance, or if the camera is moved in order to image different plots. This also enables the ground plane to be determined from imagery collected using ground or aerial vehicles. Figures 10A to 10C show the change in the coordinates for one of the ground plane points 1000 in daily images. This demonstrates that a tracking algorithm is required to update the ground plane in the three-point perspective. The selection of control points could be implemented automatically using flags in the field.
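The patent does not prescribe a particular ground-point tracking algorithm; as one possible approach, the sketch below uses normalised cross-correlation template matching to re-locate a ground control point between daily images. The patch and search-window sizes are illustrative assumptions, and the point is assumed to lie well inside the image.

```python
# Hypothetical sketch: re-locate a ground control point between daily images
# using normalised cross-correlation (the patent does not mandate this method).
import cv2

def track_ground_point(prev_img, curr_img, point, patch=25, search=60):
    """Re-locate `point` (x, y) from prev_img within curr_img.

    A (2*patch+1)^2 template around the old location is matched inside a
    (2*search+1)^2 window, and the best-correlation position is returned.
    """
    x, y = point
    template = prev_img[y - patch:y + patch + 1, x - patch:x + patch + 1]
    window = curr_img[y - search:y + search + 1, x - search:x + search + 1]
    scores = cv2.matchTemplate(window, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, (dx, dy) = cv2.minMaxLoc(scores)  # maxLoc = best match top-left
    # Convert the match position back to full-image coordinates of the centre.
    return (x - search + dx + patch, y - search + dy + patch)
```

The daily offsets obtained this way could then be used to update the ground plane points before the perspective model is applied.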
[0156] In the above described example, each image is uploaded to the server 210. However, it will be appreciated that this is not essential, and alternatively images could be stored locally by the imaging system 230, and then analysed using the above described process, with only selected images being uploaded to the server 210 as required.
[0157] Once the selected images for a given day have been determined, these can be analysed using one or more different analysis techniques, and an example of the overall approach for analysing images will now be described with reference to Figure 11.
[0158] In this regard, the process is typically performed sequentially, with emergence and canopy cover analysis being performed initially to detect the early stages of growth, and the process only proceeding to performing height and/or flowering analysis in the event that the plants are sufficiently developed. However, this is not essential and it will be appreciated that alternatively each of the analysis processes could be performed in parallel.
[0159] Assuming sequential analysis is performed, then a selected second image for the current day is initially retrieved at step 1100. At step 1105 it is determined if emergence has previously been recorded and if not the process moves on to step 1110 to perform emergence analysis. At step 1115 it is determined if emergence is detected and if not the process returns to step 1100 allowing an image analysis to be performed on a subsequent day. Otherwise, if emergence is detected the emergence date is recorded at step 1120.
[0160] Assuming emergence has been detected, the second image is analysed to perform a canopy analysis at step 1125, with this being used to determine a level of canopy cover, which is then recorded. This can optionally be compared to a threshold to determine if the canopy cover has reached a certain level. If not, this suggests that the plants are at an early growth stage and hence additional analysis may not be required, allowing the process to return to step 1100 to analyse an image on a subsequent day.
[0161] Otherwise, assuming the canopy is sufficiently developed, if it is otherwise decided that height and flowering analysis are to be performed, then a first image for the day is retrieved at step 1135. A height analysis is then performed at step 1140 with the resulting height being stored and optionally compared to a threshold at step 1145. This is performed to
determine if the plants are sufficiently developed for flowering to potentially occur. If not, the process returns to step 1100 otherwise a flowering analysis is performed at step 1150, with results being recorded.
[0162] An example height detection analysis will now be described with reference to Figure 12.
[0163] In this example at step 1200 a first image is selected with this being used to identify certain image features at step 1205. The image features typically correspond to vertices of the plots, and examples of these are shown as A, B, C, D, E and F in Figure 13A. This can be performed using image analysis techniques. However, as this typically only needs to be completed a single time for each plot, for each camera and position, and can be corrected based on tracked relative movement between the camera and plot, this can alternatively be performed manually.
[0164] At step 1210 a perspective projection model is calculated using a perspective transformation algorithm, which is required to identify the location of plots in a perspective image for the image analysis algorithm. In this regard, a three point perspective model is required with: (i) all lines parallel to crop rows leading to the first vanishing point; (ii) all lines perpendicular to crop rows leading to the second vanishing point; and (iii) all lines parallel to crop stems leading to the third vanishing point.
[0165] In broad terms the perspective algorithm involves obtaining parameters of the plot as observed in the camera image, including a number of rows and columns (a minimum of one each), and a length and width of each plot in metres. From the image, features are analysed in order to establish the perspective vanishing lines on the ground plane of the camera image. From this, the perspective projection model is calculated, with this being used to project the trial layout grid onto the ground plane of the camera image and then calculate the detected height of each plot from pixels within the top of each plot where flowers would be located.
[0166] Once the corner points of the plot have been determined, these can then be used to calculate a first vanishing point. In one example, this is achieved by plotting lines 1311, 1312 through points A-B and C-D, as shown in Figure 13B, and calculating m and c in the equation:

y = m*x + c

where m = (yA - yB)/(xA - xB) and c = -1*m*xB + yB.
[0167] A first vanishing point 1301 is where these lines intersect on the ground plane, and is calculated using the same approach.
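Equivalently, the line fitting and intersection of paragraphs [0166] and [0167] can be computed in homogeneous coordinates, where the cross product of two points gives the line through them, and the cross product of two lines gives their intersection. The sketch below assumes the corner points A, B, C and D have already been identified; the coordinate values shown are placeholders.

```python
# Sketch of the vanishing-point construction using homogeneous coordinates
# (mathematically equivalent to computing m and c explicitly as above).
import numpy as np

def line_through(p, q):
    """Homogeneous line through two image points (x, y)."""
    return np.cross([p[0], p[1], 1.0], [q[0], q[1], 1.0])

def intersection(l1, l2):
    """Intersection of two homogeneous lines as an (x, y) point."""
    x, y, w = np.cross(l1, l2)
    # w == 0 would mean the lines are parallel (no finite vanishing point).
    return x / w, y / w

# A, B, C, D are the plot corner points identified in Figure 13A (example values).
A, B, C, D = (120, 400), (300, 380), (310, 300), (130, 310)
vp1 = intersection(line_through(A, B), line_through(C, D))  # first vanishing point
vp2 = intersection(line_through(A, D), line_through(B, C))  # second vanishing point
```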
[0168] Next, lines 1313, 1314 are fitted through points A-D and B-C, with a second vanishing point 1302 being identified where these lines intersect, as shown in Figure 13C. A line 1315 is provided between the first and second vanishing points, as shown by the dotted line in Figure 13D, with this representing a horizon line.
[0169] The horizon line is then offset so it passes through the line 1313 to form a measure line 1316, extending from an intersection between the measure line 1316 and line 1311 to point D. The measure line 1316 is then sub-divided and extended to the first vanishing point to get horizontal plot spacing, as shown by the lines 1317 in Figure 13E.
[0170] Next an intersection between the measure line 1316 and a projected B-C line 1314 is found, with the measure line being sub-divided into a number of vertical plots (columns), as shown in Figure 13E. Lines are then extended from each sub-division on the measure line 1316 to the second vanishing point 1302, with the intersections of these lines and the lines 1317 forming the coordinates of the plots on the ground plane.
[0171] Lines 1318, 1319 are fitted through the points A-E and C-F, with an intersection between the lines 1318, 1319 corresponding to a third vanishing point 1303, shown in Figure 13G.
[0172] Lines 1320 at specified heights are projected from the third vanishing point 1303 to horizontal plot marker lines on the line 1313, with these then being projected to the first vanishing point 1301 as lines 1321. Lines 1322 are then projected from the third vanishing point 1303 to each plot along the left hand side of the site.
[0173] The line 1322 passing through point A is projected to the first vanishing point 1301, to form line 1323 shown in Figure 13J, with the point of intersection of lines 1322 and 1323 being projected to the second vanishing point to form lines 1324, as shown in Figure 13K.
[0174] Intersections between lines 1324 and lines 1321 to the first vanishing point form the coordinates of the plots at a specified height.
[0175] At step 1215, plant locations are identified with a current image being compared to previous images at step 1220 to identify canopy movement at step 1225.
[0176] In particular, the tracking algorithm tracks visual canopy features in the centre of each plot daily from emergence. Figures 14A to 14H and 15A to 15H demonstrate the motion tracking for maize and soybean weekly from emergence. In one specific example motion tracking is achieved using the OpenCV TrackerKCF framework, implemented with additional constraints so that the tracked object only moves upward and remains within the plot boundaries.
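A sketch of such constrained tracking is shown below, assuming an OpenCV build that provides `cv2.TrackerKCF_create`; the clamping logic for the upward-only and plot-boundary constraints is an illustrative interpretation rather than the exact implementation used in the trial.

```python
# Sketch of KCF tracking with the constraints described in paragraph [0176]:
# the tracked box may only move upward and must stay within the plot boundaries.
import cv2

def make_constrained_tracker(first_frame, init_box, plot_left, plot_right):
    """Return an update(frame) function tracking init_box = (x, y, w, h)."""
    tracker = cv2.TrackerKCF_create()
    tracker.init(first_frame, init_box)
    state = {"box": init_box}

    def update(frame):
        ok, (x, y, w, h) = tracker.update(frame)
        if not ok:
            return state["box"]                     # keep the last good estimate
        px, py, pw, ph = state["box"]
        y = min(y, py)                              # image y decreases upward, so
                                                    # the canopy may only move up
        x = max(plot_left, min(x, plot_right - w))  # stay within plot boundaries
        state["box"] = (int(x), int(y), int(w), int(h))
        return state["box"]

    return update
```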
[0177] This is used to identify the top of the canopy at step 1230, allowing canopy height to be determined at step 1235. In particular, the distance moved by the tracked box is related to actual height using a perspective transformation, by projecting the boundary of the tracked plot to the second vanishing point, with the detected location of the object being projected to the ground plane along lines to the third vanishing point, as shown in Figure 16. This identifies the top of each plant, so that the geometry of the scene perspective converts the tracked movement in pixels to a distance for height estimation.
[0178] An example of a process for determining flowering status will now be described with reference to Figure 17. This process is performed on first images, examples of which are shown in Figures 18A and 18B, and uses colour detection, with pre-processing being performed in order to enhance contrast between crop leaves and the flowers, which tend to look yellow in the captured images.
[0179] In this example, a saturation channel of HSB space is extracted at step 1700, with the saturation channel being inverted at step 1705, before contrast enhancement is performed at
step 1710 by applying a local equalisation (200 x 200) to the image. Examples of the resulting contrast enhanced images are shown in Figures 18C and 18D.
[0180] At step 1715 a colour threshold is then applied to the image to convert it to black and white, with pixels having a value less than 200 becoming black and pixels with a value greater than 200 becoming white, with example resulting images being shown in Figures 18E and 18F.
[0181] At step 1720 connected components are identified in the image, using known approaches from the art, with the number of connected components having a number of pixels equal to or greater than 5% of the pixels in the image being counted at step 1725. The count is compared to the count from the previous day at step 1730, with the flowering status being determined based on the results of the comparison at step 1735. Specifically, the plot is determined to be flowering if there is more than one additional large connected component detected in each plot compared to the previous day.
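The following sketch outlines this pipeline (steps 1700 to 1735) in Python with OpenCV; HSB is the same colour space as OpenCV's HSV. CLAHE with roughly 200 × 200 pixel tiles is used here to approximate the local equalisation, which is an assumption — the exact equalisation method used in the trial may differ.

```python
# Sketch of the flowering pipeline (steps 1700-1735). CLAHE approximates the
# 200 x 200 local equalisation; the clip limit is an illustrative assumption.
import cv2

def count_large_components(bgr_plot):
    """Return the number of large bright blobs (candidate flower regions)."""
    sat = cv2.cvtColor(bgr_plot, cv2.COLOR_BGR2HSV)[:, :, 1]  # saturation (step 1700)
    inv = 255 - sat                                           # invert (step 1705)
    h, w = inv.shape
    clahe = cv2.createCLAHE(clipLimit=2.0,
                            tileGridSize=(max(1, w // 200), max(1, h // 200)))
    enhanced = clahe.apply(inv)                               # contrast enhancement
    _, binary = cv2.threshold(enhanced, 200, 255, cv2.THRESH_BINARY)  # step 1715
    n, _, stats, _ = cv2.connectedComponentsWithStats(binary)        # step 1720
    min_area = 0.05 * h * w                                   # >= 5% of plot pixels
    # Label 0 is the background, so skip it when counting large components.
    return sum(1 for i in range(1, n) if stats[i, cv2.CC_STAT_AREA] >= min_area)

def is_flowering(count_today, count_yesterday):
    """Flag flowering if more than one extra large component appears (step 1735)."""
    return count_today - count_yesterday > 1
```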
[0182] An example of a process for detecting emergence and calculating canopy cover using colour segmentation will now be described with reference to Figure 19.
[0183] In this example, at step 1900 a second image is selected which is not noisy (low gain) and is bright, to emphasise the green colour of plants (short exposure). Colour segmentation is applied to the second image at step 1905, with this being used to identify green pixels at step 1910. Specifically, for the Sony Xperia Z2 this is achieved by identifying pixels for which the green value, based on a proportion of the sum of the red, green and blue channels, is greater than 0.34, although other values might be used for other imaging devices. For the Samsung Galaxy S7, the threshold was 0.31 instead of 0.34.
[0184] At step 1915 it is determined if these are the first green pixels, and if so, an emergence date is determined at step 1920. Otherwise the green pixels are used to calculate a canopy cover at step 1925, based on a proportion of detected green pixels to total pixels in the plot.
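A minimal sketch of this colour segmentation is shown below, using the 0.34 Sony Xperia Z2 threshold quoted above. The helper names are illustrative, and the image is assumed to be an RGB array (e.g. as loaded by PIL); images loaded with OpenCV would need the channel order swapped.

```python
# Sketch of the colour segmentation in paragraphs [0183]-[0184]: a pixel is
# "green" when G / (R + G + B) exceeds the device-specific threshold.
import numpy as np

def green_mask(rgb, threshold=0.34):
    """Boolean mask of pixels whose green share of R+G+B exceeds the threshold."""
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=2)
    total[total == 0] = 1.0          # avoid division by zero on black pixels
    return rgb[:, :, 1] / total > threshold

def has_emerged(rgb_plot, threshold=0.34):
    """Emergence is recorded on the first day any green pixels are detected."""
    return green_mask(rgb_plot, threshold).any()

def canopy_cover(rgb_plot, threshold=0.34):
    """Canopy cover as the proportion of green pixels to total plot pixels."""
    return green_mask(rgb_plot, threshold).mean()
```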
[0185] Example images illustrating progression of plant emergence over six days for a soybean crop are shown in Figures 20A to 20H, with Figures 20A, 20C, 20E, 20G showing original second images and Figures 20B, 20D, 20F, 20H showing processed images.
Similarly, original and processed images for use in calculating canopy cover are shown in Figures 21A and 21B respectively.
[0186] As previously mentioned, a trial was performed using maize and soybean planted at 0.5 m row spacing with plots 2 metres wide and 10 metres long. Three trial sites were planted, including: a maize variety trial with four replicates of four varieties, surrounded by a buffer crop; a soybean variety trial with four replicates of four varieties, surrounded by a buffer crop; and a uniform maize trial for comparison with the variety trial.
[0187] Seven camera towers were installed to monitor the trial sites, with six of the cameras on 6 m tall towers and the seventh on a 10 m tall tower at the maize variety trial, the taller tower providing a larger field of view and lower resolution per plot than the 6 m towers.
[0188] Results are presented for the height, flowering, emergence rate and canopy cover detected. These results are presented for individual varieties of maize and soybean and also according to bay number, where bays 1, 2, 3 and 4 are 20 m, 30 m, 40 m, and 50 m from the camera tower, respectively. This can be used to determine how the accuracy of the image analysis varies as the crop is located further from the camera.
Height
[0189] Figures 22A to 22D and 23A to 23H compare height from the object tracking algorithm with ground truthing measurements for soybean and maize, for each of the four bays, respectively. These show a linear relationship between the measured and detected height. The 26 December maize height is underestimated for all bays because the manual measurements of the maize included the flower stalk, while the image analysis tracks the leaves throughout the season as the plants grow, rather than the flower stalk. Figures 24A to 24D and 25A to 25D show height curves for each variety of soybean and maize respectively.
[0190] Tables 1, 2A and 2B compare the error in maize and soybean height for each camera tower and distance of plots from the camera tower. These show that the error in height for soybean was 3.9-16.4% (1.4-4.4 cm) and for maize was 3.8-15.8% (4.8-12.4 cm) in the 2016/17 trial (Table 2A), and for maize was 0.7-6.9% (5.9-15.6 cm) in the 2018 trial (Table 2B). The error in height detection generally increased as the camera was further from the plot. The error in height detected for soybean was higher for the N-S tower than the S-N tower. In the 2016/17 season, the average error was 6.5%, 5.9%, 7.7% and 8.5% for bays one, two, three and four from the camera, respectively. There was no significant difference in height detected between the uniform crop and variety trial. In the 2018 season, the average error was 2.1%, 3.9%, 5.0% and 5.7% for bays one, two, three and four from the camera, respectively.
Table 1
Table 2A
Table 2B
Flowering
[0191] The date of the initial flowering was detected from the cameras, with results being shown in Table 3A for the 2016/17 season. These highlight that image analysis successfully determined initial flowering date for all plots for the 6 m tower and for 12 out of the 16 plots for the 10 m tower in the 2016/17 season.
Table 3A
[0192] The duration of flowering was estimated using the image analysis, with results being shown in Table 3B for the 2018 season. The image analysis determined initial flowering date and maximum flowering date for the plots on average within one day in the 2018 season.
Table 3B
[0193] The graphs shown in Figures 26A to 26D compare the detected flowering dates for different distances from the 6 m tower. These show that the proportion of detected flower pixels decreased as the plots were further from the camera. However, flowering was still detected in plots four bays (50 metres) from the camera.
Emergence rate
[0194] The graphs shown in Figures 27A to 27D show the emergence rate curve over the season for the maize plots. The plants/m measured on the germination date of 7 November 2016 were 2.7 and 3.6 plants/m for maize and soybean, respectively. This was within one plant per metre of the estimate obtained using image analysis for both crops.
Canopy cover
[0195] Tables 4 and 5 compare the canopy cover and ground truthing measurements for each camera tower and bay number from the camera. The error in canopy cover estimation was larger for maize (1.9-11.8%) than soybean (4.5-5.7%), primarily because the image analysis measured the percentage of detected green pixels while the field measurements were of the percentage width of the canopy in the crop row. Maize leaves are also thinner than soybean leaves and may have had gaps between them, leading to lower canopy cover estimates from the image analysis than from the field measurements.
[0196] There was no significant difference in error for the soybean or maize canopy cover estimation at different distances from the camera, between the 6 m and 10 m towers, or between the uniform crop and variety trials.
Table 4
Table 5
[0197] The increase in the crop canopy from emergence until closed canopy is shown for each bay from the camera in Figures 28A to 28D, and for each variety of maize and soybean in Figures 29A to 29D and 30A to 30D respectively.
[0198] Accordingly, it will be appreciated that the above described arrangements provide mechanisms to monitor plant growth features. Specifically, the mechanisms can be used to allow plant growth in plots to be monitored using a monoscopic camera operating in a substantially automated fashion. This avoids the need for more complex sensing arrangements, whilst allowing measurements to be accurately captured over the entire growing period.
[0199] Throughout this specification and claims which follow, unless the context requires otherwise, the word “comprise”, and variations such as “comprises” or “comprising”, will be understood to imply the inclusion of a stated integer or group of integers or steps but not the exclusion of any other integer or group of integers. As used herein and unless otherwise stated, the term “approximately” means ±20%.
[0200] It must be noted that, as used in the specification and the appended claims, the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to “a support” includes a plurality of supports. In this specification and in the claims that follow, reference will be made to a number of terms that shall be defined to have the following meanings unless a contrary intention is apparent.
[0201] It will of course be realised that whilst the above has been given by way of an illustrative example of this invention, all such and other modifications and variations hereto,
as would be apparent to persons skilled in the art, are deemed to fall within the broad scope and ambit of this invention as is herein set forth.
Claims
THE CLAIMS DEFINING THE INVENTION ARE AS FOLLOWS:
1) A method of monitoring plant growth features in a plot, the method including, in one or more electronic processing devices:
a) on each of a plurality of different time periods:
i) acquiring a plurality of images of the plot captured by an imaging device;
ii) determining image properties of the acquired images; and,
iii) selecting at least one image for analysis;
b) analysing selected images for the plurality of time periods; and,
c) using results of the analysis to monitor plant growth features.
2) A method according to claim 1, wherein the method includes performing at least one of:
a) a height analysis to determine a plant height;
b) a flowering analysis to determine a flowering status;
c) an emergence analysis to determine an emergence date;
d) a canopy analysis to determine a canopy extent;
e) a disease/pest analysis to determine a presence of disease/pests;
f) a status analysis to determine a plant status; and,
g) a condition analysis to determine growing conditions.
3) A method according to claim 1 or claim 2, wherein the method includes:
a) selecting a first image that is used to determine at least one of:
i) a plant height; and,
ii) a flowering status; and,
b) selecting a second image that is used to determine at least one of:
i) an emergence rate;
ii) a canopy cover;
iii) a presence of disease/pests; and,
iv) growing conditions.
4) A method according to claim 3, wherein the image properties include:
a) for the first image, at least one of:
i) a shutter speed closest to about 31.25 ms;
ii) a shutter speed of less than or equal to about 31.25 ms;
iii) an ISO of between 80 and 250; and,
iv) an ISO of about 64; and,
b) for the second image, at least one of:
i) a short exposure time;
ii) a low gain;
iii) a shutter speed closest to about 1.25 ms;
iv) a shutter speed of less than or equal to about 1.25 ms;
v) an ISO closest to about 50;
vi) an image captured closest to about 12 pm; and,
vii) an image having a largest file size.
5) A method according to any one of the claims 1 to 4, wherein the images are monoscopic images and the method includes performing a geometric analysis of the image to correct for perspective distortion.
6) A method according to any one of the claims 1 to 5, wherein the method includes:
a) performing a geometric analysis of image features in the selected image to determine a perspective projection model from measured plot parameters; and,
b) analysing the image using the perspective projection model.
7) A method according to claim 6, wherein the measured plot parameters include:
a) a number of rows and columns; and,
b) a plot length and width.
8) A method according to claim 6 or claim 7, wherein the image features include at least one of:
a) site vertices;
b) site edges;
c) plot vertices;
d) plot edges; and,
e) plot rows.
9) A method according to any one of the claims 6 to 8, wherein the method includes determining measurements of plant height from the image using the perspective projection model and the measured plot parameters.
10) A method according to claim 9, wherein the method includes:
a) analysing the selected image to identify:
i) plant locations on a ground plane; and,
ii) canopy features; and,
b) measuring the plant height based on the plant locations, the canopy features and the perspective projection model.
11) A method according to claim 10, wherein the method includes:
a) using the canopy features to determine a top of each plant; and,
b) measuring the plant height by measuring along a perspective line upwardly from the plant locations to a top of each plant using the perspective projection model.
12) A method according to claim 10 or claim 11, wherein the method includes identifying canopy features by tracking movement between successive selected images for at least part of the plot.
13) A method according to any one of the claims 10 to 12, wherein the method includes:
a) analysing the selected image to identify ground points; and,
b) using the ground points to determine the ground plane.
14) A method according to any one of the claims 6 to 13, wherein the method includes determining the perspective projection model using vanishing points, including:
a) a first vanishing point based on lines parallel to crop rows;
b) a second vanishing point based on lines perpendicular to crop rows; and,
c) a third vanishing point based on lines parallel to crop stems.
15) A method according to any one of the claims 6 to 14, wherein the method includes using a perspective correction model to identify a location of the plot in the selected image.
16) A method according to any one of the claims 1 to 15, wherein the method includes:
a) analysing the selected image to identify ground points;
b) using the ground points to identify movement of the camera relative to the plot between different images; and,
c) using movement of the camera to analyse selected images.
17) A method according to any one of the claims 1 to 16, wherein the method includes determining a flowering status by:
a) extracting a saturation channel for the selected image;
b) enhancing a contrast in the saturation channel; and,
c) identifying flowers using the contrast enhanced saturation channel by performing a connected component analysis.
18) A method according to claim 17, wherein the method includes determining a flowering status by:
a) extracting a saturation channel from HSB (hue, saturation and brightness) space to create a saturation channel image;
b) inverting the saturation channel image to create an inverted image;
c) applying a local equalisation to the inverted image to create a contrast enhanced image; and,
d) applying a colour threshold to the contrast enhanced image to create a binary image.
19) A method according to claim 17 or claim 18, wherein the method includes determining a flowering status by:
a) identifying connected components in the binary image for a plot;
b) counting a number of large connected components in the plot, the large connected components having a number of pixels equal to or greater than 5% of pixels in the binary image; and,
c) determining the plot is flowering if there is more than one additional large connected component detected in the plot than in a previous time period.
20) A method according to any one of the claims 1 to 19, wherein the method includes:
a) applying colour segmentation to the selected image to identify green pixels; and,
b) using a number of detected green pixels to determine at least one of:
i) an emergence status; and,
ii) a canopy cover status.
21) A method according to claim 20, wherein the method includes identifying green pixels when a green value, based on a sum of the red, green and blue channels, is greater than at least one of:
a) about 0.34; and,
b) about 0.31.
22) A method according to claim 20 or claim 21, wherein the method includes determining emergence if any green pixels are identified in the plot.
23) A method according to any one of the claims 20 to 22, wherein the method includes calculating a canopy cover based on a proportion of green pixels to total pixels for the plot.
24) A method according to any one of the claims 1 to 23, wherein the method includes:
a) applying colour segmentation to the selected image to identify pixels having a defined colour; and,
b) using a number of identified pixels to determine at least one of:
i) a presence of a disease or pests; and,
ii) growing conditions.
25) A method according to any one of the claims 1 to 24, wherein the method includes:
a) determining a plant profile indicative of expected plant growth features; and,
b) comparing monitored plant growth features to the plant profile to determine a plant status.
26) A method according to any one of the claims 1 to 25, wherein the method includes generating a notification if the monitored plant growth features fall outside expected plant growth feature ranges.
27) A method according to any one of the claims 1 to 26, wherein the method includes acquiring the images using a client device, and transferring the images to a remote server for analysis.
28) A method of monitoring plant growth features in a plot, the method including, in one or more electronic processing devices:
a) acquiring an image of the plot captured by an imaging device that generates monoscopic images;
b) performing a geometric analysis of image features to determine a perspective projection model from measured plot parameters; and,
c) analysing the image using the perspective projection model to monitor plant growth features.
29) A method according to claim 28, wherein the measured plot parameters include:
a) a number of rows and columns; and,
b) a plot length and width.
30) A method according to claim 28 or claim 29, wherein the image features include at least one of:
a) site vertices;
b) site edges;
c) plot vertices;
d) plot edges; and,
e) plot rows.
31) A method according to any one of the claims 28 to 30, wherein the method includes determining measurements of plant height from the image using the perspective projection model and the measured plot parameters.
32) A method according to claim 31, wherein the method includes:
a) analysing the selected image to identify:
i) plant locations on a ground plane; and,
ii) canopy features; and,
b) measuring the plant height based on the plant locations, the canopy features and the perspective projection model.
33) A method according to any one of the claims 28 to 32, wherein the method includes:
a) using the canopy features to determine a top of each plant; and,
b) measuring the plant height by measuring along a perspective line upwardly from the plant locations to a top of each plant using the perspective projection model.
34) A method according to any one of the claims 28 to 33, wherein the method includes identifying canopy features by tracking movement between successive selected images for at least part of the plot.
35) A method according to any one of the claims 28 to 34, wherein the method includes:
a) analysing the selected image to identify ground points; and,
b) using the ground points to determine the ground plane.
36) A method according to any one of the claims 28 to 35, wherein the method includes determining the perspective projection model using vanishing points, including:
a) a first vanishing point based on lines parallel to crop rows;
b) a second vanishing point based on lines perpendicular to crop rows; and,
c) a third vanishing point based on lines parallel to crop stems.
37) A method according to any one of the claims 28 to 36, wherein the method includes using a perspective correction model to identify a location of the plot in the selected image.
38) A system for monitoring plant growth features in a plot, the system including one or more electronic processing devices that:
a) on each of a plurality of different time periods:
i) acquire a plurality of images of the plot captured by an imaging device;
ii) determine image properties of the acquired images; and,
iii) select at least one image for analysis;
b) analyse selected images for the plurality of time periods; and,
c) use results of the analysis to monitor plant growth features.
39) A system according to claim 38, wherein the system performs the method of any one of the claims 1 to 27.
40) A system for monitoring plant growth features in a plot, the system including one or more electronic processing devices that:
a) acquire an image of the plot captured by an imaging device that generates monoscopic images;
b) perform a geometric analysis of image features to determine a perspective projection model from measured plot parameters; and,
c) analyse the image using the perspective projection model to monitor plant growth features.
41) A system according to claim 40, wherein the system performs the method of any one of the claims 28 to 37.
42) A system according to any one of the claims 38 to 41, wherein the imaging device is part of a smartphone.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201862691524P | 2018-06-28 | 2018-06-28 | |
| US62/691,524 | 2018-06-28 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020000043A1 true WO2020000043A1 (en) | 2020-01-02 |
Family
ID=68985304
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/AU2019/050670 Ceased WO2020000043A1 (en) | 2018-06-28 | 2019-06-27 | Plant growth feature monitoring |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2020000043A1 (en) |
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2013016603A1 (en) * | 2011-07-27 | 2013-01-31 | Dow Agrosciences Llc | Plant growth kinetics captured by motion tracking |
| US20170372137A1 (en) * | 2015-01-27 | 2017-12-28 | The Trustees Of The University Of Pennsylvania | Systems, devices, and methods for robotic remote sensing for precision agriculture |
| US20170286772A1 (en) * | 2016-04-04 | 2017-10-05 | BeeSprout LLC | Horticultural monitoring system |
| US20170323426A1 (en) * | 2016-05-05 | 2017-11-09 | The Climate Corporation | Using digital images of a first type and a feature set dictionary to generate digital images of a second type |
| GB2553631A (en) * | 2017-06-19 | 2018-03-14 | Earlham Inst | Data Processing of images of a crop |
Non-Patent Citations (2)
| Title |
|---|
| GEE, C. ET AL.: "Crop/weed discrimination in perspective agronomic images", COMPUTERS AND ELECTRONICS IN AGRICULTURE, vol. 60, no. 1, January 2008 (2008-01-01), pages 49 - 59, XP022384158, DOI: 10.1016/j.compag.2007.06.003 * |
| LLORET, J. ET AL.: "A wireless sensor network for vineyard monitoring that uses image processing", SENSORS, vol. 11, no. 6, 7 June 2011 (2011-06-07), pages 6165 - 6196, XP055669927 * |
Cited By (20)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111399515B (en) * | 2020-03-31 | 2020-11-13 | 连云港市水利学会 | Wetland environment electronic monitoring system based on natural factor disturbance |
| CN111399515A (en) * | 2020-03-31 | 2020-07-10 | 连云港市水利学会 | Wetland environment electronic monitoring system based on natural factor disturbance |
| CN112595714A (en) * | 2020-11-26 | 2021-04-02 | 中国农业科学院农业资源与农业区划研究所 | Tobacco nutrition state discrimination method based on mobile phone image analysis |
| CN112633107A (en) * | 2020-12-16 | 2021-04-09 | 广东省林业科学研究院 | Intelligent forestry monitoring method, device, equipment and storage medium |
| CN117794354A (en) * | 2021-06-10 | 2024-03-29 | Eto电磁有限责任公司 | Devices for identifying germination of sown seeds, agricultural sensor devices, and agricultural monitoring and/or agricultural control methods and systems |
| WO2022258653A1 (en) * | 2021-06-10 | 2022-12-15 | Eto Magnetic Gmbh | Device for detecting a sprouting of sown seeds, agricultural sensor device, and agricultural monitoring and/or control method and system |
| RU2832186C2 (en) * | 2021-06-10 | 2024-12-23 | Это Магнетик Гмбх | Sowing germination detection device, agricultural sensor device, as well as agricultural monitoring and/or control system |
| CN113643231A (en) * | 2021-06-24 | 2021-11-12 | 河南农业大学 | Crop emergence quality detection method based on depth image |
| CN113643231B (en) * | 2021-06-24 | 2024-04-09 | 河南农业大学 | Crop seedling emergence quality detection method based on depth image |
| CN115443890A (en) * | 2022-09-05 | 2022-12-09 | 郑州信息科技职业学院 | Landscape's wisdom irrigation management system |
| CN115443890B (en) * | 2022-09-05 | 2023-09-29 | 郑州信息科技职业学院 | A smart irrigation management system for garden landscapes |
| CN115661639A (en) * | 2022-10-08 | 2023-01-31 | 国网山西省电力公司 | A regional anomaly monitoring method based on edge computing |
| US12148207B1 (en) * | 2023-06-14 | 2024-11-19 | Zhejiang Lab | Method and system for intelligent identification of rice growth potential based on UAV monitoring |
| KR20240080180A (en) * | 2024-02-20 | 2024-06-05 | 전남대학교산학협력단 | Device and operating method thereof for learning and inferring for each variety of kalanchoe |
| KR102692617B1 (en) * | 2024-02-20 | 2024-08-07 | 전남대학교산학협력단 | Device and operating method thereof for learning and inferring for each variety of kalanchoe |
| KR102707475B1 (en) * | 2024-02-20 | 2024-09-19 | 전남대학교산학협력단 | Device and operating method thereof for learning and inferring for each variety of kalanchoe |
| WO2025178180A1 (en) * | 2024-02-20 | 2025-08-28 | 전남대학교산학협력단 | Electronic device for learning and predicting quality factor for each variety of kalanchoe and operating method thereof |
| CN118397075A (en) * | 2024-06-24 | 2024-07-26 | 合肥工业大学 | Calculation method of mountain forest effective leaf area index based on fisheye camera |
| CN118735713A (en) * | 2024-08-09 | 2024-10-01 | 深圳市启明云端科技有限公司 | A method, device and system for intelligent management of animal husbandry information |
| CN120635728A (en) * | 2025-08-12 | 2025-09-12 | 浙江托普云农科技股份有限公司 | Vision-based plant canopy coverage calculation method, system and device |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2020000043A1 (en) | Plant growth feature monitoring | |
| Dorj et al. | An yield estimation in citrus orchards via fruit detection and counting using image processing | |
| US10614562B2 (en) | Inventory, growth, and risk prediction using image processing | |
| Gan et al. | Immature green citrus fruit detection using color and thermal images | |
| US20230292647A1 (en) | System and Method for Crop Monitoring | |
| RU2735151C2 (en) | Weeds identification in natural environment | |
| Vega et al. | Multi-temporal imaging using an unmanned aerial vehicle for monitoring a sunflower crop | |
| RU2764872C2 (en) | Weed detection in natural environment | |
| Yu et al. | Automatic image-based detection technology for two critical growth stages of maize: Emergence and three-leaf stage | |
| CN119128743B (en) | A decision-making method and system for multi-source information fusion | |
| Kawamura et al. | Discriminating crops/weeds in an upland rice field from UAV images with the SLIC-RF algorithm | |
| González-Esquiva et al. | Development of a visual monitoring system for water balance estimation of horticultural crops using low cost cameras | |
| Wang et al. | Side-view apple flower mapping using edge-based fully convolutional networks for variable rate chemical thinning | |
| Zhang et al. | Automatic flower cluster estimation in apple orchards using aerial and ground based point clouds | |
| Veramendi et al. | Method for maize plants counting and crop evaluation based on multispectral images analysis | |
| Syal et al. | A survey of computer vision methods for counting fruits and yield prediction | |
| Subeesh et al. | UAV imagery coupled deep learning approach for the development of an adaptive in-house web-based application for yield estimation in citrus orchard | |
| Castillo-Villamor et al. | The Earth Observation-based Anomaly Detection (EOAD) system: A simple, scalable approach to mapping in-field and farm-scale anomalies using widely available satellite imagery | |
| CN115861686A (en) | Litchi key growth period identification and detection method and system based on edge deep learning | |
| McCarthy et al. | Automated variety trial plot growth and flowering detection for maize and soybean using machine vision | |
| Swarup et al. | Strawberry plant wetness detection using color and thermal imaging | |
| Betitame et al. | A practical guide to UAV-based weed identification in soybean: Comparing RGB and multispectral sensor performance | |
| Yadav et al. | Supervised learning based greenery region detection using unnamed aerial vehicle for smart city application | |
| AHM et al. | A deep convolutional neural network based image processing framework for monitoring the growth of soybean crops | |
| Alexandridis et al. | LAI measurement with hemispherical photographs at variable conditions for assessment of remotely sensed estimations |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19825407; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 19825407; Country of ref document: EP; Kind code of ref document: A1 |