US20240295954A1 - Systems and Methods for Providing Field Views Including Enhanced Agricultural Maps Having a Data Layer and Image Data - Google Patents
- Publication number
- US20240295954A1 (application US 18/555,897)
- Authority
- US
- United States
- Prior art keywords
- field
- map
- agricultural
- data
- user interface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01B—SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
- A01B79/00—Methods for working soil
- A01B79/005—Precision agriculture
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01C—PLANTING; SOWING; FERTILISING
- A01C7/00—Sowing
- A01C7/08—Broadcast seeders; Seeders depositing seeds in rows
- A01C7/10—Devices for adjusting the seed-box ; Regulation of machines for depositing quantities at intervals
- A01C7/102—Regulating or controlling the seed rate
- A01C7/105—Seed sensors
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04817—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
- Embodiments of the present disclosure relate generally to systems and methods for providing field views including enhanced agricultural maps having a data layer and image data.
- Planters are used for planting seeds of crops (e.g., corn, soybeans) in a field.
- Some planters include a display monitor within a cab for displaying a coverage map that shows regions of the field that have been planted.
- The coverage map of the planter is generated based on planting data collected by the planter. A farmer or grower interprets the coverage map during planting to attempt to understand field conditions.
- FIG. 1 shows an example of a system for collecting data of agricultural fields and performing analysis of the data of agricultural fields in accordance with one embodiment
- FIG. 2 illustrates an architecture of an implement 200 for delivering applications (e.g., fluid applications, fluid mixture applications) to agricultural fields in accordance with one embodiment
- FIG. 3 illustrates a flow diagram of one embodiment for a method of customizing field views of agricultural fields with enhanced maps
- FIG. 4 illustrates a monitor or display device having a user interface 401 with a split screen view that includes a map of a data layer and an overview image in accordance with one embodiment
- FIG. 5 illustrates a monitor or display device having a user interface 501 with a split screen view that includes an enhanced map of a data layer with icons and an overview image in accordance with one embodiment
- FIG. 6 illustrates a monitor or display device having a user interface 601 with a split screen view that includes an enhanced map of a data layer with icons and an overview image in accordance with another embodiment
- FIG. 7 illustrates a user interface 701 with map 710 and associated scale region 720 , and an image 750 for the selected icon 740 ;
- FIG. 8 illustrates a user interface 801 with map 710 and zoomed image 850 to show more details of the crops, weeds, and soil conditions at the geographical location for the selected icon 740;
- FIG. 9 illustrates the user interface 901 having an image 950 for the selected icon 940 ;
- FIG. 10 illustrates a zoomed image 951 to show more details of the crops, weeds, and soil conditions at the geographical location for the selected icon 940 ;
- FIG. 11 illustrates a monitor or display device having a user interface 1101 with a split screen view that includes icons overlaid on an overview image 1110 of a field and also the image 951 for a selected icon 940 in accordance with another embodiment;
- FIG. 12 illustrates a monitor or display device having a user interface 1201 with a split screen view that includes maps of different data layers in accordance with one embodiment
- FIG. 13 illustrates a monitor or display device having a user interface 1301 with a split screen view that includes maps of different data layers in accordance with one embodiment
- FIG. 14 illustrates a monitor or display device having a user interface 1401 with a split screen view that includes an enhanced map of a data layer with icons and an overview image in accordance with one embodiment
- FIG. 15 illustrates a user interface 1501 ;
- FIG. 16 illustrates a monitor or display device having a user interface 1601 with a split screen view that includes maps of different data layers in accordance with one embodiment
- FIG. 17 illustrates a monitor or display device having a user interface 1701 with a split screen view that includes an enhanced map of a data layer with icons and an overview image in accordance with one embodiment
- FIG. 18 illustrates that if an icon 1735 is selected from map 1710 , then an image 1850 is displayed;
- FIG. 19 shows an example of a system 2700 that includes a machine 2702 (e.g., tractor, combine harvester, etc.) and an implement 2740 (e.g., planter, cultivator, plough, sprayer, spreader, irrigation implement, etc.) in accordance with one embodiment;
- FIGS. 20 A and 20 B illustrate a flow diagram of one embodiment for a computer implemented method of measuring and quantifying crop emergence uniformity within an agricultural field
- FIG. 21 illustrates an example of a predetermined mask to select portions of an image in accordance with one embodiment
- FIG. 22 illustrates a sample predicted output from a deep learning model (DLM) in accordance with one embodiment
- FIG. 23 illustrates an example of a moving average of vegetation intensity or a weighted average of vegetation intensity along a row in accordance with one embodiment
- FIG. 24 illustrates a weighted value diagram 2400 that can be applied to the entire one or more rows of crops in the event that the image plane (e.g., forward looking image plane with upward tilt) of a sensor (e.g., camera) is not coplanar with the ground surface in accordance with one embodiment.
- Described herein are systems and methods for providing enhanced field views including enhanced maps based on capturing images of crops during different stages.
- a computer implemented method for customizing field views of data displays that comprises obtaining a data layer for an agricultural parameter from sensors of an agricultural implement or machine during an application pass for a field and generating a user interface with an enhanced map that includes the data layer for the agricultural parameter and generating selectable icons overlaid at different geographic locations on the enhanced map for the field with the selectable icons representing captured images at the different geographic locations.
- Each selectable icon represents an image from the field to show crop, weed, or soil conditions.
- the user interface that comprises a split screen view with the enhanced map on a first side of the split screen view and an overview image of the field on a second side of the split screen view.
- a further aspect of the disclosure provides the agricultural implement that comprises a planter, sprayer, or irrigation implement having row units with each row unit having a sensor for capturing images to obtain the data layer.
- a further aspect of the disclosure further comprises displaying the user interface with the enhanced map on the display device, receiving a user input to select an icon of the enhanced map, and generating an updated user interface with the enhanced map and an image that is associated with the selected icon changing color on the enhanced map.
- a further aspect of the disclosure provides the image that is displayed as a pop up window or over an overview image of the field.
- a further aspect of the disclosure includes the enhanced map that provides an ability to select icons throughout the field to show actual captured images of crops, weeds, and conditions of soil of the field.
- a further aspect of the disclosure includes the selectable icons that are generated and overlaid at different geographic locations on the enhanced map for the field based on a spatial trigger to capture an image during the application pass per unit area within the field, a threshold trigger for when an agricultural parameter exceeds a threshold for the agricultural parameter, a time based trigger for capturing images, or a burst capture of images.
- a further aspect of the disclosure includes the selectable icons that are generated and overlaid at different geographic locations on the enhanced map for the field based on a threshold trigger including a weed density exceeding a threshold trigger for weed density or an emergence value exceeding a threshold trigger for emergence data.
- a further aspect of the disclosure provides the agricultural parameter that comprises one or more of seed data, commanded planter seed population, actual seed population determined from a seed sensor, a seed population deviation, singulation data, weed map, emergence data, emergence map, emergence environment score based on a combination of temperature and moisture correlated to how long a seed takes to germinate, emergence environment score based on a percentage of seeds planted that will germinate within a selected number of days, time to germination, time to emergence, and seed germination risk.
- a computing device that comprises a display device for displaying a user interface having a scale region and a field region for an agricultural parameter and a processor coupled to the display device.
- the processor is configured to generate a data layer for the agricultural parameter from sensors of an agricultural implement that collects the data during an application pass for a field, to generate the user interface with an enhanced map that includes the data layer for the agricultural parameter, and to generate selectable icons or symbols overlaid at different geographic locations on the enhanced map for the field with the selectable icons representing captured images at the different geographic locations.
- a further aspect of the disclosure includes the user interface that further comprises a split screen view with the enhanced map on a first side of the split screen view and an overview image of the field on a second side of the split screen view.
- a further aspect of the disclosure includes the agricultural implement that comprises a planter, sprayer, or irrigation implement having row units with each row unit having a sensor for capturing images to obtain the data layer.
- a further aspect of the disclosure includes the display device to display the user interface with the enhanced map and to receive a user input to select an icon of the enhanced map, wherein the processor is configured to generate an updated user interface with the enhanced map and an image that is associated with a selected icon or symbol based on the user input with the selected icon or symbol changing color.
- a further aspect of the disclosure includes the updated user interface to provide a selectable orientation option to rotate an orientation of the images of the user interface, a selectable expand option to control sizing of a displayed map in a field region, a selectable icon or symbol option to enable or disable showing icons or symbols on the enhanced map, a selectable full map option to switch between a full screen view of map versus a split screen view having both of a map and an overview image, and a selectable statistics option to show statistics for the data layer.
- a further aspect of the disclosure includes the display device to receive a user input to modify the scale region and to display a modified scale region and a corresponding modified field region.
- a computer implemented method for customizing field views of a field region comprises obtaining a data layer for an agricultural parameter from sensors of an agricultural implement that collects data during an application pass for a field and generating selectable icons and overlaying the selectable icons at different geographic locations on an enhanced map of the data layer for the field based on a spatial trigger to capture an image during the application pass per unit area or when an agricultural parameter compares in a predetermined manner to a threshold trigger for the agricultural parameter.
- a further aspect of the disclosure further comprises comparing the agricultural parameter to the threshold trigger, determining whether the agricultural parameter exceeds the threshold trigger for a location within the field, and generating a selectable icon when the agricultural parameter exceeds the threshold trigger for the location within the field.
- a further aspect of the disclosure includes the threshold trigger that comprises a weed threshold that is compared to a weed density.
- a further aspect of the disclosure includes the threshold trigger that comprises an emergence threshold that is compared to an emergence value for plant emergence data.
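As a rough illustration of this threshold-trigger aspect, the Python sketch below generates an icon only when a parameter exceeds its trigger. The `FieldIcon` record, its field names, and the 0.80 weed-density threshold are assumptions of this sketch, not details from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class FieldIcon:
    """Hypothetical record for a selectable icon overlaid on the enhanced map."""
    lat: float
    lon: float
    image_path: str

def maybe_create_icon(parameter_value, threshold, lat, lon, image_path):
    """Generate an icon only when the agricultural parameter exceeds its
    threshold trigger at this location (e.g., weed density, emergence value)."""
    if parameter_value > threshold:
        return FieldIcon(lat, lon, image_path)
    return None

# Weed density of 0.85 exceeds an assumed 0.80 trigger, so an icon is created.
print(maybe_create_icon(0.85, 0.80, 41.59, -93.62, "weeds_0042.jpg"))
```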
- a further aspect of the disclosure further comprises displaying a user interface with the enhanced map that includes the data layer for the agricultural parameter and the selectable icons overlaid at different geographic locations on the enhanced map for the field.
- a further aspect of the disclosure includes the agricultural implement that comprises a planter, sprayer, or irrigation implement having row units with each row unit having a sensor for capturing images to obtain the data layer.
- a computer implemented method for measuring and quantifying crop emergence uniformity within an agricultural field.
- the method comprises obtaining one or more images of biomass data for a region of interest within the agricultural field from one or more sensors of an agricultural implement or machine, which can be traversing the field to obtain the biomass data for various crop stages or for an application pass.
- the computer implemented method partitions a captured image into tiles, provides the tiles to a deep learning model to provide modeled tiles with predicted pixel values (e.g., 1 for targeted vegetation, 0 for weeds or other non-targeted vegetation) and reassembles the modeled tiles on a per tile basis to display the targeted type of vegetation in dimensionality of the original one or more images.
- a further aspect of the disclosure includes applying a predetermined mask to select portions of the one or more images that correspond with the targeted type of vegetation pixels.
- a further aspect of the disclosure includes accumulating the targeted type of vegetation pixels to create one or more rows of crops (e.g., vertical lines) corresponding to vegetation intensity.
- a further aspect of the disclosure includes applying a filter (e.g., one-dimensional filter) with a length corresponding to the spacing in pixels between individual plants of the targeted type of plants along a row of plant intensity (e.g., vertical line of plant intensity) to determine a simple moving average or a weighted average of vegetation intensity for the targeted type of plants.
- a further aspect of the disclosure includes applying upper and lower thresholds to the simple moving average or a weighted average of vegetation intensity along the one or more rows of crops (one or more vertical lines) and determining a targeted plant uniformity based on the simple moving average or a weighted average of vegetation intensity of the targeted type of plants and the thresholding for lower (minimum) and upper (maximum) vegetation intensity.
- the portion of the crop row (or vertical line) meeting both thresholding criteria can represent an emergence score between 0 and 100%.
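The tile/model/reassemble/filter/threshold pipeline above can be sketched end to end as follows. This is a hedged illustration only: the tile size, column band, plant spacing, and intensity thresholds are assumed values, and `predict_tile` is a placeholder standing in for the deep learning model.

```python
import numpy as np

TILE = 128            # tile edge in pixels (assumed)
SPACING_PX = 40       # plant-to-plant spacing along the row, in pixels (assumed)
LO, HI = 0.2, 0.8     # lower/upper vegetation-intensity thresholds (assumed)

def predict_tile(tile):
    """Placeholder for the deep learning model: returns 1 for targeted
    vegetation pixels and 0 for weeds or other non-targeted vegetation."""
    return (tile > tile.mean()).astype(np.uint8)

def segment(image):
    """Partition the image into tiles, model each tile, and reassemble the
    modeled tiles in the dimensionality of the original image."""
    mask = np.zeros(image.shape, dtype=np.uint8)
    for r in range(0, image.shape[0], TILE):
        for c in range(0, image.shape[1], TILE):
            mask[r:r+TILE, c:c+TILE] = predict_tile(image[r:r+TILE, c:c+TILE])
    return mask

def emergence_score(image, col_lo, col_hi):
    mask = segment(image)
    # Accumulate targeted-vegetation pixels across the band of columns that
    # holds one (vertical) crop row: vegetation intensity along the row.
    intensity = mask[:, col_lo:col_hi].mean(axis=1)
    # Simple moving average with a window matched to plant spacing.
    kernel = np.ones(SPACING_PX) / SPACING_PX
    smoothed = np.convolve(intensity, kernel, mode="same")
    # The portion of the row meeting both thresholds gives a 0-100% score.
    ok = (smoothed >= LO) & (smoothed <= HI)
    return 100.0 * ok.mean()

print(emergence_score(np.random.rand(512, 512), 240, 272))
```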
- Described herein are systems and methods for customizing views of visualized data (such as from agricultural fields for weed maps during different crop stages, crop emergence, etc.) based on sensors of agricultural implements or machines.
- At least one of an implement, a machine, an agricultural vehicle, an aerial device, a drone, a self-propelled device (e.g., robot, off-road vehicle, ATV, UTV), an electronic device, or a mobile device having sensors (e.g., image capturing devices) collects agricultural data before, during, or after an application pass.
- the agricultural data may include a data layer that is mapped as a field view on a monitor or display device and image data that overlays the data layer to enhance a user experience in viewing and understanding the agricultural data.
- Icons (e.g., camera icon, image icon) are overlaid on the map at geographic locations where the implement, a machine, an agricultural vehicle, or an aerial device with sensors (e.g., a camera or set of cameras) captured images of regions of the field.
- the captured images are used for weed identification or for crop emergence.
- When an operator selects (e.g., user input, touch input) an icon from the monitor or display device, the image from that geographic location in the field is displayed either as a pop up window over the map or in a side by side view with the map.
- the icon can change color to indicate which icon was selected.
- the image data can be overlaid, associated, merged, or combined with the data layer for a field view.
- the user can customize (e.g., change, expand, pan) a scale of a parameter for a sub region (e.g., scale region) of a user interface, and a corresponding field view of an agricultural field automatically changes in response to the customized change, providing a customized view of the parameter displayed in the field view.
- the user does not need to manually adjust the field view because this adjustment occurs automatically upon adjusting the scale region.
- expand can refer to both a positive expansion and a negative expansion (contraction).
- FIG. 1 shows an example of a system for collecting and analyzing agricultural data from agricultural fields in order to display customized agricultural data in accordance with one embodiment.
- Machines and implements of the system 100 perform agricultural operations (e.g., planting seed in a field, applying fluid applications to plants) of agricultural fields.
- the system 100 may be implemented as a cloud based system with servers, data processing devices, computers, etc.
- Aspects, features, and functionality of the system 100 can be implemented in servers, wireless nodes, planters, planter monitors, sprayers, sidedress bars, combines, laptops, tablets, computer terminals, client devices, user devices, handheld computers, personal digital assistants, cellular telephones, cameras, smart phones, mobile phones, computing devices, or a combination of any of these or other data processing devices.
- the system 100 can include a network computer or an embedded processing device within another device (e.g., display device), an implement, or within a machine (e.g., tractor cab, agricultural vehicle), or other types of data processing systems having fewer components or perhaps more components than that shown in FIG. 1 .
- agricultural operations, such as planting and fluid applications, can be controlled and monitored using an implement or machine.
- the system 100 includes machines 140 , 142 , 144 , 146 and implements 141 , 143 , 145 coupled to a respective machine 140 , 142 , and 144 .
- the implements can include flow devices for controlling and monitoring applications (e.g., seeding, spraying, fertilization) of crops and soil within associated fields (e.g., fields 103 , 107 , 109 ).
- the system 100 includes an agricultural analysis system 102 that can include a weather store 150 with current and historical weather data, weather predictions module 152 with weather predictions for different regions, and at least one processing system 132 for executing instructions for controlling and monitoring different operations (e.g., fluid applications).
- the storage medium 136 may store instructions, software, software programs, etc. for execution by the processing system and for performing operations of the agricultural analysis system 102 .
- storage medium 136 may contain a fluid application prescription (e.g., fluid application prescription that relates georeferenced positions in the field to application rates).
- the implement 141 (or any of the implements) may include a pump, flow sensors, and/or flow controllers, which may be the specific elements in communication with the network 180 for sending control signals or receiving as-applied data.
- the network 180 (e.g., any wireless network, any cellular network (e.g., 4G, 5G), Internet, wide area network, WiMax, satellite, IP network, etc.) allows the system 102, wireless nodes, machines, and implements of FIG. 1 to communicate with each other.
- a monitor preferably includes a graphical user interface (“GUI”), a memory, a central processing unit (“CPU”), and a bus node.
- the bus node preferably comprises a controller area network (“CAN”) node including a CAN transceiver, a controller, and a processor.
- the monitor is preferably in electrical communication with a speed sensor (e.g., a radar speed sensor mounted to a tractor) and a global positioning system ("GPS") receiver mounted to the tractor (or in some embodiments to a toolbar of an implement).
- a monitor A of a first machine collects as-applied data at various points in the field.
- the first machine may be coupled to the agricultural implement, causing the agricultural implement to traverse the field.
- the as-applied data can be seeding information, such as percent singulation, skips, multiples, downforce, applied fluids, depth measurements, agronomic measurements, and anything else that is collected.
- the as-applied data is collected and stored in a monitor data file of the monitor A; field boundary and prescriptions are embedded into the data file.
- File transfer from monitor A of the first machine to monitor B of a second machine can be accomplished through any data exchange, such as saving the file to a USB stick, via cloud exchange, or by direct vehicle to vehicle communications network.
- the first machine and the second machine are communicatively coupled to the network 180 and one or more files are transferred from the monitor A to the monitor B via the network 180 .
- Data recorded by monitor A at one location can be used to influence control of monitor B in other locations or the same location during a different application pass. For instance, when seeds are dropped, spatial data indicates that seeds have been applied (or covered) in that area. That coverage information can then be used by monitor B as the equipment traverses the field for a different application to instruct the control modules when to turn on or off. This information is used to automatically control the equipment. Many data channels exist that are mapped spatially to be viewed by the operator. In many cases, this data is not used by the monitor to automatically control itself while the equipment traverses the field. However, the operator is influenced by this information, and the operator may choose to operate the equipment in a different way based on data from previous field passes and his present location in the field. Sharing data between equipment can either influence the automatic control of the equipment, or it influences the operator, who then controls the equipment differently.
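As a rough sketch of how monitor B might use monitor A's coverage data to turn a control module on or off, consider the following. Storing coverage as coarse grid cells is an assumption of this sketch; real monitors record as-applied geometry in their own formats.

```python
CELL_FT = 10.0                             # grid resolution, assumed
covered_cells = {(0, 0), (0, 1), (1, 0)}   # cells monitor A already planted

def cell(x_ft, y_ft):
    """Map a field position to its coverage-grid cell."""
    return (int(x_ft // CELL_FT), int(y_ft // CELL_FT))

def section_on(x_ft, y_ft):
    """Monitor B turns the section off over ground monitor A already covered."""
    return cell(x_ft, y_ft) not in covered_cells

print(section_on(5.0, 5.0))    # False: position falls in covered cell (0, 0)
print(section_on(35.0, 5.0))   # True: uncovered ground, keep applying
```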
- FIG. 2 illustrates an architecture of an implement 200 for delivering applications (e.g., fluid applications, fluid mixture applications) to agricultural fields in accordance with one embodiment.
- the implement 200 includes at least one storage tank 250 , flow lines 260 and 261 , a flow controller 252 (e.g., valve), and at least one variable-rate pump 254 (e.g., electric, centrifugal, piston, etc.) for pumping and controlling application rate of a fluid (e.g., fluid application, semifluid mixture) from the at least one storage tank to different application units 210 - 217 , respectively of the implement.
- At least one flow sensor 270 can be utilized on the implement 200 either row-by-row or upstream of where the fluid branches out to the application units as illustrated in FIG. 2 .
- the flow controller 252 can be row-by-row as opposed to implement-wide as shown in FIG. 2 .
- the application units are mechanically coupled to the frames 220-227, which are mechanically coupled to a bar 10.
- Each application unit 210 - 217 can include flow sensors and components having a placement mechanism (e.g., planting contacting members, feelers, guidance members) for obtaining a proper orientation and/or positioning of a fluid outlet with respect to a plant in an agricultural field.
- FIG. 3 illustrates a flow diagram of one embodiment for a method 300 of providing enhanced field views of data displays based on vision scouting of crops, weeds, and field conditions.
- the method 300 is performed by processing logic that may comprise hardware (circuitry, dedicated logic, etc.), software (such as is run on a general purpose computer system or a dedicated machine or a device), or a combination of both.
- the method 300 is performed by processing logic of a processing system of a system 102 , machine, apparatus, implement, agricultural vehicle, aerial device, monitor, display device, user device, self-guided device, or self-propelled device (e.g., robot, ATV, UTV, etc.).
- the processing system executes instructions of a software application or program with processing logic.
- the software application or program can be initiated by the processing system.
- a monitor or display device receives user input and provides a customized display for operations of the method 300 .
- a software application is initiated on the processing system and displayed on a monitor or display device as a user interface.
- the processing system may be integrated with or coupled to a machine that performs an application pass (e.g., planting, tillage, fertilization, spraying, etc.).
- the processing system may be integrated with an apparatus (e.g., drone, image capture device) associated with the machine that captures images before, during, or after the application pass.
- the user interface includes a map of a data layer (e.g., seed data, commanded planter seed population, actual seed population determined from a seed sensor, a seed population deviation, singulation data, weed map, emergence data, emergence map, emergence environment score based on a combination of temperature and moisture correlated to how long a seed takes to germinate, emergence environment score based on a percentage of seeds planted that will germinate within a selected number of days, time to germination, time to emergence, seed germination risk) for a field of interest and an overview image of the field of interest.
- Seed germination risk can be germination/emergence (no germination/emergence, on-time germination/emergence, or late germination/emergence) or factors other than time, such as deformities, damaged seed, reduced vigor, or disease. Seed germination risk can be high, medium, or low, or it can be on-time emergence, late emergence, or no emergence.
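A minimal sketch of bucketing seed germination risk into the on-time/late/no-emergence categories named above; the day cutoffs here are illustrative assumptions, not values from the disclosure.

```python
def germination_risk(days_to_emergence, on_time_by=7, late_by=12):
    """Classify emergence timing; None means the seed never emerged.
    Day cutoffs are assumed for illustration."""
    if days_to_emergence is None or days_to_emergence > late_by:
        return "no emergence"
    if days_to_emergence <= on_time_by:
        return "on-time emergence"
    return "late emergence"

print(germination_risk(6))     # on-time emergence
print(germination_risk(10))    # late emergence
print(germination_risk(None))  # no emergence
```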
- the data layer can be generated from data collected by sensors on an implement, a machine pulling the implement during a current application pass, an aerial device, a user device, a self-guided device, a self-propelled device, etc., or the data layer can be generated from a previous application pass through the field.
- the sensors may be in-situ sensors positioned on each row unit of an implement, spaced across several row units, or positioned on a machine.
- the software application receives user input, and generates an updated user interface that is displayed with the monitor or display device.
- the updated user interface is generated based on the user input and may include an enhanced map of the data layer and optionally the overview image of the field.
- the enhanced map includes the data layer and also icons or symbols to represent captured images at different georeferenced positions across the field.
- the icons (e.g., camera icons, image icons) or symbols can be positioned spatially at a certain approximate distance from each other within the field based on a user defined spatial or grid based input.
- the icons or symbols can be positioned on a field view based on a threshold trigger that is compared to an agricultural parameter (e.g., agricultural parameter is less than, equal to, or exceeds a threshold value for the agricultural parameter; weed pressure or density is less than, equal to, or exceeds a threshold trigger; emergence value is less than, equal to, or exceeds a threshold trigger, etc.) for the data layer at different locations within a field.
- the icons or symbols can be positioned based on a user defined time period or predetermined time period (e.g., capture 1 image every 10 seconds, capture 1 image every 1 minute). In another example, at operation 306 , the icons or symbols can be positioned based on a burst capture of images at certain locations within the field.
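The spatial, threshold, and time-based triggers of operation 306 can be combined into one capture decision, sketched below. The class name, parameter names, and default values are assumptions of this sketch; a burst capture would simply fire several of these decisions in quick succession.

```python
import time

class CaptureTrigger:
    """Illustrative logic deciding when a row-unit camera captures an image."""
    def __init__(self, acres_per_image=1.0, threshold=0.8, period_s=10.0):
        self.acres_per_image = acres_per_image  # spatial: one image per N acres
        self.threshold = threshold              # threshold trigger value
        self.period_s = period_s                # time based: one image per period
        self._acres = 0.0
        self._last_t = time.monotonic()

    def should_capture(self, acres_since_last_call, parameter_value):
        now = time.monotonic()
        self._acres += acres_since_last_call
        if self._acres >= self.acres_per_image:   # spatial trigger
            self._acres = 0.0
            return True
        if parameter_value > self.threshold:      # threshold trigger
            return True
        if now - self._last_t >= self.period_s:   # time-based trigger
            self._last_t = now
            return True
        return False

trigger = CaptureTrigger()
print(trigger.should_capture(0.1, 0.85))  # True: parameter over threshold
```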
- the software application receives a user input (e.g., touch user input) to select an icon or symbol of the enhanced map of the data layer.
- the software application generates an updated user interface based on the user input with the selected icon changing color.
- the monitor or display device displays the updated user interface including the enhanced map having the selected icon changing color and an image of the selected icon being displayed as a pop up window or over the overview image of the field.
- the user experience and understanding of a color map of the data layer is improved by being able to select icons or symbols throughout a field to show actual captured images of crops, weeds, and conditions of the soil of the field in combination with the map of the data layer.
- the software application may optionally receive additional user input (e.g., expand (positive expansion, negative expansion or contraction), panning operation) to modify a scale of the scale region for the agricultural parameter.
- a scale of the scale region for an agricultural parameter can be modified, for example, from 0 to 100 percent to 20 to 50 percent based on the user input.
- the displayed field region of the enhanced map is modified in a corresponding manner as the modified scale region and will only show values between 20 and 50 percent for this example.
- the software application generates a modified scale region and also a modified field region based on the additional user input.
- U.S. Pat. No. 10,860,189 which is incorporated by reference herein, describes how to generate a modified scale region and also a modified field region based on the user input.
- the monitor or display device displays the modified scale region and the corresponding modified field region.
- the operations 312 , 314 , and 316 can be repeated if additional user input for modifying the scale region is received by the software application.
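A hedged sketch of the scale modification in operations 312-316: narrowing the scale from 0-100% to 20-50% hides out-of-range cells and stretches the color ramp over the new range. Masking with NaN is an assumption of this sketch, not the disclosed implementation.

```python
import numpy as np

def apply_scale(values, lo=20.0, hi=50.0):
    """Mask field-region values outside the modified scale and normalize
    the remainder to 0..1 for the color ramp."""
    v = np.asarray(values, dtype=float)
    shown = np.where((v >= lo) & (v <= hi), v, np.nan)  # hide out-of-range cells
    return (shown - lo) / (hi - lo)

print(apply_scale([10, 25, 35, 50, 80]))  # -> [nan, ~0.167, 0.5, 1.0, nan]
```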
- the user input can include first expand operation (e.g., pinch motion with 2 user input points contacting the scale region and moving towards each other to expand in (or contract), e.g., 1 finger and 1 thumb or 2 fingers), a second expand operation (e.g., expand with 2 user input points contacting the scale region moving away from each other to expand out), a first panning operation (e.g., panning with 1 user input point contacting the scale region and moving upwards (or downwards), e.g. 1 finger or 1 thumb), or a second panning operation (e.g., panning with 1 user input point contacting scale region and moving downwards (or upwards), e.g. 1 finger or 1 thumb).
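The four gestures above could be distinguished roughly as follows; the coordinate convention (y grows downward) and the two-sample input are assumptions of this sketch.

```python
def classify_gesture(start, end):
    """start/end: lists of (x, y) touch points at gesture start and end."""
    if len(start) == 2:  # two contact points: expand in (contract) or expand out
        d0 = abs(start[0][1] - start[1][1])
        d1 = abs(end[0][1] - end[1][1])
        return "expand in (contract)" if d1 < d0 else "expand out"
    dy = end[0][1] - start[0][1]  # one contact point: panning
    return "pan down" if dy > 0 else "pan up"

print(classify_gesture([(0, 10), (0, 90)], [(0, 30), (0, 70)]))  # expand in
print(classify_gesture([(0, 40)], [(0, 10)]))                    # pan up
```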
- the operations of the method(s) disclosed herein can be altered, modified, combined, or deleted.
- the methods in embodiments of the present disclosure may be performed with a device, an apparatus, or data processing system as described herein.
- the device, apparatus, or data processing system may be a conventional, general-purpose computer system; special purpose computers, which are designed or programmed to perform only one function, may also be used.
- FIG. 4 illustrates a monitor or display device having a user interface 401 with a split screen view that includes a map of a data layer and an overview image in accordance with one embodiment.
- Processing logic executes instructions of an initiated software application (e.g., field application) of a processing system to generate the user interface 401 that is displayed by the monitor or display device.
- the software application can provide different display regions that are selectable by a user.
- the map 410 (e.g., a weed map) shows a weed data layer across a field.
- the overview image 450 shows an overview of the field and has a scale region 460 .
- the user interface 401 includes a selectable orientation option 480 to rotate an orientation of the images of the user interface with respect to a true North direction, a selectable plus/minus zoom option 481 , a selectable pinch to zoom option 482 , a selectable expand option 483 to control sizing of a displayed map in a field region, a selectable icon option 484 to enable or disable showing image icons or symbols on the map 410 , a selectable full map option 485 to switch between different viewing options (e.g., a full screen view of map 410 , a split screen view having both map 410 and overview image 450 , a split screen view having an image with no map, etc.) and a selectable statistics option 486 to show statistics (e.g., bar charts, numerical data, histograms, number of acres of a field having weed pressure or density that exceeds a threshold) for the data layer or the weed data of the weed pressure or weed density.
- FIG. 5 illustrates a monitor or display device having a user interface 501 with a split screen view that includes an enhanced map of a data layer with icons and an overview image in accordance with one embodiment.
- Processing logic executes instructions of an initiated software application (e.g., field application) of a processing system to generate the user interface 501 that is displayed by the monitor or display device.
- the software application can provide different display regions that are selectable by a user.
- the enhanced map 510 (e.g., enhanced weed map) shows a weed data layer across a field with selectable icons or symbols for images and a scale region 520 shows weed pressure, coverage, or weed density on a scale from 100% to 0%.
- the overview image 550 shows an overview of the field and has a scale region 560 .
- the images that are represented with icons or symbols are captured based on a spatial triggering (e.g., user provides an input prior to or during an application pass to capture an image during the application pass every acre, every 2 acres, every 5 acres, etc.) or grid based triggering as a machine pulls an implement through a field for an application pass.
- the icons or symbols and associated captured images are located approximately equidistant from each other as the implement traverses through the field for an application pass.
- the data layer of the map can also be generated based on capturing images from sensors of an implement, machine, or aerial device
- a grower provides an input prior to or during a spraying operation for a spatial or grid based triggering of image capturing devices or sensors during the spraying operation.
- the image capturing devices or sensors capture at least one image for every location that is triggered spatially or based on a grid as defined by the grower.
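For the per-acre spatial trigger, the travel distance between captures follows directly from the swath width; a quick worked example (the 40 ft swath is an arbitrary illustration):

```python
SQ_FT_PER_ACRE = 43560.0

def feet_between_captures(swath_width_ft, acres_per_image=1.0):
    """Distance a machine travels to cover the triggering area."""
    return acres_per_image * SQ_FT_PER_ACRE / swath_width_ft

# A 40 ft implement covers one acre roughly every 1089 ft of travel,
# so a "one image per acre" trigger fires about that often.
print(feet_between_captures(40.0))  # 1089.0
```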
- FIG. 6 illustrates a monitor or display device having a user interface 601 with a split screen view that includes an enhanced map of a data layer with icons or symbols and an overview image in accordance with another embodiment.
- Processing logic executes instructions of an initiated software application (e.g., field application) of a processing system to generate the user interface 601 that is displayed by the monitor or display device.
- the software application can provide different display regions that are selectable by a user.
- the enhanced map 610 (e.g., an enhanced weed map) shows a weed data layer across a field with selectable icons or symbols for images.
- the overview image 650 shows an overview of the field and has a scale region 660 .
- the images that are represented with icons or symbols are captured based on a threshold triggering (e.g., agricultural parameter exceeds a threshold value for the agricultural parameter, weed density exceeds a weed threshold trigger (e.g., 80%) then capture an image, emergence value exceeds an emergence threshold trigger then capture an image, etc.) as a machine pulls an implement through a field for an application pass.
- the icons or symbols and associated captured images are located at a geographical location whenever the agricultural parameter threshold is triggered as the implement traverses through the field.
- Upon selection of an icon or symbol from the user interface 601 of FIG. 6, the software application displays a user interface 701 of FIG. 7 with map 710 and associated scale region 720, an image 750 for the selected icon or symbol 740, and a scale region 760.
- the image 750 is an actual field image of crops, weeds, and soil conditions for the selected location from the map 610 .
- navigation can occur from full screen to split screen view and then an image option 762 can have a drop down sub-menu to select a different data layer or agricultural parameter for display.
- the image option 762 displays an image 750 that was selected by selecting icon or symbol 740 .
- Upon a pinch zoom input to the image 750, the software application displays a user interface 801 of FIG. 8 with map 710 and zoomed image 850 to show more details of the crops, weeds, and soil conditions at the geographical location for the selected icon 740.
- Upon selection of a different icon 940 from the user interface 901 of FIG. 9, the software application displays an image 950 for the selected icon 940.
- the image 950 is an actual field image of crops, weeds, and soil conditions for the selected location from the map 910 .
- the weed coverage, pressure, or density exceeds a threshold and this triggers capturing the image 950 in real time from an implement or machine during an application pass or from a previous application pass.
- Upon a pinch zoom input to the image 950, the software application displays a zoomed image 951 of FIG. 10 to show more details of the crops, weeds, and soil conditions at the geographical location for the selected icon 940.
- FIG. 11 illustrates a monitor or display device having a user interface 1101 with a split screen view that includes icons or symbols overlaid on an overview image 1110 of a field and also the image 951 for a selected icon 940 in accordance with another embodiment.
- Processing logic executes instructions of an initiated software application (e.g., field application) of a processing system to generate the user interface 1101 that is displayed by the monitor or display device.
- the software application can provide different display regions that are selectable by a user.
- the selectable icons or symbols represent captured images and a scale region 1120 shows weed pressure, coverage or weed density on a scale from 100% to 0%.
- Selection of the icon 940 causes an image 951 to be displayed.
- the icons and associated captured images are located at geographical locations whenever the icons are spatially triggered as the implement traverses through the field.
- FIG. 12 illustrates a monitor or display device having a user interface 1201 with a split screen view that includes maps of different data layers in accordance with one embodiment.
- Processing logic executes instructions of an initiated software application (e.g., field application) of a processing system to generate the user interface 1201 that is displayed by the monitor or display device.
- the software application can provide different display regions that are selectable by a user.
- the map 1210 is, e.g., a commanded planting population map from a planter or a planted population map based on data from a seed sensor.
- the map 1250 is, e.g., an actual emerged population map based on data from a sensor after plants emerge from the soil.
- the user interface 1201 includes an orientation option 1280 to rotate an orientation of the images of the user interface with respect to a true North direction, a plus/minus zoom option 1281 , a pinch to zoom option 1282 , an expand option 1283 to control sizing of a displayed map in a field region, an icon option 1284 to enable or disable showing icons on the map 1210 , a full map option 1285 to switch between different viewing options (e.g., a full screen view of map 1210 , a split screen view having both map 1210 and map 1250 , a split screen view having an image with no map, etc.) and a statistics option 1286 to show statistics (e.g., bar charts, numerical data, histograms, number of acres of a field having emerged plant population below a threshold) for the data layer or the actual emerged population data.
- FIG. 13 illustrates a monitor or display device having a user interface 1301 with a split screen view that includes maps of different data layers in accordance with one embodiment.
- Processing logic executes instructions of an initiated software application (e.g., field application) of a processing system to generate the user interface 1301 that is displayed by the monitor or display device.
- the software application can provide different display regions that are selectable by a user.
- the map 1310 is, e.g., a commanded planting population map from a planter or a planted population map based on data from a seed sensor.
- the map 1350 (e.g., an emerged population deviation map based on data from sensors after plants emerge from the soil) shows an emerged population deviation data layer across a field, and a scale region 1360 shows emerged population deviation in units of 1,000 with respect to a target or the planted population.
- the scale regions 1320 and 1360 can show percentages for the planted population and the emerged population deviation, respectively. In one example, a 0% emerged population deviation indicates no difference between the planted population and the emerged population, and a 100% emerged population deviation indicates that no plants emerged.
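The endpoints described above imply a simple formula for the percentage form of emerged population deviation; a small sketch (the function name is ours):

```python
def emerged_population_deviation_pct(planted, emerged):
    """0% when every planted seed emerged; 100% when no plants emerged."""
    return 100.0 * (planted - emerged) / planted

print(emerged_population_deviation_pct(34000, 34000))  # 0.0
print(emerged_population_deviation_pct(34000, 0))      # 100.0
```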
- FIG. 14 illustrates a monitor or display device having a user interface 1401 with a split screen view that includes an enhanced map of a data layer with icons and an overview image in accordance with one embodiment.
- Processing logic executes instructions of an initiated software application (e.g., field application) of a processing system to generate the user interface 1401 that is displayed by the monitor or display device.
- the software application can provide different display regions that are selectable by a user.
- the enhanced map 1410 (e.g., an enhanced actual emergence population map) shows an emergence population data layer across a field with selectable icons or symbols for images.
- the overview image 1450 shows an overview of the field and has a scale region 1460 .
- the images that are represented with icons are captured based on a spatial triggering (e.g., user provides an input prior to or during an application pass to capture an image during the application pass every acre, every 2 acres, every 5 acres, etc.) or threshold triggering (e.g., actual emergence population is below, equal to, or exceeds an actual emergence population threshold) as a machine pulls an implement through a field for an application pass.
- the icons or symbols (e.g., icon 1412 for spatial triggering, icon 1414 for threshold triggering) and associated captured images are located approximately equidistant from each other for spatial triggering and can be triggered more closely spaced or further apart from each other for threshold triggering as the implement traverses the field for an application pass.
- the data layer of the map can also be generated based on capturing images from sensors of an implement, machine, or aerial device.
- a grower provides an input prior to or during a spraying operation for a spatial or grid based triggering of image capturing devices or sensors during the spraying operation.
- the image capturing devices or sensors capture at least one image for every location that is triggered spatially or based on a grid as defined by the grower.
- Upon selection of the icon 1414, the user interface 1501 is generated as illustrated in FIG. 15.
- the user interface 1501 includes the emergence population map 1410 and the image 1550 with the image being captured at a location of the icon 1414 .
- FIG. 16 illustrates a monitor or display device having a user interface 1601 with a split screen view that includes maps of different data layers in accordance with one embodiment.
- Processing logic executes instructions of an initiated software application (e.g., field application) of a processing system to generate the user interface 1601 that is displayed by the monitor or display device.
- the software application can provide different display regions that are selectable by a user.
- the map 1610 is, e.g., a commanded planting population map from a planter or a planted population map based on data from a seed sensor.
- the map 1650 (e.g., an actual relative emergence uniformity map based on data from sensors after plants emerge from the soil) shows a relative emergence uniformity data layer across a field.
- In one example, the 1.87 and greater stage is the target growth stage, the 0.38-1.87 stage is one growth stage late in emergence, and the 0.38 and lower stage is two growth stages late in emergence.
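The three scale bands in this example map directly to a small classifier. The 1.87 and 0.38 cutoffs come from the text; everything else in this sketch is illustrative.

```python
def emergence_stage_category(stage_value):
    """Bucket a relative-emergence value using the example scale above."""
    if stage_value >= 1.87:
        return "target growth stage"
    if stage_value > 0.38:
        return "one growth stage late in emergence"
    return "two growth stages late in emergence"

print(emergence_stage_category(2.1))   # target growth stage
print(emergence_stage_category(1.0))   # one growth stage late in emergence
print(emergence_stage_category(0.2))   # two growth stages late in emergence
```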
- the scale regions 1620 and 1660 can show percentages for the planted population and the actual relative emergence uniformity, respectively.
- a 0% actual relative emergence uniformity indicates low uniformity, and a 100% actual relative emergence uniformity indicates a target uniformity for actual relative emergence uniformity.
- Various plant phenotype characteristics can be shown with a map or a uniformity map such as growth stage, biomass, plant height, size, and stalk size.
- FIG. 17 illustrates a monitor or display device having a user interface 1701 with a split screen view that includes an enhanced map of a data layer with icons and an overview image in accordance with one embodiment.
- Processing logic executes instructions of an initiated software application (e.g., field application) of a processing system to generate the user interface 1701 that is displayed by the monitor or display device.
- the software application can provide different display regions that are selectable by a user.
- Upon selection of an icon 1725 of the enhanced map 1710 (e.g., an enhanced actual relative emergence uniformity map), the image 1750 is generated to show plant, weed, and soil conditions at a location of the icon 1725.
- the image 1750 shows a target relative emergence uniformity for the plants in this image.
- the images that are represented with icons or symbols are captured based on a spatial triggering (e.g., user provides an input prior to or during an application pass to capture an image during the application pass every acre, every 2 acres, every 5 acres, etc.) or threshold triggering (e.g., actual relative emergence uniformity compares in a predetermined manner (e.g., is below, equal to, or exceeds) an actual relative emergence uniformity threshold) as a machine pulls an implement through a field for an application pass.
- the icons and associated captured images are located approximately equidistant from each other for spatial triggering and can be triggered more closely spaced or further apart from each other for threshold triggering as the implement traverses the field for an application pass.
- the data layer of the map can also be generated based on capturing images from sensors of an implement, machine, or aerial device.
- If an icon 1735 is selected from the map 1710, an image 1850 of user interface 1801 of FIG. 18 is displayed.
- the image 1850 is generated to show plant, weed, and soil conditions at a location of the icon 1735 .
- the image 1850 shows a below target relative emergence uniformity for the plants in this image.
- the scale region 1720 indicates a relative emergence uniformity.
- In one example, the 1.87 and greater stage is the target growth stage, the 0.38-1.87 stage is one growth stage late in emergence, and the 0.38 and lower stage is two growth stages late in emergence.
- FIG. 19 shows an example of a system 2700 that includes a machine 2702 (e.g., tractor, combine harvester, etc.) and an implement 2740 (e.g., planter, cultivator, plough, sprayer, spreader, irrigation implement, etc.) in accordance with one embodiment.
- the machine 2702 includes a processing system 2720, memory 705, machine network 2710 (e.g., a controller area network (CAN) serial bus protocol network, an ISOBUS network, etc.), and a network interface 715 for communicating with other systems or devices including the implement 2740.
- the machine network 2710 includes sensors 712 (e.g., speed sensors), controllers 711 (e.g., GPS receiver, radar unit) for controlling and monitoring operations of the machine, and an optional image capture device 714 for capturing images of crops and soil conditions of a field in accordance with embodiments of the present disclosure.
- the network interface 715 can include at least one of a GPS transceiver, a WLAN transceiver (e.g., WiFi), an infrared transceiver, a Bluetooth transceiver, Ethernet, or other interfaces for communications with other devices and systems including the implement 2740.
- the network interface 715 may be integrated with the machine network 2710 or separate from the machine network 2710 as illustrated in FIG. 19 .
- the I/O ports 729 include, e.g., a diagnostic/on-board diagnostic (OBD) port.
- the machine performs operations of a combine (combine harvester) for harvesting grain crops.
- the machine combines reaping, threshing, and winnowing operations in a single harvesting operation.
- An optional header 780 (e.g., grain platform, flex platform) includes an orientation device 782 or mechanism for orienting a crop (e.g., corn, soybeans) for improving image capture with an image capture device 784.
- the processing system 2720 may include one or more microprocessors, processors, a system on a chip (integrated circuit), or one or more microcontrollers.
- the processing system includes processing logic 726 for executing software instructions of one or more programs and a communication unit 728 (e.g., transmitter, transceiver) for transmitting and receiving communications from the machine via machine network 2710 or network interface 715 or implement via implement network 2750 or network interface 2760 .
- the communication unit 728 may be integrated with the processing system or separate from the processing system. In one embodiment, the communication unit 728 is in data communication with the machine network 2710 and implement network 2750 via a diagnostic/OBD port of the I/O ports 729 .
- Processing logic 726 including one or more processors may process the communications received from the communication unit 728 including agricultural data.
- the system 2700 includes memory 705 for storing data and programs for execution (software 706) by the processing system.
- the memory 705 can store, for example, software components such as image capture software, software for customizing scale and corresponding field views of agricultural fields with expand and panning operations for performing operations or methods of the present disclosure, or any other software application or module, images (e.g., captured images of crops), alerts, maps, etc.
- the memory 705 can be any known form of a machine readable non-transitory storage medium, such as semiconductor memory (e.g., flash, SRAM, DRAM, etc.) or non-volatile memory, such as hard disks or solid-state drives.
- the system can also include an audio input/output subsystem (not shown) which may include a microphone and a speaker for, for example, receiving and sending voice commands or for user authentication or authorization (e.g., biometrics).
- the processing system 2720 communicates bi-directionally with memory 705 , machine network 2710 , network interface 715 , header 780 , display device 2730 , display device 725 , and I/O ports 729 via communication links 731 - 737 , respectively.
- Display devices 725 and 2730 can provide visual user interfaces for a user or operator.
- the display devices may include display controllers.
- the display device 725 (or computing device 725) is a portable tablet device or computing device with a touchscreen that displays images (e.g., captured images, localized view map layer, high definition field maps of as-planted or as-harvested data or other agricultural variables or parameters, yield maps, alerts, etc.) and data generated by an agricultural data analysis software application or field view software application. The display device 725 also receives input (e.g., expand (positive expansion, negative expansion or contraction), panning) from the user or operator for a customized scale region and corresponding view of a region of a field, for monitoring and controlling field operations, or for any operations or methods of the present disclosure.
- the processing system 2720 and memory 705 can be integrated with the computing device 725 or separate from the computing device.
- the operations may include configuration of the machine or implement, reporting of data, control of the machine or implement including sensors and controllers, and storage of the data generated.
- the display device 2730 may be a display (e.g., display provided by an original equipment manufacturer (OEM)) that displays images and data for a localized view map layer, as-planted or as-harvested data, yield data, controlling a machine (e.g., planter, tractor, combine, sprayer, etc.), steering the machine, and monitoring the machine or an implement (e.g., planter, combine, sprayer, etc.) that is connected to the machine with sensors and controllers located on the machine or implement.
- a cab control module 770 may include an additional control module for enabling or disabling certain components or devices of the machine or implement. For example, if the user or operator is not able to control the machine or implement using one or more of the display devices, then the cab control module may include switches to shut down or turn off components or devices of the machine or implement.
- the implement 2740 (e.g., planter, cultivator, plough, sprayer, spreader, irrigation implement, etc.) includes an implement network 2750 (e.g., a controller area network (CAN) serial bus protocol network, an ISOBUS network, etc.), a processing system 2762, a network interface 2760, and optional input/output ports 766 for communicating with other systems or devices including the machine 702.
- the implement network 2750 includes an image capture device 756 for capturing images of crop development and soil conditions, sensors 752 (e.g., speed sensors, seed sensors for detecting passage of seed, downforce sensors, actuator valves, OEM sensors, etc.), controllers 754 (e.g., GPS receiver), and the processing system 2762 for controlling and monitoring operations of the machine.
- the OEM sensors may be moisture sensors or flow sensors for a combine, speed sensors for the machine, seed force sensors for a planter, liquid application sensors for a sprayer, or vacuum, lift, lower sensors for an implement.
- the controllers may include processors in communication with a plurality of seed sensors.
- the processors are configured to process images captured by image capture device 756 or seed sensor data and transmit processed data to the processing system 2762 or 2720 .
- the controllers and sensors may be used for monitoring motors and drives on a planter including a variable rate drive system for changing plant populations.
- the controllers and sensors may also provide swath control to shut off individual rows or sections of the planter.
- the sensors and controllers may sense changes in an electric motor that controls each row of a planter individually. These sensors and controllers may sense seed delivery speeds in a seed tube for each row of a planter.
- the network interface 2760 can be a GPS transceiver, a WLAN transceiver (e.g., WiFi), an infrared transceiver, a Bluetooth transceiver, Ethernet, or other interfaces for communicating with other devices and systems including the machine 702.
- the network interface 2760 may be integrated with the implement network 2750 or separate from the implement network 2750 as illustrated in FIG. 19 .
- the processing system 2762 communicates bi-directionally with the implement network 2750 , network interface 2760 , and I/O ports 766 via communication links 741 - 743 , respectively.
- the implement communicates with the machine via wired and possibly also wireless bi-directional communications 704 .
- the implement network 2750 may communicate directly with the machine network 2710 or via the network interfaces 715 and 2760 .
- the implement may also be physically coupled to the machine for agricultural operations (e.g., planting, harvesting, spraying, etc.).
- the memory 705 may be a machine-accessible non-transitory medium on which is stored one or more sets of instructions (e.g., software 706 ) embodying any one or more of the methodologies or functions described herein.
- the software 706 may also reside, completely or at least partially, within the memory 705 and/or within the processing system 2720 during execution thereof by the system 700 , the memory and the processing system also constituting machine-accessible storage media.
- the software 706 may further be transmitted or received over a network via the network interface device 715 .
- the term “machine-accessible non-transitory medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions.
- machine-accessible non-transitory medium shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure.
- machine-accessible non-transitory medium shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic media, and carrier wave signals.
- Prior approaches for stand count determine a number of planted seeds and a number of growing plants per unit area. An expected result based on the number of planted seeds is compared to the number of growing plants to calculate a percentage. Stand count is used to evaluate seed quality (germination rate) and whether replanting is needed or not.
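- For example (illustrative numbers only), if 34,000 seeds per acre were planted and 32,300 growing plants are counted, the stand count is 32,300 / 34,000 ≈ 95%, which a grower might compare against the seed lot's rated germination rate when deciding whether to replant.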
- Described herein are systems and methods for using sensors of agricultural implements or machines to capture images of crop emergence during different crop stages, determine a uniformity of the crop emergence, and quantify the uniformity of crop emergence.
- FIGS. 20 A and 20 B illustrate a flow diagram of one embodiment for a computer implemented method of determining and quantifying crop emergence uniformity within an agricultural field.
- the method 2000 is performed by processing logic that may comprise hardware (circuitry, dedicated logic, etc.), software (such as is run on a general purpose computer system or a dedicated machine or a device), or a combination of both.
- the computer implemented method 2000 is performed by processing logic of a processing system of a system 102 , machine, apparatus, implement, agricultural vehicle, aerial device, monitor, display device, user device, self-guided device, or self-propelled device (e.g., robot, ATV, UTV, etc.).
- the processing system executes instructions of a software application or program with processing logic.
- the software application or program can be initiated by the processing system.
- a monitor or display device receives user input and provides a customized display for operations of the method 2000 .
- a software application is initiated on the processing system and displayed on a monitor or display device as a user interface.
- the processing system may be integrated with or coupled to a machine that performs an application pass (e.g., planting, tillage, fertilization, spraying, etc.).
- the processing system may be integrated with an apparatus (e.g., drone, image capture device) associated with the machine that captures images before, during, or after the application pass.
- the user interface includes a map of a data layer (e.g., seed data, commanded planter seed population, actual seed population determined from a seed sensor, a seed population deviation, singulation data, weed map, emergence data, emergence map, emergence environment score based on a combination of temperature and moisture correlated to how long a seed takes to germinate, emergence environment score based on a percentage of seeds planted that will germinate within a selected number of days, time to germination, time to emergence, seed germination risk) for a field of interest and an overview image of the field of interest.
- Seed germination risk can be germination/emergence (no germination/emergence, on time germination/emergence, or late germination/emergence) or factors other than time, such as deformities, damaged seed, reduced vigor, or disease. Seed germination risk can be high, medium, or low, or it can be on-time emergence, late emergence, or no emergence.
- the data layer can be generated from data collected by sensors on an implement, a machine pulling the implement during a current application pass, an aerial device, a user device, a self-guided device, a self-propelled device, etc., or the data layer can be generated from a previous application pass through the field.
- the sensors may be in-situ sensors positioned on each row unit of an implement, spaced across several row units, or positioned on a machine.
- the computer implemented method includes obtaining one or more images of biomass data for a region of interest of a field from one or more sensors of an agricultural implement, which can be traversing the field to obtain the biomass data for various crop stages or for an application pass.
- the sensors can be located on a machine, an agricultural vehicle, an aerial device, a drone, a self-propelled device (e.g., robot, off-road vehicle, ATV, UTV), to collect agricultural data before, during, or after an application pass.
- the computer implemented method partitions a captured image into tiles. The tiles (e.g., an n × m array of tiles) cover the entire image, and additional adjacent tiles (e.g., left center, right center) overlap the array so that regions near tile boundaries are covered by more than one tile.
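- As an illustrative sketch only (the tile size, overlap layout, and function name are assumptions, not the patent's implementation), the tiling step can be expressed with simple array slicing:

```python
import numpy as np

def partition_into_tiles(image, tile_h, tile_w):
    """Split an image into a base grid of tiles plus extra tiles offset by
    half a tile width, so pixels near vertical tile boundaries fall inside
    more than one tile. Edge remainders are ignored in this sketch."""
    tiles = []
    h, w = image.shape[:2]
    # Base n x m grid of non-overlapping tiles covering the image.
    for y in range(0, h - tile_h + 1, tile_h):
        for x in range(0, w - tile_w + 1, tile_w):
            tiles.append(((y, x), image[y:y + tile_h, x:x + tile_w]))
    # Additional adjacent tiles (e.g., left-center, right-center) that
    # overlap the base grid horizontally.
    for y in range(0, h - tile_h + 1, tile_h):
        for x in range(tile_w // 2, w - tile_w + 1, tile_w):
            tiles.append(((y, x), image[y:y + tile_h, x:x + tile_w]))
    return tiles

# Example: tiles = partition_into_tiles(np.zeros((480, 640)), 160, 160)
```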
- the computer implemented method provides the tiles as input to a deep learning model (DLM) to differentiate pixels of the tiles between a targeted type of vegetation (e.g., a crop, corn, soybean, wheat, etc.), a background, or other vegetation.
- the tile can correspond to one or more images that are provided to the DLM.
- a single high resolution image or a resized lower resolution image can be provided in alternative embodiments.
- the computer implemented method receives output from the DLM in terms of modeled tiles with predicted pixel values (e.g., 1 for targeted vegetation, 0 for weeds or other non-targeted vegetation) and reassembles the modeled tiles on a per tile basis to display the targeted type of vegetation in dimensionality of the original one or more images.
- a sample predicted output from the DLM is illustrated in diagram 2200 of FIG. 22 .
- the rows 2202 , 2204 , 2206 , 2208 of crops represent predicted pixel values (e.g., 1 for targeted vegetation, 0 for weeds or other non-targeted vegetation) for targeted vegetation.
- the computer implemented method resolves conflicts (e.g., ties or disagreements) for pixel classification from overlapping tiles with voting.
- a majority vote determines a truth (e.g., 1 for targeted vegetation, 0 for weeds or other non-targeted vegetation) for pixel classification.
- alternatively, a logical “OR” operation can be applied to the overlapping tiles, whether they are odd or even in number, such that if any overlapping tile identifies a targeted type of vegetation then the region is classified with the targeted type of vegetation.
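- The reassembly and conflict-resolution steps can be combined by accumulating per-pixel votes across tiles. A minimal sketch, assuming binary tile predictions and the majority-vote and logical-OR options described above (names and data layout are illustrative):

```python
import numpy as np

def reassemble_with_voting(modeled_tiles, image_shape, use_or=False):
    """modeled_tiles: list of ((y, x), tile) pairs where each tile holds
    predicted pixel values (1 = targeted vegetation, 0 = other)."""
    votes = np.zeros(image_shape, dtype=np.int32)   # votes for "targeted"
    counts = np.zeros(image_shape, dtype=np.int32)  # tiles covering pixel
    for (y, x), tile in modeled_tiles:
        th, tw = tile.shape
        votes[y:y + th, x:x + tw] += tile
        counts[y:y + th, x:x + tw] += 1
    if use_or:
        # Logical OR: any overlapping tile flagging vegetation wins.
        return (votes > 0).astype(np.uint8)
    # Majority vote: more than half of the covering tiles must agree.
    return (votes * 2 > counts).astype(np.uint8)
```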
- the computer implemented method applies a predetermined mask (e.g., binary mask, mask 2100 of FIG. 21 ) to select portions of the one or more images that correspond with the targeted type of vegetation pixels.
- Regions of interest (e.g., region 1 is the first row of crop, region 2 is the second row of the crop, etc.) align with the crop rows to prescribe the portions of the image that correspond with the targeted vegetation.
- the selected portions can be inferred via the presence and orientation of specific types of vegetation that are detected via images.
- For the selected portions of the one or more images, the method accumulates the targeted type of vegetation pixels to create one or more rows of crops (e.g., vertical lines) corresponding to vegetation intensity at operation 2016.
- detected vegetation pixels that represent biomass are accumulated horizontally to create the one or more rows of crops corresponding to vegetation biomass intensity.
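- Within each masked crop-row region, the classified vegetation pixels can be accumulated across the image width to form a one-dimensional intensity profile along the row. A minimal sketch, assuming a binary prediction image and a binary mask for one region (both NumPy arrays; names are illustrative):

```python
import numpy as np

def row_intensity_profile(pred, region_mask):
    """pred: HxW binary image (1 = targeted vegetation).
    region_mask: HxW binary mask selecting one crop-row region.
    Returns a 1-D profile: vegetation pixels accumulated horizontally
    for each image row, i.e., intensity along the (vertical) crop row."""
    return (pred * region_mask).sum(axis=1)
```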
- the computer implemented method applies a filter (e.g., one-dimensional filter) with a length corresponding to the spacing in pixels between individual plants of the targeted type of plants along a row of plant intensity (e.g., vertical line of plant intensity) to determine a simple moving average or a weighted average of vegetation intensity of the targeted type of plants.
- a filter with a length corresponding to the spacing in pixels between individual plants is convolved along the row of plant intensity.
- the filter can be uniform to represent a simple moving average, or weighted to produce a weighted average of vegetation along the row (or vertical line) of vegetation intensity.
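- A minimal sketch of this filtering step, assuming NumPy; the filter length would be chosen from the expected plant spacing in pixels, and the uniform-versus-weighted kernel choice mirrors the two averages described above:

```python
import numpy as np

def smoothed_intensity(profile, plant_spacing_px, weights=None):
    """Convolve a 1-D vegetation-intensity profile with a filter whose
    length matches the plant spacing in pixels. A uniform kernel gives a
    simple moving average; a non-uniform kernel gives a weighted average."""
    if weights is None:
        kernel = np.ones(plant_spacing_px) / plant_spacing_px  # uniform
    else:
        kernel = np.asarray(weights, dtype=float)
        kernel /= kernel.sum()  # normalize so the average is unbiased
    return np.convolve(profile, kernel, mode="same")
```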
- additional weights 2402 (e.g., weighted value less than 1), 2404 (e.g., weighted value equal to 1 near center of image), and 2406 (e.g., weighted value greater than 1), as illustrated in the weighted value diagram 2400 of FIG. 24, can be applied to the entire one or more rows of crops in the event that the image plane (e.g., forward looking image plane with upward tilt) of a sensor (e.g., camera) is not coplanar with the ground surface.
- the sensor can have a tilt from 0 degrees (coplanar with ground plane so no weighting is needed) to 45 degrees.
- a camera that is not coplanar with a ground surface can have a perspective warp applied to the captured image to match a coplanar perspective with the ground surface, thus compensating for the camera not being coplanar with the ground surface.
- a forward-looking image plane with upward tilt allows the sensors to capture images of a region of plants prior to an implement reaching the region of the plants and thus allows time for the implement to adjust parameters of an agricultural application if necessary prior to reaching the region of the plants.
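- One possible weighting scheme consistent with FIG. 24 is a ramp across the image rows; the direction and magnitude of the ramp below are assumptions for illustration, not values from the disclosure:

```python
import numpy as np

def tilt_compensation_weights(num_rows, max_gain=1.5):
    """Row-dependent weights for a forward-looking, upward-tilted camera:
    regions imaged farther away cover more ground per pixel, so their
    accumulated intensity is scaled up (weight > 1), nearby regions are
    scaled down (weight < 1), and the weight is ~1 near the image center.
    The linear ramp and max_gain value are illustrative assumptions."""
    return np.linspace(2.0 - max_gain, max_gain, num_rows)

# Applied to a per-image-row intensity profile:
# adjusted = profile * tilt_compensation_weights(len(profile))
```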
- the adjusted vegetation intensity along the one or more rows of crops can be thresholded for a minimum vegetation intensity, revealing portions of a crop row with little or no presence of the desired vegetation.
- the adjusted vegetation intensity along the one or more rows of crops can be thresholded for a maximum vegetation intensity, revealing portions of a crop row with too much of the desired vegetation.
- the computer implemented method determines a targeted plant uniformity (e.g., emergence score) based on the simple moving average or a weighted average of the targeted type of plants and the thresholding for minimum and maximum vegetation intensity.
- the portion of the crop row (or vertical line) meeting both upper and lower thresholding criteria can represent an emergence score between 0 and 1 or between 0 and 100%.
- the emergence score indicates a distribution of the targeted vegetation over the region of interest.
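- The score itself reduces to the fraction of positions along the row that fall between the two thresholds. A minimal sketch, with the thresholds passed in as assumed parameters:

```python
import numpy as np

def emergence_score(adjusted_intensity, lower, upper):
    """Return the fraction (0..1) of positions along the crop row whose
    adjusted vegetation intensity is above the lower threshold (enough
    biomass) and below the upper threshold (not too much biomass)."""
    adjusted_intensity = np.asarray(adjusted_intensity, dtype=float)
    within = (adjusted_intensity > lower) & (adjusted_intensity < upper)
    return float(within.mean())  # multiply by 100 for a percentage
```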
- FIG. 21 illustrates an example of a predetermined mask to select portions of an image in accordance with one embodiment.
- the predetermined mask 2100 shows a number of pixels on the x axis and y axis and includes regions of interest 2102 , 2104 , 2106 , 2108 (e.g., region 2102 is first row of crop, region 2104 is second row of the crop, etc.) that align with crop rows to prescribe portions of the image that correspond with a row of a targeted vegetation or crop.
- the regions can be inferred based on orientation of vegetation from the model output.
- FIG. 23 illustrates an example of a moving average of vegetation intensity or a weighted average of vegetation intensity along a row in accordance with one embodiment.
- the moving or weighted average of vegetation intensity 2302 is determined based on operation 2018 of FIG. 20 .
- the diagram 2300 shows the moving or weighted average on a y axis and a pixel position of adjusted vegetation intensity from a bottom to a top of an image on an x axis.
- the moving average of vegetation intensity or a weighted average of vegetation intensity 2302 along a row is determined moving from a top to a bottom of an image after convolution with a one-dimensional filter.
- Upper threshold 2308 and lower threshold 2306 are shown as horizontal lines representing areas of the image with too much biomass if above the threshold 2308 or too little biomass if below threshold 2306 .
- An emergence score (e.g., 0 to 100%) is determined based on a portion of the biomass that is greater than the lower threshold 2306 and less than the upper threshold 2308 .
- the emergence score can be determined based on a percent of time that the moving average of vegetation intensity is greater than the lower threshold 2306 and less than the upper threshold 2308 .
- a computer implemented method for customizing field views of data displays comprises obtaining a data layer for an agricultural parameter from sensors of an agricultural implement or machine during an application pass for a field and generating a user interface with an enhanced map that includes the data layer for the agricultural parameter and selectable icons overlaid at different geographic locations on the enhanced map for the field.
- a computing device comprises a display device for displaying a user interface having a scale region and a field region for an agricultural parameter; and a processor coupled to the display device.
- the processor is configured to generate a data layer for the agricultural parameter from sensors of an agricultural implement or machine that collect the data during an application pass for a field and to generate the user interface with an enhanced map that includes the data layer for the agricultural parameter and selectable icons overlaid at different geographic locations on the enhanced map for the field.
- A computer implemented method for customizing field views of a field region of data displays comprises obtaining a data layer for an agricultural parameter from sensors of an agricultural implement or machine that collects data during an application pass for a field, generating selectable icons, and overlaying the selectable icons at different geographic locations on an enhanced map of the data layer for the field based on a spatial trigger or a threshold trigger for the agricultural parameter.
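- As a hedged illustration of the threshold-trigger variant (the data layout, function name, and icon record format are assumptions), icon generation can be sketched as a comparison pass over georeferenced samples:

```python
def generate_icons(samples, threshold):
    """samples: iterable of (lat, lon, value) for an agricultural
    parameter (e.g., weed density or an emergence value). Returns icon
    records to overlay on the enhanced map wherever the value exceeds
    the threshold trigger for that parameter."""
    icons = []
    for lat, lon, value in samples:
        if value > threshold:
            icons.append({"lat": lat, "lon": lon,
                          "type": "camera", "value": value})
    return icons
```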
- Example 1 A computer implemented method for customizing field views of a display device comprising: obtaining a data layer for an agricultural parameter from sensors of an agricultural implement during an application pass for a field; and generating a user interface with an enhanced map that includes the data layer for the agricultural parameter and selectable icons overlaid at different geographic locations on the enhanced map for the field.
- Example 2 the computer implemented method of Example 1, wherein the user interface further comprises a split screen view with the enhanced map on a first side of the split screen view and an overview image of the field on a second side of the split screen view.
- Example 3 the computer implemented method of any preceding Example, wherein the agricultural implement comprises a planter, sprayer, or irrigation implement having row units with each row unit having a sensor for capturing images to obtain the data layer.
- Example 4 the computer implemented method of Example 3, further comprising: displaying the user interface with the enhanced map on the display device; receiving a user input to select an icon of the enhanced map; and generating an updated user interface with the enhanced map and an image that is associated with the selected icon changing color on the enhanced map.
- Example 5 the computer implemented method of Example 4, wherein the image is displayed as a pop up window or over an overview image of the field.
- Example 6 the computer implemented method of Example 5, wherein the enhanced map provides an ability to select icons throughout the field to show actual captured images of crops, weeds, and conditions of soil of the field.
- Example 7 the computer implemented method of any of Examples 1 to 6, wherein the selectable icons are generated and overlaid at different geographic locations on the enhanced map for the field based on a spatial trigger within the field, a threshold trigger for when an agricultural parameter exceeds a threshold for the agricultural parameter, a time based trigger for capturing images, or a burst capture of images.
- Example 8 the computer implemented method of any of Examples 1 to 6, wherein the selectable icons are generated and overlaid at different geographic locations on the enhanced map for the field based on a threshold trigger including a weed density exceeding a threshold trigger for weed density or an emergence value exceeding a threshold trigger for emergence data.
- Example 9 the computer implemented method of any preceding Example, wherein the agricultural parameter comprises one or more of seed data, commanded planter seed population, actual seed population determined from a seed sensor, a seed population deviation, singulation data, weed map, emergence data, emergence map, emergence environment score based on a combination of temperature and moisture correlated to how long a seed takes to germinate, emergence environment score based on a percentage of seeds planted that will germinate within a selected number of days, time to germination, time to emergence, and seed germination risk.
- Example 10 A computing device comprising: a display device for displaying a user interface having a scale region and a field region for an agricultural parameter; and a processor coupled to the display device, the processor is configured to generate a data layer for the agricultural parameter from sensors of an agricultural implement that collects the data during an application pass for a field and to generate the user interface with an enhanced map that includes the data layer for the agricultural parameter and selectable icons or symbols overlaid at different geographic locations on the enhanced map for the field.
- Example 11 the computing device of Example 10, wherein the user interface further comprises a split screen view with the enhanced map on a first side of the split screen view and an overview image of the field on a second side of the split screen view.
- Example 12 the computing device of Example 10 or 11, wherein the agricultural implement comprises a planter, sprayer, or irrigation implement having row units with each row unit having a sensor for capturing images to obtain the data layer.
- Example 13 the computing device of Example 12, wherein the display device to display the user interface with the enhanced map and to receive a user input to select an icon of the enhanced map, wherein the processor is configured to generate an updated user interface with the enhanced map and an image that is associated with a selected icon or symbol based on the user input with the selected icon or symbol changing color.
- Example 14 the computing device of Example 13, wherein the updated user interface to provide a selectable orientation option to rotate an orientation of the images of the user interface, a selectable expand option to control sizing of a displayed map in a field region, a selectable icon or symbol option to enable or disable showing icons or symbols on the enhanced map, a selectable full map option to switch between a full screen view of map versus a split screen view having both of a map and an overview image, and a selectable statistics option to show statistics for the data layer.
- Example 15 the computing device of any of Examples 10 to 14, wherein the display device to receive a user input to modify the scale region and to display a modified scale region and a corresponding modified field region.
- Example 16 A computer implemented method for customizing field views of a field region comprising: obtaining a data layer for an agricultural parameter from sensors of an agricultural implement that collects data during an application pass for a field; and generating selectable icons and overlaying the selectable icons at different geographic locations on an enhanced map of the data layer for the field based on a spatial trigger or a threshold trigger for the agricultural parameter.
- Example 17 the computer implemented method of Example 16, further comprising: comparing the agricultural parameter to the threshold trigger; determining whether the agricultural parameter exceeds the threshold trigger for a location within the field; and generating a selectable icon when the agricultural parameter exceeds the threshold trigger for the location within the field.
- Example 18 the computer implemented method of Example 17, wherein the threshold trigger comprises a weed threshold that is compared to a weed density.
- Example 19 the computer implemented method of Example 17, wherein the threshold trigger comprises an emergence threshold that is compared to an emergence value for plant emergence data.
- Example 20 the computer implemented method of any of Examples 16 to 19, further comprising: displaying a user interface with the enhanced map that includes the data layer for the agricultural parameter and the selectable icons overlaid at different geographic locations on the enhanced map for the field.
- Example 21 the computer implemented method of any of Examples 16 to 19, wherein the agricultural implement comprises a planter, sprayer, or irrigation implement having row units with each row unit having a sensor for capturing images to obtain the data layer.
- Example 22 A computer implemented method for measuring and quantifying crop emergence uniformity within an agricultural field comprising: partitioning a captured image into tiles; providing the tiles to a deep learning model to provide modeled tiles with predicted pixel values (e.g., 1 for targeted vegetation, 0 for weeds or other non-targeted vegetation); and reassembling the modeled tiles on a per tile basis to display the targeted type of vegetation in dimensionality of the original one or more images.
- Example 23 the computer implemented method of Example 22, further comprising: applying a predetermined mask to select portions of the one or more images that correspond with the targeted type of vegetation pixels.
- Example 24 the computer implemented method of any of Examples 22-23, further comprising: accumulating the targeted type of vegetation pixels to create one or more rows of crops (e.g., vertical lines) corresponding to vegetation intensity.
- Example 25 the computer implemented method of any of Examples 22-24, further comprising: applying a filter (e.g., one-dimensional filter) with a length corresponding to spacing in pixels between individual plants of the targeted type of plants along a row of plant intensity (e.g., vertical line of plant intensity) to determine a simple moving average or a weighted average of vegetation intensity for the targeted type of plants.
- Example 26 the computer implemented method of any of Examples 22-25, further comprising: applying upper and lower thresholds to the simple moving average or a weighted average of vegetation intensity along the one or more rows of crops (one or more vertical lines) and determining a targeted plant uniformity based on the simple moving average or a weighted average of vegetation intensity of the targeted type of plants and the thresholding for lower (minimum) and upper (maximum) vegetation intensity.
- the portion of the crop row (or vertical line) meeting both thresholding criteria can represent an emergence score between 0 and 100%.
Abstract
Description
- This application claims priority to U.S. Provisional Application Nos. 63/197,634, filed 7 Jun. 2021, and 63/269,693, filed 21 Mar. 2022, the disclosures of which are incorporated herein by reference in their entireties.
- Embodiments of the present disclosure relate generally to systems and methods for providing field views including enhanced agricultural maps having a data layer and image data.
- Planters are used for planting seeds of crops (e.g., corn, soybeans) in a field. Some planters include a display monitor within a cab for displaying a coverage map that shows regions of the field that have been planted. The coverage map of the planter is generated based on planting data collected by the planter. A farmer or grower will interpret the coverage map during the planting to attempt to understand field conditions.
- The present disclosure is illustrated by way of example, and not by way of limitation, in the FIGs. of the accompanying drawings and in which:
- FIG. 1 shows an example of a system for collecting data of agricultural fields and performing analysis of the data of agricultural fields in accordance with one embodiment;
- FIG. 2 illustrates an architecture of an implement 200 for delivering applications (e.g., fluid applications, fluid mixture applications) to agricultural fields in accordance with one embodiment;
- FIG. 3 illustrates a flow diagram of one embodiment for a method of customizing field views of agricultural fields with enhanced maps;
- FIG. 4 illustrates a monitor or display device having a user interface 401 with a split screen view that includes a map of a data layer and an overview image in accordance with one embodiment;
- FIG. 5 illustrates a monitor or display device having a user interface 501 with a split screen view that includes an enhanced map of a data layer with icons and an overview image in accordance with one embodiment;
- FIG. 6 illustrates a monitor or display device having a user interface 601 with a split screen view that includes an enhanced map of a data layer with icons and an overview image in accordance with another embodiment;
- FIG. 7 illustrates a user interface 701 with map 710 and associated scale region 720, and an image 750 for the selected icon 740;
- FIG. 8 illustrates a user interface 801 with map 710 and a zoomed image 850 to show more details of the crops, weeds, and soil conditions at the geographical location for the selected icon 740;
- FIG. 9 illustrates the user interface 901 having an image 950 for the selected icon 940;
- FIG. 10 illustrates a zoomed image 951 to show more details of the crops, weeds, and soil conditions at the geographical location for the selected icon 940;
- FIG. 11 illustrates a monitor or display device having a user interface 1101 with a split screen view that includes icons overlaid on an overview image 1110 of a field and also the image 951 for a selected icon 940 in accordance with another embodiment;
- FIG. 12 illustrates a monitor or display device having a user interface 1201 with a split screen view that includes maps of different data layers in accordance with one embodiment;
- FIG. 13 illustrates a monitor or display device having a user interface 1301 with a split screen view that includes maps of different data layers in accordance with one embodiment;
- FIG. 14 illustrates a monitor or display device having a user interface 1401 with a split screen view that includes an enhanced map of a data layer with icons and an overview image in accordance with one embodiment;
- FIG. 15 illustrates a user interface 1501;
- FIG. 16 illustrates a monitor or display device having a user interface 1601 with a split screen view that includes maps of different data layers in accordance with one embodiment;
- FIG. 17 illustrates a monitor or display device having a user interface 1701 with a split screen view that includes an enhanced map of a data layer with icons and an overview image in accordance with one embodiment;
- FIG. 18 illustrates that if an icon 1735 is selected from map 1710, then an image 1850 is displayed;
- FIG. 19 shows an example of a system 2700 that includes a machine 2702 (e.g., tractor, combine harvester, etc.) and an implement 2740 (e.g., planter, cultivator, plough, sprayer, spreader, irrigation implement, etc.) in accordance with one embodiment;
- FIGS. 20A and 20B illustrate a flow diagram of one embodiment for a computer implemented method of measuring and quantifying crop emergence uniformity within an agricultural field;
- FIG. 21 illustrates an example of a predetermined mask to select portions of an image in accordance with one embodiment;
- FIG. 22 illustrates a sample predicted output from a deep learning model (DLM) in accordance with one embodiment;
- FIG. 23 illustrates an example of a moving average of vegetation intensity or a weighted average of vegetation intensity along a row in accordance with one embodiment; and
- FIG. 24 illustrates a weighted value diagram 2400 that can be applied to the entire one or more rows of crops in the event that the image plane (e.g., forward looking image plane with upward tilt) of a sensor (e.g., camera) is not coplanar with the ground surface in accordance with one embodiment.
- Described herein are systems and methods for providing enhanced field views including enhanced maps based on capturing images of crops during different stages. In an aspect of the disclosure there is provided a computer implemented method for customizing field views of data displays that comprises obtaining a data layer for an agricultural parameter from sensors of an agricultural implement or machine during an application pass for a field, generating a user interface with an enhanced map that includes the data layer for the agricultural parameter, and generating selectable icons overlaid at different geographic locations on the enhanced map for the field, with the selectable icons representing captured images at the different geographic locations. Each selectable icon represents an image from the field to show crop, weed, or soil conditions.
- According to an aspect of the disclosure there is provided the user interface that comprises a split screen view with the enhanced map on a first side of the split screen view and an overview image of the field on a second side of the split screen view.
- A further aspect of the disclosure provides the agricultural implement that comprises a planter, sprayer, or irrigation implement having row units with each row unit having a sensor for capturing images to obtain the data layer.
- A further aspect of the disclosure further comprises displaying the user interface with the enhanced map on the display device, receiving a user input to select an icon of the enhanced map, and generating an updated user interface with the enhanced map and an image that is associated with the selected icon changing color on the enhanced map.
- A further aspect of the disclosure provides the image that is displayed as a pop up window or over an overview image of the field.
- A further aspect of the disclosure includes the enhanced map that provides an ability to select icons throughout the field to show actual captured images of crops, weeds, and conditions of soil of the field.
- A further aspect of the disclosure includes the selectable icons that are generated and overlaid at different geographic locations on the enhanced map for the field based on a spatial trigger to capture an image during the application pass per unit area within the field, a threshold trigger for when an agricultural parameter exceeds a threshold for the agricultural parameter, a time based trigger for capturing images, or a burst capture of images.
- A further aspect of the disclosure includes the selectable icons that are generated and overlaid at different geographic locations on the enhanced map for the field based on a threshold trigger including a weed density exceeding a threshold trigger for weed density or an emergence value exceeding a threshold trigger for emergence data.
- A further aspect of the disclosure provides the agricultural parameter that comprises one or more of seed data, commanded planter seed population, actual seed population determined from a seed sensor, a seed population deviation, singulation data, weed map, emergence data, emergence map, emergence environment score based on a combination of temperature and moisture correlated to how long a seed takes to germinate, emergence environment score based on a percentage of seeds planted that will germinate within a selected number of days, time to germination, time to emergence, and seed germination risk.
- In an aspect of the disclosure there is provided a computing device that comprises a display device for displaying a user interface having a scale region and a field region for an agricultural parameter and a processor coupled to the display device. The processor is configured to generate a data layer for the agricultural parameter from sensors of an agricultural implement that collects the data during an application pass for a field, to generate the user interface with an enhanced map that includes the data layer for the agricultural parameter, and to generate selectable icons or symbols overlaid at different geographic locations on the enhanced map for the field with the selectable icons representing captured images at the different geographic locations.
- A further aspect of the disclosure includes the user interface that further comprises a split screen view with the enhanced map on a first side of the split screen view and an overview image of the field on a second side of the split screen view.
- A further aspect of the disclosure includes the agricultural implement that comprises a planter, sprayer, or irrigation implement having row units with each row unit having a sensor for capturing images to obtain the data layer.
- A further aspect of the disclosure includes the display device to display the user interface with the enhanced map and to receive a user input to select an icon of the enhanced map, wherein the processor is configured to generate an updated user interface with the enhanced map and an image that is associated with a selected icon or symbol based on the user input with the selected icon or symbol changing color.
- A further aspect of the disclosure includes the updated user interface to provide a selectable orientation option to rotate an orientation of the images of the user interface, a selectable expand option to control sizing of a displayed map in a field region, a selectable icon or symbol option to enable or disable showing icons or symbols on the enhanced map, a selectable full map option to switch between a full screen view of map versus a split screen view having both of a map and an overview image, and a selectable statistics option to show statistics for the data layer.
- A further aspect of the disclosure includes the display device to receive a user input to modify the scale region and to display a modified scale region and a corresponding modified field region.
- In an aspect of the disclosure there is provided a computer implemented method for customizing field views of a field region that comprises obtaining a data layer for an agricultural parameter from sensors of an agricultural implement that collects data during an application pass for a field and generating selectable icons and overlaying the selectable icons at different geographic locations on an enhanced map of the data layer for the field based on a spatial trigger to capture an image during the application pass per unit area or when an agriculture parameter compares in a predetermined manner to a threshold trigger for the agricultural parameter.
- A further aspect of the disclosure further comprises comparing the agricultural parameter to the threshold trigger, determining whether the agricultural parameter exceeds the threshold trigger for a location within the field, and generating a selectable icon when the agricultural parameter exceeds the threshold trigger for the location within the field.
- A further aspect of the disclosure includes the threshold trigger that comprises a weed threshold that is compared to a weed density.
- A further aspect of the disclosure includes the threshold trigger that comprises an emergence threshold that is compared to an emergence value for plant emergence data.
- A further aspect of the disclosure further comprises displaying a user interface with the enhanced map that includes the data layer for the agricultural parameter and the selectable icons overlaid at different geographic locations on the enhanced map for the field.
- A further aspect of the disclosure includes the agricultural implement that comprises a planter, sprayer, or irrigation implement having row units with each row unit having a sensor for capturing images to obtain the data layer.
- In an aspect of the disclosure there is provided a computer implemented method for measuring and quantifying crop emergence uniformity within an agricultural field. The method comprises obtaining one or more images of biomass data for a region of interest within the agricultural field from one or more sensors of an agricultural implement or machine, which can be traversing the field to obtain the biomass data for various crop stages or for an application pass. The computer implemented method partitions a captured image into tiles, provides the tiles to a deep learning model to provide modeled tiles with predicted pixel values (e.g., 1 for targeted vegetation, 0 for weeds or other non-targeted vegetation) and reassembles the modeled tiles on a per tile basis to display the targeted type of vegetation in dimensionality of the original one or more images.
- A further aspect of the disclosure includes applying a predetermined mask to select portions of the one or more images that correspond with the targeted type of vegetation pixels.
- A further aspect of the disclosure includes accumulating the targeted type of vegetation pixels to create one or more rows of crops (e.g., vertical lines) corresponding to vegetation intensity.
- A further aspect of the disclosure includes applying a filter (e.g., one-dimensional filter) with a length corresponding to the spacing in pixels between individual plants of the targeted type of plants along a row of plant intensity (e.g., vertical line of plant intensity) to determine a simple moving average or a weighted average of vegetation intensity for the targeted type of plants.
- A further aspect of the disclosure includes applying upper and lower thresholds to the simple moving average or a weighted average of vegetation intensity along the one or more rows of crops (one or more vertical lines) and determining a targeted plant uniformity based on the simple moving average or a weighted average of vegetation intensity of the targeted type of plants and the thresholding for lower (minimum) and upper (maximum) vegetation intensity. The portion of the crop row (or vertical line) meeting both thresholding criteria can represent an emergence score between 0 and 100%.
- Within the scope of this application it should be understood that the various aspects, embodiments, examples and alternatives set out herein, and individual features thereof may be taken independently or in any possible and compatible combination. Where features are described with reference to a single aspect or embodiment, it should be understood that such features are applicable to all aspects and embodiments unless otherwise stated or where such features are incompatible.
- Described herein are systems and methods for customizing views of visualized data (such as from agricultural fields for weed maps during different crop stages, crop emergence, etc.) based on sensors of agricultural implements or machines.
- In one embodiment, at least one of an implement, a machine, an agricultural vehicle, an aerial device, a drone, a self-propelled device (e.g., robot, off-road vehicle, ATV, UTV), an electronic device, or a mobile device having sensors (e.g., image capturing devices) collects agricultural data before, during, or after an application pass. The agricultural data may include a data layer that is mapped as a field view on a monitor or display device and image data that overlays the data layer to enhance a user experience in viewing and understanding the agricultural data. On the map of the field, icons (e.g., camera icon, image icon) appear where the implement, a machine, an agricultural vehicle, or an aerial device with the sensors (e.g., a camera or set of cameras) captured images of regions of the field. In one example, the captured images are used for weed identification or for crop emergence. When an operator selects (e.g., user input, touch input) the icon from the monitor or display device, the image from that geographic location in the field is displayed either as a pop up window over the map or in a side by side view with the map. The icon can change color to indicate which icon was selected. The image data can be overlaid, associated, merged, or combined with the data layer for a field view.
- The user can customize (e.g., change, expand, pan) a scale of a parameter for a sub region (e.g., scale region) of a user interface and a corresponding field view of an agricultural field of the user interface automatically changes in response to the customized change in order to have a customized view of the parameter being displayed in the field view. The user does not need to manually adjust the field view because this adjustment occurs automatically upon adjusting the scale region. As used herein, expand can refer to both a positive expansion and a negative expansion (contraction).
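- A minimal sketch of this linkage, assuming a NumPy data layer and a colormap-based renderer (the function name and normalization details are assumptions):

```python
import numpy as np

def render_field_view(data_layer, scale_min, scale_max):
    """Normalize a data layer to the user-selected scale so the color
    mapping of the field view tracks the customized scale region."""
    span = max(scale_max - scale_min, 1e-9)  # guard against a zero span
    clipped = np.clip(data_layer, scale_min, scale_max)
    return (clipped - scale_min) / span  # feed into a colormap for display

# On a scale-region edit event, the field view is regenerated:
# view = render_field_view(layer, new_min, new_max)
```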
- In the following description, numerous details are set forth. It will be apparent, however, to one skilled in the art, that embodiments of the present disclosure may be practiced without these specific details. In some instances, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring the present disclosure.
- Referring to the drawings, wherein like reference numerals designate identical or corresponding parts throughout the several views, FIG. 1 shows an example of a system for collecting and analyzing agricultural data from agricultural fields in order to display customized agricultural data in accordance with one embodiment. Machines and implements of the system 100 perform agricultural operations (e.g., planting seed in a field, applying fluid applications to plants) of agricultural fields.
- For example, the system 100 may be implemented as a cloud based system with servers, data processing devices, computers, etc. Aspects, features, and functionality of the system 100 can be implemented in servers, wireless nodes, planters, planter monitors, sprayers, sidedress bars, combines, laptops, tablets, computer terminals, client devices, user devices, handheld computers, personal digital assistants, cellular telephones, cameras, smart phones, mobile phones, computing devices, or a combination of any of these or other data processing devices.
- The system 100 can include a network computer or an embedded processing device within another device (e.g., display device), an implement, or within a machine (e.g., tractor cab, agricultural vehicle), or other types of data processing systems having fewer components or perhaps more components than that shown in FIG. 1. The system 100 (e.g., cloud based system) and agricultural operations can control and monitor planting and fluid applications using an implement or machine. The system 100 includes machines 140, 142, 144, 146 and implements 141, 143, 145 coupled to the machines 140, 142, and 144, respectively. The implements (e.g., planter, cultivator, plough, sprayer, spreader, irrigation implement) can include flow devices for controlling and monitoring applications (e.g., seeding, spraying, fertilization) of crops and soil within associated fields (e.g., fields 103, 107, 109) of the respective machine. The system 100 includes an agricultural analysis system 102 that can include a weather store 150 with current and historical weather data, a weather predictions module 152 with weather predictions for different regions, and at least one processing system 132 for executing instructions for controlling and monitoring different operations (e.g., fluid applications). The storage medium 136 may store instructions, software, software programs, etc. for execution by the processing system and for performing operations of the agricultural analysis system 102. In one example, storage medium 136 may contain a fluid application prescription (e.g., fluid application prescription that relates georeferenced positions in the field to application rates). The implement 141 (or any of the implements) may include an implement with a pump, flow sensors and/or flow controllers that may be specifically the elements that are in communication with the network 180 for sending control signals or receiving as-applied data. The network 180 (e.g., any wireless network, any cellular network (e.g., 4G, 5G), Internet, wide area network, WiMax, satellite, IP network, etc.) allows the system 102, wireless nodes, machines, and implements of FIG. 1 to communicate with each other when the system 102, wireless nodes, machines (e.g., 140, 142, 144, 146), or implements (e.g., 141, 143, 145) are connected to the network 180. Examples of agricultural monitors are described in PCT Publication Nos. WO2008/086318, WO2012/129442, WO2013/049198, WO2014/026183, and WO2014/018717. An example of an agricultural monitor is the 20|20® monitor from Precision Planting, LLC. In one example, a monitor preferably includes a graphical user interface (“GUI”), a memory, a central processing unit (“CPU”), and a bus node. The bus node preferably comprises a controller area network (“CAN”) node including a CAN transceiver, a controller, and a processor. The monitor is preferably in electrical communication with a speed sensor (e.g., a radar speed sensor mounted to a tractor) and a global positioning system (“GPS”) receiver mounted to the tractor (or in some embodiments to a toolbar of an implement).
- As an agricultural implement traverses a field, a monitor A of a first machine (e.g., 140, 142, 144, 146) collects as applied data at various points in the field. The first machine may be coupled to the agricultural implement, causing the agricultural implement to traverse the field. The as applied data can be seeding information, such as percent singulation, skips, multiples, downforce, applied fluids, depth measurements, agronomic measurements, and anything else that is collected. As the as applied data is collected and stored in a monitor data file of the monitor A, field boundary and prescriptions are embedded into the data file.
- File transfer from monitor A of the first machine to monitor B of a second machine can be accomplished through any data exchange, such as saving the file to a USB stick, via cloud exchange, or by a direct vehicle to vehicle communications network. In one example, the first machine and the second machine are communicatively coupled to the network 180 and one or more files are transferred from the monitor A to the monitor B via the network 180.
-
FIG. 2 illustrates an architecture of an implement 200 for delivering applications (e.g., fluid applications, fluid mixture applications) to agricultural fields in accordance with one embodiment. The implement 200 includes at least one storage tank 250, flow lines 260 and 261, a flow controller 252 (e.g., valve), and at least one variable-rate pump 254 (e.g., electric, centrifugal, piston, etc.) for pumping and controlling the application rate of a fluid (e.g., fluid application, semifluid mixture) from the at least one storage tank to different application units 210-217, respectively, of the implement. At least one flow sensor 270 can be utilized on the implement 200 either row-by-row or upstream of where the fluid branches out to the application units as illustrated in FIG. 2. The flow controller 252 can be row-by-row as opposed to implement-wide as shown in FIG. 2. - The application units are mechanically coupled to the frames 220-227 which are mechanically coupled to a
bar 10. Each application unit 210-217 can include flow sensors and components having a placement mechanism (e.g., plant contacting members, feelers, guidance members) for obtaining a proper orientation and/or positioning of a fluid outlet with respect to a plant in an agricultural field. -
FIG. 3 illustrates a flow diagram of one embodiment for a method 300 of providing enhanced field views of data displays based on vision scouting of crops, weeds, and field conditions. The method 300 is performed by processing logic that may comprise hardware (circuitry, dedicated logic, etc.), software (such as software run on a general purpose computer system, a dedicated machine, or a device), or a combination of both. In one embodiment, the method 300 is performed by processing logic of a processing system of a system 102, machine, apparatus, implement, agricultural vehicle, aerial device, monitor, display device, user device, self-guided device, or self-propelled device (e.g., robot, ATV, UTV, etc.). The processing system executes instructions of a software application or program with processing logic. The software application or program can be initiated by the processing system. In one example, a monitor or display device receives user input and provides a customized display for operations of the method 300. - At
operation 302, a software application is initiated on the processing system and displayed on a monitor or display device as a user interface. The processing system may be integrated with or coupled to a machine that performs an application pass (e.g., planting, tillage, fertilization, spraying, etc.). Alternatively, the processing system may be integrated with an apparatus (e.g., drone, image capture device) associated with the machine that captures images before, during, or after the application pass. In one example, the user interface includes a map of a data layer (e.g., seed data, commanded planter seed population, actual seed population determined from a seed sensor, a seed population deviation, singulation data, weed map, emergence data, emergence map, emergence environment score based on a combination of temperature and moisture correlated to how long a seed takes to germinate, emergence environment score based on a percentage of seeds planted that will germinate within a selected number of days, time to germination, time to emergence, seed germination risk) for a field of interest and an overview image of the field of interest. Seed germination risk can be based on germination/emergence (no germination/emergence, on-time germination/emergence, or late germination/emergence) or factors other than time, such as deformities, damaged seed, reduced vigor, or disease. Seed germination risk can be high, medium, or low, or it can be on-time emergence, late emergence, or no emergence. - The data layer can be generated from data collected by sensors on an implement, a machine pulling the implement during a current application pass, an aerial device, a user device, a self-guided device, a self-propelled device, etc., or the data layer can be generated from a previous application pass through the field. The sensors may be in-situ sensors positioned on each row unit of an implement, spaced across several row units, or positioned on a machine.
- At
operation 304, the software application receives user input and generates an updated user interface that is displayed with the monitor or display device. The updated user interface is generated based on the user input and may include an enhanced map of the data layer and optionally the overview image of the field. The enhanced map includes the data layer and also icons or symbols to represent captured images at different georeferenced positions across the field. In one example, at operation 306, the icons (e.g., camera icons, image icons) or symbols can be positioned spatially at a certain approximate distance from each other within the field based on a user defined spatial or grid based input. - In another example, at
operation 306, the icons or symbols can be positioned on a field view based on a threshold trigger that is compared to an agricultural parameter (e.g., agricultural parameter is less than, equal to, or exceeds a threshold value for the agricultural parameter; weed pressure or density is less than, equal to, or exceeds a threshold trigger; emergence value is less than, equal to, or exceeds a threshold trigger, etc.) for the data layer at different locations within a field. - In another example, at
operation 306, the icons or symbols can be positioned based on a user defined time period or predetermined time period (e.g., capture 1 image every 10 seconds, capture 1 image every 1 minute). In another example, at operation 306, the icons or symbols can be positioned based on a burst capture of images at certain locations within the field.
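The spatial, threshold, and time based triggers of operation 306 could be combined along the lines of the following sketch; the class name, default distances, and threshold values are assumptions for illustration (a burst trigger would simply request several captures at one location).

```python
import time

class CaptureTrigger:
    """Illustrative capture/icon-placement triggers for operation 306."""

    def __init__(self, grid_m=100.0, threshold=0.8, period_s=10.0):
        self.grid_m = grid_m        # spatial trigger: one capture per grid step
        self.threshold = threshold  # threshold trigger: e.g., weed density >= 80%
        self.period_s = period_s    # time trigger: one capture every N seconds
        self._last_xy = None
        self._last_t = 0.0

    def spatial(self, x_m: float, y_m: float) -> bool:
        """Fire when the implement has moved at least grid_m since the last capture."""
        if self._last_xy is None or (
            ((x_m - self._last_xy[0]) ** 2 + (y_m - self._last_xy[1]) ** 2) ** 0.5
            >= self.grid_m):
            self._last_xy = (x_m, y_m)
            return True
        return False

    def thresholded(self, agricultural_parameter: float) -> bool:
        """Fire when the agricultural parameter meets or exceeds the threshold."""
        return agricultural_parameter >= self.threshold

    def timed(self, now: float = None) -> bool:
        """Fire when the user defined time period has elapsed."""
        now = time.monotonic() if now is None else now
        if now - self._last_t >= self.period_s:
            self._last_t = now
            return True
        return False
```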
- At operation 308, the software application receives a user input (e.g., touch user input) to select an icon or symbol of the enhanced map of the data layer. - At
operation 310, the software application generates an updated user interface based on the user input with the selected icon changing color. At operation 312, the monitor or display device displays the updated user interface including the enhanced map having the selected icon changing color and an image of the selected icon being displayed as a pop-up window or over the overview image of the field. The user experience and understanding of a color map of the data layer is improved by being able to select icons or symbols throughout a field to show actual captured images of crops, weeds, and conditions of the soil of the field in combination with the map of the data layer. - At
operation 314, the software application (e.g., a scale region of the user interface) may optionally receive additional user input (e.g., expand (positive expansion, negative expansion or contraction), panning operation) to modify a scale of the scale region for the agricultural parameter. For example, a scale of the scale region for an agricultural parameter can be modified from 0 to 100 percent to 20 to 50 percent based on the user input. The displayed field region of the enhanced map is modified in a manner corresponding to the modified scale region and, in this example, will only show values between 20 and 50 percent. - At
operation 316, the software application generates a modified scale region and also a modified field region based on the additional user input. U.S. Pat. No. 10,860,189, which is incorporated by reference herein, describes how to generate a modified scale region and also a modified field region based on the user input. The monitor or display device displays the modified scale region and the corresponding modified field region. The operations 312, 314, and 316 can be repeated if additional user input for modifying the scale region is received by the software application. - In one example, the user input can include a first expand operation (e.g., a pinch motion with 2 user input points contacting the scale region and moving towards each other to expand in (or contract), e.g., 1 finger and 1 thumb or 2 fingers), a second expand operation (e.g., an expand with 2 user input points contacting the scale region moving away from each other to expand out), a first panning operation (e.g., panning with 1 user input point contacting the scale region and moving upwards (or downwards), e.g., 1 finger or 1 thumb), or a second panning operation (e.g., panning with 1 user input point contacting the scale region and moving downwards (or upwards), e.g., 1 finger or 1 thumb).
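One way to realize the scale modification of operations 314 and 316 is sketched below, assuming the data layer is a 2-D array of percentage values; numpy's masked arrays stand in for whatever rendering pipeline the monitor actually uses, and the function names are illustrative.

```python
import numpy as np

def apply_scale(data_layer: np.ndarray, lo: float, hi: float) -> np.ma.MaskedArray:
    """Hide field-region values outside the user-selected scale [lo, hi]."""
    return np.ma.masked_outside(data_layer, lo, hi)

def pinch_scale(lo: float, hi: float, zoom: float, center: float):
    """Expand or contract a [lo, hi] scale about a center value (zoom > 1 narrows)."""
    half = (hi - lo) / (2.0 * zoom)
    return max(0.0, center - half), min(100.0, center + half)

field = np.random.uniform(0, 100, size=(200, 200))  # synthetic weed-density layer
visible = apply_scale(field, 20.0, 50.0)            # only 20-50% values remain displayed
```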
- In some embodiments, the operations of the method(s) disclosed herein can be altered, modified, combined, or deleted. The methods in embodiments of the present disclosure may be performed with a device, an apparatus, or a data processing system as described herein. The device, apparatus, or data processing system may be a conventional, general-purpose computer system; special purpose computers, which are designed or programmed to perform only one function, may also be used.
-
FIG. 4 illustrates a monitor or display device having a user interface 401 with a split screen view that includes a map of a data layer and an overview image in accordance with one embodiment. Processing logic executes instructions of an initiated software application (e.g., field application) of a processing system to generate the user interface 401 that is displayed by the monitor or display device. - The software application can provide different display regions that are selectable by a user. The map 410 (e.g., weed map) shows a weed data layer across a field and a
scale region 420 shows weed coverage or weed density on a scale from 100% to 0%. The overview image 450 shows an overview of the field and has a scale region 460. - In one example, the user interface 401 includes a
selectable orientation option 480 to rotate an orientation of the images of the user interface with respect to a true North direction, a selectable plus/minus zoom option 481, a selectable pinch to zoom option 482, a selectable expand option 483 to control sizing of a displayed map in a field region, a selectable icon option 484 to enable or disable showing image icons or symbols on the map 410, a selectable full map option 485 to switch between different viewing options (e.g., a full screen view of map 410, a split screen view having both map 410 and overview image 450, a split screen view having an image with no map, etc.) and a selectable statistics option 486 to show statistics (e.g., bar charts, numerical data, histograms, number of acres of a field having weed pressure or density that exceeds a threshold) for the data layer or the weed data of the weed pressure or weed density. -
FIG. 5 illustrates a monitor or display device having a user interface 501 with a split screen view that includes an enhanced map of a data layer with icons and an overview image in accordance with one embodiment. Processing logic executes instructions of an initiated software application (e.g., field application) of a processing system to generate the user interface 501 that is displayed by the monitor or display device. - The software application can provide different display regions that are selectable by a user. The enhanced map 510 (e.g., enhanced weed map) shows a weed data layer across a field with selectable icons or symbols for images and a
scale region 520 shows weed pressure, coverage, or weed density on a scale from 100% to 0%. The overview image 550 shows an overview of the field and has a scale region 560. The images that are represented with icons or symbols are captured based on a spatial triggering (e.g., the user provides an input prior to or during an application pass to capture an image during the application pass every acre, every 2 acres, every 5 acres, etc.) or grid based triggering as a machine pulls an implement through a field for an application pass. The icons or symbols and associated captured images are located approximately equidistant from each other as the implement traverses through the field for an application pass. The data layer of the map can also be generated based on capturing images from sensors of an implement, machine, or aerial device. - In one example, a grower provides an input prior to or during a spraying operation for a spatial or grid based triggering of image capturing devices or sensors during the spraying operation. The image capturing devices or sensors capture at least one image for every location that is triggered spatially or based on a grid as defined by the grower.
-
FIG. 6 illustrates a monitor or display device having a user interface 601 with a split screen view that includes an enhanced map of a data layer with icons or symbols and an overview image in accordance with another embodiment. Processing logic executes instructions of an initiated software application (e.g., field application) of a processing system to generate the user interface 601 that is displayed by the monitor or display device. - The software application can provide different display regions that are selectable by a user. The enhanced map 610 (e.g., enhanced weed map) shows a weed data layer across a field with selectable icons or symbols for images and a
scale region 620 shows weed coverage, weed pressure, or weed density on a scale from 100% to 0%. The overview image 650 shows an overview of the field and has a scale region 660. The images that are represented with icons or symbols are captured based on a threshold triggering (e.g., an agricultural parameter exceeds a threshold value for the agricultural parameter, a weed density exceeds a weed threshold trigger (e.g., 80%) then capture an image, an emergence value exceeds an emergence threshold trigger then capture an image, etc.) as a machine pulls an implement through a field for an application pass. The icons or symbols and associated captured images are located at a geographical location whenever the agricultural parameter threshold is triggered as the implement traverses through the field. - Upon selection of an icon or symbol from the user interface 601 of
FIG. 6, the software application displays a user interface 701 with map 710 and associated scale region 720, an image 750 for the selected icon or symbol 740 of FIG. 7 and a scale region 760. The image 750 is an actual field image of crops, weeds, and soil conditions for the selected location from the map 610. In one example, navigation can occur from the full screen to the split screen view and then an image option 762 can have a drop down sub-menu to select a different data layer or agricultural parameter for display. The image option 762 displays the image 750 that was selected by selecting the icon or symbol 740. - Upon a pinch zoom input to the
image 750, the software application displays a user interface 801 of FIG. 8 with map 710 and zoomed image 850 to show more details of the crops, weeds, and soil conditions at the geographical location for the selected icon 740. - Upon selection of a
different icon 940 from the user interface 901 of FIG. 9, the software application displays an image 950 for the selected icon 940. The image 950 is an actual field image of crops, weeds, and soil conditions for the selected location from the map 910. The weed coverage, pressure, or density exceeds a threshold, and this triggers capturing the image 950 in real time from an implement or machine during an application pass or from a previous application pass. - Upon a pinch zoom input to the image 950, the software application displays a zoomed image 951 of
FIG. 10 to show more details of the crops, weeds, and soil conditions at the geographical location for the selected icon 940. -
FIG. 11 illustrates a monitor or display device having a user interface 1101 with a split screen view that includes icons or symbols overlaid on an overview image 1110 of a field and also the image 951 for a selected icon 940 in accordance with another embodiment. Processing logic executes instructions of an initiated software application (e.g., field application) of a processing system to generate the user interface 1101 that is displayed by the monitor or display device. - The software application can provide different display regions that are selectable by a user. The selectable icons or symbols represent captured images and a
scale region 1120 shows weed pressure, coverage, or weed density on a scale from 100% to 0%. Selection of the icon 940 causes an image 951 to be displayed. The icons and associated captured images are located at geographical locations whenever the icons are spatially triggered as the implement traverses through the field. -
FIG. 12 illustrates a monitor or display device having a user interface 1201 with a split screen view that includes maps of different data layers in accordance with one embodiment. Processing logic executes instructions of an initiated software application (e.g., field application) of a processing system to generate the user interface 1201 that is displayed by the monitor or display device. - The software application can provide different display regions that are selectable by a user. The map 1210 (e.g., commanded planting population map from a planter, planted population map based on data from a seed sensor) shows a planted population data layer across a field and a
scale region 1220 shows seeds per acre in units of 1,000 (e.g., scale region 1220 shows 28,000 to 30,000 seeds per acre). The map 1250 (e.g., actual emerged population map based on data from a sensor after plants emerge from the soil) shows an emerged population data layer across a field and a scale region 1260 shows plants per acre in units of 1,000 (e.g., scale region 1260 shows 28,000 to 30,000 plants per acre). - In one example, the user interface 1201 includes an
orientation option 1280 to rotate an orientation of the images of the user interface with respect to a true North direction, a plus/minus zoom option 1281, a pinch to zoom option 1282, an expand option 1283 to control sizing of a displayed map in a field region, an icon option 1284 to enable or disable showing icons on the map 1210, a full map option 1285 to switch between different viewing options (e.g., a full screen view of map 1210, a split screen view having both map 1210 and map 1250, a split screen view having an image with no map, etc.) and a statistics option 1286 to show statistics (e.g., bar charts, numerical data, histograms, number of acres of a field having emerged plant population below a threshold) for the data layer or the actual emerged population data. -
FIG. 13 illustrates a monitor or display device having a user interface 1301 with a split screen view that includes maps of different data layers in accordance with one embodiment. Processing logic executes instructions of an initiated software application (e.g., field application) of a processing system to generate the user interface 1301 that is displayed by the monitor or display device. - The software application can provide different display regions that are selectable by a user. The map 1310 (e.g., commanded planting population map from a planter, planted population map based on data from a seed sensor) shows a planted population data layer across a field and a
scale region 1320 shows seeds per acre in units of 1,000. The map 1350 (e.g., emerged population deviation map based on data from sensors after plants emerge from the soil) shows an emerged population deviation data layer across a field and a scale region 1360 shows emerged population deviation in units of 1,000 with respect to a target or the planted population. Alternatively, the scale regions 1320 and 1360 can show percentages for the planted population and the emerged population deviation, respectively. In one example, a 0% emerged population deviation indicates no difference between the planted population and the emerged population, and a 100% emerged population deviation indicates that no plants emerged. -
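The 0% and 100% endpoints described above imply a simple deviation formula, sketched here (the function name is illustrative):

```python
def emerged_population_deviation_pct(planted: float, emerged: float) -> float:
    """Percent deviation of emerged population from planted population:
    0% means every planted seed emerged; 100% means no plants emerged."""
    if planted <= 0:
        raise ValueError("planted population must be positive")
    return 100.0 * (planted - emerged) / planted

# e.g., 30,000 seeds/acre planted and 28,500 plants/acre emerged
print(emerged_population_deviation_pct(30_000, 28_500))  # -> 5.0
```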
FIG. 14 illustrates a monitor or display device having a user interface 1401 with a split screen view that includes an enhanced map of a data layer with icons and an overview image in accordance with one embodiment. Processing logic executes instructions of an initiated software application (e.g., field application) of a processing system to generate the user interface 1401 that is displayed by the monitor or display device. - The software application can provide different display regions that are selectable by a user. The enhanced map 1410 (e.g., enhanced actual emergence population map) shows an actual emergence population data layer across a field with selectable icons or symbols for images and a
scale region 1420 shows actual emergence population in units of 1,000 (e.g., 28,000 to 30,000 actual emerged plants). The overview image 1450 shows an overview of the field and has a scale region 1460. The images that are represented with icons are captured based on a spatial triggering (e.g., the user provides an input prior to or during an application pass to capture an image during the application pass every acre, every 2 acres, every 5 acres, etc.) or threshold triggering (e.g., actual emergence population is below, equal to, or exceeds an actual emergence population threshold) as a machine pulls an implement through a field for an application pass. The icons or symbols (e.g., icon 1412 for spatial triggering, icon 1414 for threshold triggering) and associated captured images are located approximately equidistant from each other for spatial triggering and can be triggered more closely spaced or further apart from each other for threshold triggering as the implement traverses through the field for an application pass. The data layer of the map can also be generated based on capturing images from sensors of an implement, machine, or aerial device. - In one example, a grower provides an input prior to or during a spraying operation for a spatial or grid based triggering of image capturing devices or sensors during the spraying operation. The image capturing devices or sensors capture at least one image for every location that is triggered spatially or based on a grid as defined by the grower.
- Upon selection of the icon 1414, the user interface 1501 is generated as illustrated in
FIG. 15. The user interface 1501 includes the emergence population map 1410 and the image 1550 with the image being captured at a location of the icon 1414. -
FIG. 16 illustrates a monitor or display device having a user interface 1601 with a split screen view that includes maps of different data layers in accordance with one embodiment. Processing logic executes instructions of an initiated software application (e.g., field application) of a processing system to generate the user interface 1601 that is displayed by the monitor or display device. - The software application can provide different display regions that are selectable by a user. The map 1610 (e.g., commanded planting population map from a planter, planted population map based on data from a seed sensor) shows a planted population data layer across a field and a
scale region 1620 shows seeds per acre in percentages with 94.4-100% being a target seed population. The map 1650 (e.g., actual relative emergence uniformity map based on data from sensors after plants emerge from the soil) shows an actual relative emergence uniformity data layer across a field and a scale region 1660 shows actual relative emergence uniformity in units of growth stages with respect to a target growth stage. The 1.87 and greater stage is the target growth stage, the 0.38-1.87 stage is one growth stage late in emergence, and the 0.38 and lower stage is two growth stages late in emergence. Alternatively, the scale regions 1620 and 1660 can show percentages for the planted population and the actual relative emergence uniformity, respectively. In one example, a 0% actual relative emergence uniformity indicates low uniformity and a 100% actual relative emergence uniformity indicates a target uniformity. Various plant phenotype characteristics can be shown with a map or a uniformity map, such as growth stage, biomass, plant height, size, and stalk size. -
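The growth-stage breakpoints of the scale region 1660 can be expressed as a small binning function; the function name is an assumption for illustration.

```python
def emergence_uniformity_bin(relative_stage: float) -> str:
    """Bin a plant's growth stage relative to the field using the 1.87 and
    0.38 breakpoints shown in the scale region 1660."""
    if relative_stage >= 1.87:
        return "target growth stage"
    if relative_stage >= 0.38:
        return "one growth stage late in emergence"
    return "two growth stages late in emergence"
```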
FIG. 17 illustrates a monitor or display device having a user interface 1701 with a split screen view that includes an enhanced map of a data layer with icons and an overview image in accordance with one embodiment. Processing logic executes instructions of an initiated software application (e.g., field application) of a processing system to generate the user interface 1701 that is displayed by the monitor or display device. - The software application can provide different display regions that are selectable by a user. The enhanced map 1710 (e.g., enhanced actual relative emergence uniformity map) shows an actual relative emergence uniformity data layer across a field with selectable icons for images and a
scale region 1720 shows actual relative emergence uniformity on a scale to indicate a target growth stage or growth stages with late emergence. In response to selection of icon 1725, the image 1750 is generated to show plant, weed, and soil conditions at a location of the icon 1725. The image 1750 shows a target relative emergence uniformity for the plants in this image. The images that are represented with icons or symbols are captured based on a spatial triggering (e.g., the user provides an input prior to or during an application pass to capture an image during the application pass every acre, every 2 acres, every 5 acres, etc.) or threshold triggering (e.g., the actual relative emergence uniformity compares in a predetermined manner (e.g., is below, equal to, or exceeds) an actual relative emergence uniformity threshold) as a machine pulls an implement through a field for an application pass. The icons and associated captured images are located approximately equidistant from each other for spatial triggering and can be triggered more closely spaced or further apart from each other for threshold triggering as the implement traverses through the field for an application pass. The data layer of the map can also be generated based on capturing images from sensors of an implement, machine, or aerial device. - If an
icon 1735 is selected from map 1710, then an image 1850 of user interface 1801 of FIG. 18 is displayed. The image 1850 is generated to show plant, weed, and soil conditions at a location of the icon 1735. The image 1850 shows a below target relative emergence uniformity for the plants in this image. The scale region 1720 indicates a relative emergence uniformity. The 1.87 and greater stage is the target growth stage, the 0.38-1.87 stage is one growth stage late in emergence, and the 0.38 and lower stage is two growth stages late in emergence. -
FIG. 19 shows an example of a system 700 that includes a machine 702 (e.g., tractor, combine harvester, etc.) and an implement 2740 (e.g., planter, cultivator, plough, sprayer, spreader, irrigation implement, etc.) in accordance with one embodiment. The machine 702 includes a processing system 2720, memory 705, machine network 2710 (e.g., a controller area network (CAN) serial bus protocol network, an ISOBUS network, etc.), and a network interface 715 for communicating with other systems or devices including the implement 2740. The machine network 2710 includes sensors 712 (e.g., speed sensors), controllers 711 (e.g., GPS receiver, radar unit) for controlling and monitoring operations of the machine, and an optional image capture device 714 for capturing images of crops and soil conditions of a field in accordance with embodiments of the present disclosure. The network interface 715 can include at least one of a GPS transceiver, a WLAN transceiver (e.g., WiFi), an infrared transceiver, a Bluetooth transceiver, Ethernet, or other interfaces for communications with other devices and systems including the implement 2740. The network interface 715 may be integrated with the machine network 2710 or separate from the machine network 2710 as illustrated in FIG. 19. The I/O ports 729 (e.g., diagnostic/on board diagnostic (OBD) port) enable communication with another data processing system or device (e.g., display devices, sensors, etc.). - In one example, the machine performs operations of a combine (combine harvester) for harvesting grain crops. The machine combines reaping, threshing, and winnowing operations in a single harvesting operation. An optional header 780 (e.g., grain platform, flex platform) includes a cutting mechanism to cause cutting of crops to be positioned into an auger. The
header 780 includes an orientation device 782 or mechanism for orienting a crop (e.g., corn, soybeans) for improving image capture with an image capture device 784. - The
processing system 2720 may include one or more microprocessors, processors, a system on a chip (integrated circuit), or one or more microcontrollers. The processing system includes processing logic 726 for executing software instructions of one or more programs and a communication unit 728 (e.g., transmitter, transceiver) for transmitting and receiving communications from the machine via the machine network 2710 or the network interface 715, or from the implement via the implement network 2750 or the network interface 2760. The communication unit 728 may be integrated with the processing system or separate from the processing system. In one embodiment, the communication unit 728 is in data communication with the machine network 2710 and the implement network 2750 via a diagnostic/OBD port of the I/O ports 729. -
Processing logic 726 including one or more processors may process the communications received from the communication unit 728 including agricultural data. The system 700 includes memory 705 for storing data and programs for execution (software 706) by the processing system. The memory 705 can store, for example, software components such as image capture software, software for customizing scale and corresponding field views of agricultural fields with expand and panning operations for performing operations or methods of the present disclosure, or any other software application or module, images (e.g., captured images of crops), alerts, maps, etc. The memory 705 can be any known form of a machine readable non-transitory storage medium, such as semiconductor memory (e.g., flash, SRAM, DRAM, etc.) or non-volatile memory, such as hard disks or a solid-state drive. The system can also include an audio input/output subsystem (not shown), which may include a microphone and a speaker, for example, for receiving and sending voice commands or for user authentication or authorization (e.g., biometrics). - The
processing system 2720 communicates bi-directionally with memory 705, machine network 2710, network interface 715, header 780, display device 2730, display device 725, and I/O ports 729 via communication links 731-737, respectively. -
Display devices 725 and 2730 can provide visual user interfaces for a user or operator. The display devices may include display controllers. In one embodiment, the display device 725 (or computing device 725) is a portable tablet device or computing device with a touchscreen that displays images (e.g., captured images, localized view map layer, high definition field maps of as-planted or as-harvested data or other agricultural variables or parameters, yield maps, alerts, etc.) and data generated by an agricultural data analysis software application or field view software application and receives input (e.g., expand (positive expansion, negative expansion or contraction), panning) from the user or operator for a customized scale region and corresponding view of a region of a field, monitoring and controlling field operations, or any operations or methods of the present disclosure. The processing system 2720 and memory 705 can be integrated with the computing device 725 or separate from the computing device. The operations may include configuration of the machine or implement, reporting of data, control of the machine or implement including sensors and controllers, and storage of the data generated. The display device 2730 may be a display (e.g., display provided by an original equipment manufacturer (OEM)) that displays images and data for a localized view map layer, as-planted or as-harvested data, yield data, controlling a machine (e.g., planter, tractor, combine, sprayer, etc.), steering the machine, and monitoring the machine or an implement (e.g., planter, combine, sprayer, etc.) that is connected to the machine with sensors and controllers located on the machine or implement. - A
cab control module 770 may include an additional control module for enabling or disabling certain components or devices of the machine or implement. For example, if the user or operator is not able to control the machine or implement using one or more of the display devices, then the cab control module may include switches to shut down or turn off components or devices of the machine or implement. - The implement 2740 (e.g., planter, cultivator, plough, sprayer, spreader, irrigation implement, etc.) includes an implement
network 2750, a processing system 2762, a network interface 2760, and optional input/output ports 766 for communicating with other systems or devices including the machine 702. The implement network 2750 (e.g., a controller area network (CAN) serial bus protocol network, an ISOBUS network, etc.) includes an image capture device 756 for capturing images of crop development and soil conditions, sensors 752 (e.g., speed sensors, seed sensors for detecting passage of seed, downforce sensors, actuator valves, OEM sensors, etc.), controllers 754 (e.g., GPS receiver), and the processing system 2762 for controlling and monitoring operations of the implement. The OEM sensors may be moisture sensors or flow sensors for a combine, speed sensors for the machine, seed force sensors for a planter, liquid application sensors for a sprayer, or vacuum, lift, lower sensors for an implement. For example, the controllers may include processors in communication with a plurality of seed sensors. The processors are configured to process images captured by the image capture device 756 or seed sensor data and transmit processed data to the processing system 2762 or 2720. The controllers and sensors may be used for monitoring motors and drives on a planter including a variable rate drive system for changing plant populations. The controllers and sensors may also provide swath control to shut off individual rows or sections of the planter. The sensors and controllers may sense changes in an electric motor that controls each row of a planter individually. These sensors and controllers may sense seed delivery speeds in a seed tube for each row of a planter. - The
network interface 2760 can be a GPS transceiver, a WLAN transceiver (e.g., WiFi), an infrared transceiver, a Bluetooth transceiver, Ethernet, or other interfaces for communications with other devices and systems including the machine 702. The network interface 2760 may be integrated with the implement network 2750 or separate from the implement network 2750 as illustrated in FIG. 19. - The
processing system 2762 communicates bi-directionally with the implement network 2750, network interface 2760, and I/O ports 766 via communication links 741-743, respectively. - The implement communicates with the machine via wired and possibly also wireless
bi-directional communications 704. The implement network 2750 may communicate directly with the machine network 2710 or via the network interfaces 715 and 2760. The implement may also be physically coupled to the machine for agricultural operations (e.g., planting, harvesting, spraying, etc.). - The
memory 705 may be a machine-accessible non-transitory medium on which is stored one or more sets of instructions (e.g., software 706) embodying any one or more of the methodologies or functions described herein. The software 706 may also reside, completely or at least partially, within the memory 705 and/or within the processing system 2720 during execution thereof by the system 700, the memory and the processing system also constituting machine-accessible storage media. The software 706 may further be transmitted or received over a network via the network interface device 715. - In one embodiment, a machine-accessible non-transitory medium (e.g., memory 705) contains executable computer program instructions which when executed by a processing system cause the system to perform operations or methods of the present disclosure including customizing scale and corresponding field views of agricultural fields with expand and panning operations. While the machine-accessible non-transitory medium (e.g., memory 705) is shown in an exemplary embodiment to be a single medium, the term “machine-accessible non-transitory medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term “machine-accessible non-transitory medium” shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure. The term “machine-accessible non-transitory medium” shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic media, and carrier wave signals.
- Prior approaches for stand count determine a number of planted seeds and a number of growing plants per unit area. An expected result based on the number of planted seeds is compared to the number of growing plants to calculate a percentage. Stand count is used to evaluate seed quality (germination rate) and whether replanting is needed or not.
- Described herein are systems and methods for using sensors of agricultural implements or machines to capture images of crop emergence during different crop stages, determine a uniformity of the crop emergence, and quantify the uniformity of crop emergence.
-
FIGS. 20A and 20B illustrate a flow diagram of one embodiment for a computer implemented method 2000 of determining and quantifying crop emergence uniformity within an agricultural field. The method 2000 is performed by processing logic that may comprise hardware (circuitry, dedicated logic, etc.), software (such as software run on a general purpose computer system, a dedicated machine, or a device), or a combination of both. In one embodiment, the computer implemented method 2000 is performed by processing logic of a processing system of a system 102, machine, apparatus, implement, agricultural vehicle, aerial device, monitor, display device, user device, self-guided device, or self-propelled device (e.g., robot, ATV, UTV, etc.). The processing system executes instructions of a software application or program with processing logic. The software application or program can be initiated by the processing system. In one example, a monitor or display device receives user input and provides a customized display for operations of the method 2000. - At
operation 2002, a software application is initiated on the processing system and displayed on a monitor or display device as a user interface. The processing system may be integrated with or coupled to a machine that performs an application pass (e.g., planting, tillage, fertilization, spraying, etc.). Alternatively, the processing system may be integrated with an apparatus (e.g., drone, image capture device) associated with the machine that captures images before, during, or after the application pass. In one example, the user interface includes a map of a data layer (e.g., seed data, commanded planter seed population, actual seed population determined from a seed sensor, a seed population deviation, singulation data, weed map, emergence data, emergence map, emergence environment score based on a combination of temperature and moisture correlated to how long a seed takes to germinate, emergence environment score based on a percentage of seeds planted that will germinate within a selected number of days, time to germination, time to emergence, seed germination risk) for a field of interest and an overview image of the field of interest. Seed germination risk can be based on germination/emergence (no germination/emergence, on-time germination/emergence, or late germination/emergence) or factors other than time, such as deformities, damaged seed, reduced vigor, or disease. Seed germination risk can be high, medium, or low, or it can be on-time emergence, late emergence, or no emergence. - The data layer can be generated from data collected by sensors on an implement, a machine pulling the implement during a current application pass, an aerial device, a user device, a self-guided device, a self-propelled device, etc., or the data layer can be generated from a previous application pass through the field. The sensors may be in-situ sensors positioned on each row unit of an implement, spaced across several row units, or positioned on a machine.
- At
operation 2004, the computer implemented method includes obtaining one or more images of biomass data for a region of interest of a field from one or more sensors of an agricultural implement, which can be traversing the field to obtain the biomass data for various crop stages or for an application pass. Alternatively, the sensors can be located on a machine, an agricultural vehicle, an aerial device, a drone, or a self-propelled device (e.g., robot, off-road vehicle, ATV, UTV) to collect agricultural data before, during, or after an application pass. At operation 2006, the computer implemented method partitions a captured image into tiles. In one example, the tiles (e.g., an n×m array of tiles) cover an entire image and additional adjacent tiles (e.g., left center, right center) that overlap the tiles are also utilized. At operation 2008, the computer implemented method provides the tiles as input to a deep learning model (DLM) to differentiate pixels of the tiles between a targeted type of vegetation (e.g., a crop, corn, soybean, wheat, etc.), a background, or other vegetation. The tile can correspond to one or more images that are provided to the DLM. A single high resolution image or a resized lower resolution image can be provided in alternative embodiments.
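Operation 2006's tiling, including the overlapping adjacent tiles, might look like the following sketch; the tile size and half-tile overlap scheme are assumptions for illustration.

```python
import numpy as np

def make_tiles(image: np.ndarray, tile: int = 256):
    """Yield (y, x, tile_array) for a base grid of tiles plus a half-offset
    grid of overlapping tiles, so regions are seen by more than one tile."""
    h, w = image.shape[:2]
    for off in (0, tile // 2):  # base grid, then half-tile overlapped grid
        for y in range(off, h - tile + 1, tile):
            for x in range(off, w - tile + 1, tile):
                yield y, x, image[y:y + tile, x:x + tile]
```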
- At operation 2010, the computer implemented method receives output from the DLM in terms of modeled tiles with predicted pixel values (e.g., 1 for targeted vegetation, 0 for weeds or other non-targeted vegetation) and reassembles the modeled tiles on a per tile basis to display the targeted type of vegetation in the dimensionality of the original one or more images. A sample predicted output from the DLM is illustrated in diagram 2200 of FIG. 22. The rows 2202, 2204, 2206, 2208 of crops represent predicted pixel values (e.g., 1 for targeted vegetation, 0 for weeds or other non-targeted vegetation) for targeted vegetation. - At
operation 2012, the computer implemented method resolves conflicts (e.g., ties or disagreements) for pixel classification from overlapping tiles with voting. In one example, if an odd number of tiles overlap a region, then a majority vote determines a truth (e.g., 1 for targeted vegetation, 0 for weeds or other non-targeted vegetation) for pixel classification. Alternatively, a logical “OR” operation can be applied to the odd number of overlapping tiles such that if any overlapping tile identifies a targeted type of vegetation then the region is classified with the targeted type of vegetation. - In another example, if an even number of tiles overlap a region, then a logical OR operation can be applied to the even number of overlapping tiles such that if any overlapping tile identifies a targeted type of vegetation then the region is classified with the targeted type of vegetation.
- At
- At operation 2014, the computer implemented method applies a predetermined mask (e.g., binary mask, mask 2100 of FIG. 21) to select portions of the one or more images that correspond with the targeted type of vegetation pixels. Regions of interest (e.g., region 1 is a first row of crop, region 2 is a second row of the crop, etc.) that align with crop rows can be provided via the predetermined mask that prescribes portions of the image that correspond with a row of a targeted vegetation or crop, or the selected portions can be inferred via the presence and orientation of specific types of vegetation that are detected via images. - For the selected portions of the one or more images, the method accumulates the targeted type of vegetation pixels to create one or more rows of crops (e.g., vertical lines) corresponding to vegetation intensity at
operation 2016. In one example, for each region of the image corresponding to a crop row, detected vegetation pixels that represent biomass are accumulated horizontally to create the one or more rows of crops corresponding to vegetation biomass intensity.
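Operations 2014-2016 (mask selection followed by horizontal accumulation) reduce each crop-row region to a vertical intensity profile, for example:

```python
import numpy as np

def row_intensity(pred: np.ndarray, row_mask: np.ndarray) -> np.ndarray:
    """pred: HxW array of 0/1 pixel classes from the DLM; row_mask: HxW binary
    mask for one crop-row region. Accumulates targeted-vegetation pixels
    horizontally into a length-H vertical line of biomass intensity."""
    return (pred * row_mask).sum(axis=1).astype(float)
```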
- At operation 2018, the computer implemented method applies a filter (e.g., a one-dimensional filter) with a length corresponding to the spacing in pixels between individual plants of the targeted type of plants along a row of plant intensity (e.g., a vertical line of plant intensity) to determine a simple moving average or a weighted average of vegetation intensity of the targeted type of plants. In one example, a one-dimensional filter with a length corresponding to the spacing in pixels between individual plants is convolved along the row of plant intensity. The filter can be uniform to represent a simple moving average, or weighted to produce a weighted average of vegetation along the row (or vertical line) of vegetation intensity.
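The simple moving average or weighted average of operation 2018 is a one-dimensional convolution along that profile, as in this sketch (the plant spacing in pixels is an assumed input):

```python
import numpy as np

def averaged_intensity(profile: np.ndarray, spacing_px: int, weights=None):
    """Convolve the row-intensity line with a kernel whose length equals the
    plant spacing in pixels: uniform kernel for a simple moving average, or
    caller-supplied weights for a weighted average."""
    kernel = np.ones(spacing_px) if weights is None else np.asarray(weights, float)
    kernel = kernel / kernel.sum()
    return np.convolve(profile, kernel, mode="same")
```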
- At optional operation 2020, additional weights 2402 (e.g., a weighted value less than 1), 2404 (e.g., a weighted value equal to 1 near the center of the image), and 2406 (e.g., a weighted value greater than 1) as illustrated in the weighted value diagram 2400 of FIG. 24 can be applied to the entire one or more rows of crops in the event that the image plane (e.g., forward looking image plane with upward tilt) of a sensor (e.g., camera) is not coplanar with the ground surface. In one example, the sensor can have a tilt from 0 degrees (coplanar with the ground plane, so no weighting is needed) to 45 degrees. This allows adjustment for pixels at the top of the image representing a different ground surface area (e.g., a larger ground surface area) than pixels at the bottom of the image. Alternatively, a perspective warp can be applied to the image captured by a camera that is not coplanar with a ground surface to match a coplanar perspective with the ground surface and thus compensate for the camera not being coplanar with the ground surface.
- At
- At operation 2022, the adjusted vegetation intensity along the one or more rows of crops (e.g., one or more vertical lines) can be thresholded for a minimum vegetation intensity, revealing portions of a crop row with little or no presence of the desired vegetation. The adjusted vegetation intensity along the one or more rows of crops (one or more vertical lines) can also be thresholded for a maximum vegetation intensity, revealing portions of a crop row with too much of the desired vegetation. - At
operation 2024, the computer implemented method determines a targeted plant uniformity (e.g., emergence score) based on the simple moving average or a weighted average of the targeted type of plants and the thresholding for minimum and maximum vegetation intensity. The portion of the crop row (or vertical line) meeting both upper and lower thresholding criteria can represent an emergence score between 0 and 1 or between 0 and 100%. The emergence score indicates a distribution of the targeted vegetation over the region of interest. -
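The emergence score of operations 2022-2024 then falls out of the two thresholds, for example:

```python
import numpy as np

def emergence_score(avg_intensity: np.ndarray, lo: float, hi: float) -> float:
    """Fraction (0 to 1) of the crop-row intensity line lying between the
    lower and upper thresholds, i.e., with neither too little nor too much
    biomass; multiply by 100 for a percentage."""
    within = (avg_intensity > lo) & (avg_intensity < hi)
    return float(within.mean())
```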
FIG. 21 illustrates an example of a predetermined mask to select portions of an image in accordance with one embodiment. The predetermined mask 2100 shows a number of pixels on the x axis and y axis and includes regions of interest 2102, 2104, 2106, 2108 (e.g., region 2102 is a first row of crop, region 2104 is a second row of the crop, etc.) that align with crop rows to prescribe portions of the image that correspond with a row of a targeted vegetation or crop. Alternatively, the regions can be inferred based on the orientation of vegetation from the model output. -
FIG. 23 illustrates an example of a moving average of vegetation intensity or a weighted average of vegetation intensity along a row in accordance with one embodiment. The moving or weighted average of vegetation intensity 2302 is determined based on operation 2018 of FIG. 20. The diagram 2300 shows the moving or weighted average on a y axis and a pixel position of adjusted vegetation intensity from a bottom to a top of an image on an x axis. In one example, the moving average of vegetation intensity or the weighted average of vegetation intensity 2302 along a row is determined moving from a top to a bottom of an image after convolution with a one-dimensional filter. Upper threshold 2308 and lower threshold 2306 are shown as horizontal lines representing areas of the image with too much biomass if above the threshold 2308 or too little biomass if below the threshold 2306. An emergence score (e.g., 0 to 100%) is determined based on a portion of the biomass that is greater than the lower threshold 2306 and less than the upper threshold 2308. The emergence score can be determined based on a percent of time that the moving average of vegetation intensity is greater than the lower threshold 2306 and less than the upper threshold 2308. - Any of the following examples can be combined into a single embodiment or these examples can be separate embodiments. In one example of a first embodiment, a computer implemented method for customizing field views of data displays comprises obtaining a data layer for an agricultural parameter from sensors of an agricultural implement or machine during an application pass for a field and generating a user interface with an enhanced map that includes the data layer for the agricultural parameter and selectable icons overlaid at different geographic locations on the enhanced map for the field.
- In one example of a second embodiment, a computing device comprises a display device for displaying a user interface having a scale region and a field region for an agricultural parameter; and a processor coupled to the display device. The processor is configured to generate a data layer for the agricultural parameter from sensors of an agricultural implement or machine that collect the data during an application pass for a field and to generate the user interface with an enhanced map that includes the data layer for the agricultural parameter and selectable icons overlaid at different geographic locations on the enhanced map for the field.
- In one example of a third embodiment, a computer implemented method for customizing field views of a field region of data displays comprises obtaining a data layer for an agricultural parameter from sensors of an agricultural implement or machine that collects data during an application pass for a field and generating selectable icons and overlaying the selectable icons at different geographic locations on an enhanced map of the data layer for the field based on a spatial trigger or a threshold trigger for the agricultural parameter.
- The following are nonlimiting examples.
- Example 1—A computer implemented method for customizing field views of a display device comprising: obtaining a data layer for an agricultural parameter from sensors of an agricultural implement during an application pass for a field; and generating a user interface with an enhanced map that includes the data layer for the agricultural parameter and selectable icons overlaid at different geographic locations on the enhanced map for the field.
- Example 2—the computer implemented method of Example 1, wherein the user interface further comprises a split screen view with the enhanced map on a first side of the split screen view and an overview image of the field on a second side of the split screen view.
- Example 3—the computer implemented method of any preceding Example, wherein the agricultural implement comprises a planter, sprayer, or irrigation implement having row units with each row unit having a sensor for capturing images to obtain the data layer.
- Example 4—the computer implemented method of Example 3, further comprising: displaying the user interface with the enhanced map on the display device; receiving a user input to select an icon of the enhanced map; and generating an updated user interface with the enhanced map and an image that is associated with the selected icon changing color on the enhanced map.
- Example 5—the computer implemented method of Example 4, wherein the image is displayed as a pop up window or over an overview image of the field.
- Example 6—the computer implemented method of Example 5, wherein the enhanced map provides an ability to select icons throughout the field to show actual captured images of crops, weeds, and conditions of soil of the field.
- Example 7—the computer implemented method of any of Examples 1 to 6, wherein the selectable icons are generated and overlaid at different geographic locations on the enhanced map for the field based on a spatial trigger within the field, a threshold trigger for when an agricultural parameter exceeds a threshold for the agricultural parameter, a time based trigger for capturing images, or a burst capture of images.
- Example 8—the computer implemented method of any of Examples 1 to 6, wherein the selectable icons are generated and overlaid at different geographic locations on the enhanced map for the field based on a threshold trigger including a weed density exceeding a threshold trigger for weed density or an emergence value exceeding a threshold trigger for emergence data.
- Example 9—the computer implemented method of any preceding Example, wherein the agricultural parameter comprises one or more of seed data, commanded planter seed population, actual seed population determined from a seed sensor, a seed population deviation, singulation data, weed map, emergence data, emergence map, emergence environment score based on a combination of temperature and moisture correlated to how long a seed takes to germinate, emergence environment score based on a percentage of seeds planted that will germinate within a selected number of days, time to germination, time to emergence, and seed germination risk.
- Example 10—A computing device comprising: a display device for displaying a user interface having a scale region and a field region for an agricultural parameter; and a processor coupled to the display device, the processor is configured to generate a data layer for the agricultural parameter from sensors of an agricultural implement that collects the data during an application pass for a field and to generate the user interface with an enhanced map that includes the data layer for the agricultural parameter and selectable icons or symbols overlaid at different geographic locations on the enhanced map for the field.
- Example 11—the computing device of Example 10, wherein the user interface further comprises a split screen view with the enhanced map on a first side of the split screen view and an overview image of the field on a second side of the split screen view.
- Example 12—the computing device of Example 10 or 11, wherein the agricultural implement comprises a planter, sprayer, or irrigation implement having row units with each row unit having a sensor for capturing images to obtain the data layer.
- Example 13—the computing device of Example 12, wherein the display device to display the user interface with the enhanced map and to receive a user input to select an icon of the enhanced map, wherein the processor is configured to generate an updated user interface with the enhanced map and an image that is associated with a selected icon or symbol based on the user input with the selected icon or symbol changing color.
- Example 14—the computing device of Example 13, wherein the updated user interface to provide a selectable orientation option to rotate an orientation of the images of the user interface, a selectable expand option to control sizing of a displayed map in a field region, a selectable icon or symbol option to enable or disable showing icons or symbols on the enhanced map, a selectable full map option to switch between a full screen view of map versus a split screen view having both of a map and an overview image, and a selectable statistics option to show statistics for the data layer.
- Example 15—the computing device of any of Examples 10 to 14, wherein the display device is configured to receive a user input to modify the scale region and to display a modified scale region and a corresponding modified field region.
- Example 16—A computer implemented method for customizing field views of a field region comprising: obtaining a data layer for an agricultural parameter from sensors of an agricultural implement that collects data during an application pass for a field; and generating selectable icons and overlaying the selectable icons at different geographic locations on an enhanced map of the data layer for the field based on a spatial trigger or a threshold trigger for the agricultural parameter.
- Example 17—the computer implemented method of Example 16, further comprising: comparing the agricultural parameter to the threshold trigger; determining whether the agricultural parameter exceeds the threshold trigger for a location within the field; and generating a selectable icon when the agricultural parameter exceeds the threshold trigger for the location within the field (a sketch of this trigger logic follows the Examples).
- Example 18—the computer implemented method of Example 17, wherein the threshold trigger comprises a weed threshold that is compared to a weed density.
- Example 19—the computer implemented method of Example 17, wherein the threshold trigger comprises an emergence threshold that is compared to an emergence value for plant emergence data.
- Example 20—the computer implemented method of any of Examples 16 to 19, further comprising: displaying a user interface with the enhanced map that includes the data layer for the agricultural parameter and the selectable icons overlaid at different geographic locations on the enhanced map for the field.
- Example 21—the computer implemented method of any of Examples 16 to 19, wherein the agricultural implement comprises a planter, sprayer, or irrigation implement having row units with each row unit having a sensor for capturing images to obtain the data layer.
- Example 22—A computer implemented method for measuring and quantifying crop emergence uniformity within an agricultural field comprising: obtaining one or more images of biomass data for a region of interest within the agricultural field from one or more sensors of an agricultural implement, which may traverse the field to obtain the biomass data for various crop stages or for an application pass; partitioning a captured image into tiles; providing the tiles to a deep learning model that produces modeled tiles with predicted pixel values (e.g., 1 for targeted vegetation, 0 for weeds or other non-targeted vegetation); and reassembling the modeled tiles on a per tile basis to display the targeted type of vegetation in the dimensionality of the original one or more images (see the segmentation sketch following these Examples).
- Example 23—the computer implemented method of Example 22, further comprising: applying a predetermined mask to select portions of the one or more images that correspond with the targeted type of vegetation pixels.
- Example 24—the computer implemented method of any of Examples 22-23, further comprising: accumulating the targeted type of vegetation pixels to create one or more rows of crops (e.g., vertical lines) corresponding to vegetation intensity.
- Example 25—the computer implemented method of any of Examples 22-24, further comprising: applying a filter (e.g., a one-dimensional filter) with a length corresponding to the spacing in pixels between individual plants of the targeted type of plants along a row of plant intensity (e.g., a vertical line of plant intensity) to determine a simple moving average or a weighted average of vegetation intensity for the targeted type of plants.
- Example 26—the computer implemented method of any of Examples 22-25, further comprising: applying upper and lower thresholds to the simple moving average or weighted average of vegetation intensity along the one or more rows of crops (one or more vertical lines) and determining a targeted plant uniformity based on the simple moving average or weighted average of vegetation intensity of the targeted type of plants and the thresholding for lower (minimum) and upper (maximum) vegetation intensity. The portion of the crop row (or vertical line) meeting both thresholding criteria can represent an emergence score between 0 and 100% (see the emergence-score sketch following these Examples).
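For illustration only, the following is a minimal Python sketch of how the view options of Example 14 might be held as user-interface state. The class and field names (FieldViewOptions, orientation_degrees, and so on) are hypothetical and are not part of the disclosed system.

```python
from dataclasses import dataclass

@dataclass
class FieldViewOptions:
    """Hypothetical state container for the selectable view options of Example 14."""
    orientation_degrees: int = 0   # selectable orientation option: rotates displayed images
    map_expanded: bool = False     # selectable expand option: sizing of the displayed map
    show_icons: bool = True        # selectable icon/symbol option: show or hide overlays
    full_map: bool = False         # selectable full map option: full screen vs. split screen
    show_statistics: bool = False  # selectable statistics option for the data layer

    def toggle_full_map(self) -> None:
        # Switch between the full screen map view and the split screen view
        # having both a map and an overview image.
        self.full_map = not self.full_map
```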
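The threshold-trigger comparison of Examples 17 to 19, which also underlies the icon generation of Examples 7, 8, and 16, can be sketched as below. The Sample and SelectableIcon structures, the function name, and the 0.3 weed-density threshold are all assumptions made for illustration, not values taken from the disclosure.

```python
from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class Sample:
    lat: float
    lon: float
    value: float  # e.g., a weed density (Example 18) or an emergence value (Example 19)

@dataclass
class SelectableIcon:
    lat: float
    lon: float
    kind: str     # e.g., "weed" or "emergence"

def icons_from_threshold_trigger(samples: Iterable[Sample],
                                 threshold: float,
                                 kind: str) -> List[SelectableIcon]:
    """Generate a selectable icon for each location within the field where
    the agricultural parameter exceeds the threshold trigger (Example 17)."""
    return [SelectableIcon(s.lat, s.lon, kind)
            for s in samples if s.value > threshold]

# Usage in the manner of Example 18, with an illustrative weed threshold of 0.3:
# weed_icons = icons_from_threshold_trigger(weed_samples, 0.3, "weed")
```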
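The tiling, per-tile inference, and reassembly of Example 22, together with the masking of Example 23, might look like the following NumPy sketch. The 256-pixel tile size, the model.predict interface, and the region mask are assumptions; any per-tile segmentation model that returns a 0/1 array of the tile's shape would fit.

```python
import numpy as np

TILE = 256  # assumed tile size in pixels

def segment_targeted_vegetation(image: np.ndarray, model) -> np.ndarray:
    """Partition a captured image into tiles, predict per-pixel values
    (1 = targeted vegetation, 0 = weeds or other non-targeted vegetation),
    and reassemble the modeled tiles in the original image dimensionality."""
    h, w = image.shape[:2]
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(0, h, TILE):
        for x in range(0, w, TILE):
            tile = image[y:y + TILE, x:x + TILE]               # edge tiles may be smaller
            out[y:y + TILE, x:x + TILE] = model.predict(tile)  # returns array of tile's shape
    return out

def mask_targeted_pixels(prediction: np.ndarray, region_mask: np.ndarray) -> np.ndarray:
    """Example 23: apply a predetermined mask so only the predicted
    targeted-vegetation pixels inside the region of interest remain."""
    return prediction & region_mask  # both are 0/1 uint8 arrays
```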
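Examples 24 to 26 then reduce the segmented mask to an emergence score: accumulate vegetation pixels along a crop row, smooth with a filter whose length matches the expected in-row plant spacing, and measure the portion of the row whose smoothed intensity lies between the lower and upper thresholds. A minimal sketch, assuming crop rows run vertically in the image and that the thresholds and row band are supplied by the caller:

```python
import numpy as np

def emergence_score(veg_mask: np.ndarray, x0: int, x1: int,
                    plant_spacing_px: int, lo: float, hi: float) -> float:
    """Score emergence uniformity for one crop row of a 0/1 vegetation mask.

    veg_mask: 2-D array, 1 = targeted vegetation pixel (output of Example 22).
    x0, x1:   column band containing one (vertical) crop row.
    plant_spacing_px: expected spacing in pixels between individual plants.
    lo, hi:   lower (minimum) and upper (maximum) vegetation-intensity thresholds.
    """
    # Example 24: accumulate targeted-vegetation pixels across the band so each
    # image row gets one value, forming the "vertical line" of plant intensity.
    intensity = veg_mask[:, x0:x1].sum(axis=1).astype(float)

    # Example 25: simple moving average whose window equals the plant spacing
    # (a non-uniform kernel would give a weighted average instead).
    kernel = np.ones(plant_spacing_px) / plant_spacing_px
    smoothed = np.convolve(intensity, kernel, mode="same")

    # Example 26: the portion of the row meeting both thresholding criteria,
    # expressed as a score between 0 and 100%.
    within = (smoothed >= lo) & (smoothed <= hi)
    return 100.0 * within.mean()
```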
- It is to be understood that the above description is intended to be illustrative, and not restrictive. Many other embodiments will be apparent to those of skill in the art upon reading and understanding the above description. The scope of the disclosure should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.
Claims (21)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US18/555,897 US20240295954A1 (en) | 2021-06-07 | 2022-05-25 | Systems and Methods for Providing Field Views Including Enhanced Agricultural Maps Having a Data Layer and Image Data |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202163197634P | 2021-06-07 | 2021-06-07 | |
| US202263269693P | 2022-03-21 | 2022-03-21 | |
| US18/555,897 US20240295954A1 (en) | 2021-06-07 | 2022-05-25 | Systems and Methods for Providing Field Views Including Enhanced Agricultural Maps Having a Data Layer and Image Data |
| PCT/IB2022/054916 WO2022259072A1 (en) | 2021-06-07 | 2022-05-25 | Systems and methods for providing field views including enhanced agricultural maps having a data layer and image data |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20240295954A1 (en) | 2024-09-05 |
Family
ID=82020111
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/555,897 Pending US20240295954A1 (en) | 2021-06-07 | 2022-05-25 | Systems and Methods for Providing Field Views Including Enhanced Agricultural Maps Having a Data Layer and Image Data |
Country Status (6)
| Country | Link |
|---|---|
| US (1) | US20240295954A1 (en) |
| EP (1) | EP4351305A1 (en) |
| AU (1) | AU2022288683A1 (en) |
| BR (1) | BR112023021739A2 (en) |
| CA (1) | CA3213508A1 (en) |
| WO (1) | WO2022259072A1 (en) |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN117541654B (en) * | 2024-01-09 | 2024-05-24 | 广东泰一高新技术发展有限公司 | Detail enhancement method for high-resolution remote sensing image |
| WO2025215426A1 (en) * | 2024-04-11 | 2025-10-16 | Precision Planting Llc | System and method for selecting and displaying on a display device of a first machine data from a second machine |
Family Cites Families (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| ES2627181T5 (en) | 2007-01-08 | 2021-01-11 | Climate Corp | Planter monitoring system and method |
| US9351440B2 (en) | 2011-03-22 | 2016-05-31 | Precision Planting Llc | Seed meter disc having agitation cavities |
| CA3161194C (en) | 2011-09-27 | 2024-10-22 | Prec Planting Llc | APPARATUS, SYSTEMS AND METHODS FOR DISTRIBUTION OF SEEDS |
| EP3259972B1 (en) | 2012-07-25 | 2021-12-08 | Precision Planting LLC | System and method for multi-row agricultural implement control and monitoring |
| US9699958B2 (en) | 2012-08-10 | 2017-07-11 | The Climate Corporation | Systems and methods for control, monitoring and mapping of agricultural applications |
| US10721859B2 (en) * | 2017-01-08 | 2020-07-28 | Dolly Y. Wu PLLC | Monitoring and control implement for crop improvement |
| US10860189B2 (en) | 2018-01-11 | 2020-12-08 | Precision Planting Llc | Systems and methods for customizing scale and corresponding views of data displays |
| US12279545B2 (en) * | 2019-07-26 | 2025-04-22 | Kansas State University Research Foundation | Automatic system for measuring spacing, depth, and geolocation of seeds |
- 2022-05-25 AU AU2022288683A patent/AU2022288683A1/en active Pending
- 2022-05-25 WO PCT/IB2022/054916 patent/WO2022259072A1/en not_active Ceased
- 2022-05-25 BR BR112023021739A patent/BR112023021739A2/en unknown
- 2022-05-25 EP EP22729815.5A patent/EP4351305A1/en active Pending
- 2022-05-25 US US18/555,897 patent/US20240295954A1/en active Pending
- 2022-05-25 CA CA3213508A patent/CA3213508A1/en active Pending
Patent Citations (24)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20120139845A1 (en) * | 2010-12-03 | 2012-06-07 | Research In Motion Limited | Soft key with main function and logically related sub-functions for touch screen device |
| US20150206255A1 (en) * | 2011-05-13 | 2015-07-23 | HydroBio, Inc | Method and system to prescribe variable seeding density across a cultivated field using remotely sensed data |
| US20170374323A1 (en) * | 2015-01-11 | 2017-12-28 | A.A.A Taranis Visual Ltd | Systems and methods for agricultural monitoring |
| US20190147249A1 (en) * | 2016-05-12 | 2019-05-16 | Bayer Cropscience Aktiengesellschaft | Recognition of weed in a natural environment |
| US20180368331A1 (en) * | 2017-06-22 | 2018-12-27 | Kubota Corporation | Grass management system and grass management method |
| US20200225206A1 (en) * | 2017-10-02 | 2020-07-16 | Precision Planting, Llc | Systems and apparatuses for soil and seed monitoring |
| WO2019099748A1 (en) * | 2017-11-15 | 2019-05-23 | Precision Planting Llc | Seed trench closing sensors |
| US20240276904A1 (en) * | 2017-11-15 | 2024-08-22 | Precision Planting Llc | Seed trench closing sensors |
| US20200359559A1 (en) * | 2017-11-15 | 2020-11-19 | Precision Planting Llc | Seed trench closing sensors |
| US11678600B2 (en) * | 2017-11-15 | 2023-06-20 | Precision Planting Llc | Seed trench closing sensors |
| US11564345B1 (en) * | 2018-06-24 | 2023-01-31 | Climate Llc | Computer-implemented recommendation of side-by-side planting in agricultural fields |
| US20200019777A1 (en) * | 2018-07-10 | 2020-01-16 | Adroit Robotics | Systems, devices, and methods for in-field diagnosis of growth stage and crop yield estimation in a plant area |
| WO2020031473A1 (en) * | 2018-08-06 | 2020-02-13 | 株式会社クボタ | External shape calculation system, external shape calculation method, external shape calculation program, storage medium having external shape calculation program stored therein, farm field map generation system, farm field map generation program, storage medium having farm field map generation program stored therein, and farm field map generation method |
| US20220197256A1 (en) * | 2019-05-31 | 2022-06-23 | Precision Planting Llc | Methods and systems for using duty cycle of sensors to determine seed or particle flow rate |
| US12298741B2 (en) * | 2019-05-31 | 2025-05-13 | Precision Planting Llc | Methods and systems for using duty cycle of sensors to determine seed or particle flow rate |
| JP2022542764A (en) * | 2019-07-15 | 2022-10-07 | ビーエーエスエフ アグロ トレードマークス ゲーエムベーハー | Method for generating application maps for treating farms with agricultural equipment |
| US20220346304A1 (en) * | 2019-08-05 | 2022-11-03 | Precision Planting Llc | Speed control of implements during transitions of settings of agricultural parameters |
| US20210256632A1 (en) * | 2019-09-30 | 2021-08-19 | Chioccoli, LLC | Systems and methods for aggregating harvest yield data |
| US20220113730A1 (en) * | 2020-10-09 | 2022-04-14 | Deere & Company | Machine control using a predictive speed map |
| US20220110256A1 (en) * | 2020-10-09 | 2022-04-14 | Deere & Company | Machine control using a predictive map |
| US20220110254A1 (en) * | 2020-10-09 | 2022-04-14 | Deere & Company | Map generation and control system |
| US11844311B2 (en) * | 2020-10-09 | 2023-12-19 | Deere & Company | Machine control using a predictive map |
| US11845449B2 (en) * | 2020-10-09 | 2023-12-19 | Deere & Company | Map generation and control system |
| JP7471211B2 (en) * | 2020-12-10 | 2024-04-19 | 株式会社クボタ | Farm field map generation system |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20240184416A1 (en) * | 2021-03-30 | 2024-06-06 | Schlumberger Technology Corporation | Integrated energy data science platform |
| US20230046882A1 (en) * | 2021-08-11 | 2023-02-16 | Deere & Company | Obtaining and augmenting agricultural data and generating an augmented display |
| US12439840B2 (en) | 2021-08-11 | 2025-10-14 | Deere & Company | Obtaining and augmenting agricultural data and generating an augmented display |
Also Published As
| Publication number | Publication date |
|---|---|
| AU2022288683A1 (en) | 2023-10-05 |
| EP4351305A1 (en) | 2024-04-17 |
| CA3213508A1 (en) | 2022-12-15 |
| WO2022259072A1 (en) | 2022-12-15 |
| BR112023021739A2 (en) | 2023-12-26 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| EP3510851B1 (en) | | Systems and methods for customizing scale and corresponding views of data displays |
| US20240295954A1 (en) | | Systems and Methods for Providing Field Views Including Enhanced Agricultural Maps Having a Data Layer and Image Data |
| US12150406B2 (en) | | Methods and imaging systems for harvesting |
| CN109714947B (en) | | System, tool and method for passive seed orientation in agricultural fields |
| EP3900511B1 (en) | | Agricultural machine section control |
| US10383275B2 (en) | | Systems and method for monitoring, controlling, and displaying field operations |
| US20240057508A1 (en) | | Data transfer |
| US20240224839A9 (en) | | Systems and Methods for Determining State Data for Agricultural Parameters and Providing Spatial State Maps |
| US20250285434A1 (en) | | Systems and Methods for Vision-Based Plant Detection and Scouting Application Technology |
| CN117295393A (en) | | System and method for providing a field view including enhanced agricultural map and image data having a data layer |
| WO2024150057A1 (en) | | Method and system to provide a viewing and replay functionality for agricultural data layers |
| EP4649389A1 (en) | | Methods and systems for adjusting a range feature of an editor tool to automatically adjust a range of data values in a range region and automatically adjust a corresponding field view of a data display |
| WO2025149802A1 (en) | | Method and system to determine as-applied data for an applied product for one or more application passes in different regions of a field |
| WO2024150056A1 (en) | | Method and system to provide a region explorer function for selecting regions of interest of agricultural data layers and to provide data metrics for the regions of interest |
| WO2025215426A1 (en) | | System and method for selecting and displaying on a display device of a first machine data from a second machine |
| WO2024134326A1 (en) | | Methods for imaging a field |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | AS | Assignment | Owner name: PRECISION PLANTING LLC, ILLINOIS; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; Assignors: STOLLER, JASON J.; KNUFFMAN, RYAN; Reel/Frame: 065262/0611; Effective date: 20220325 |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |