Launching Pocket Agronomist

We're proud to announce that Pocket Agronomist is now available for download from the App Store (U.S.-only right now). It is an application that combines convolutional neural networks for realtime crop disease diagnosis with augmented reality tools for measuring crop statistics. It was developed by Agricultural Intelligence, a joint venture between Perceptual Labs and Ag AI.

Pocket Agronomist is currently free to download and use. You can read more about it on our product page.

We unveiled Pocket Agronomist a little under a year ago in this post, and we have been refining the core technology and expanding its capabilities since then. Read on for more about what we've added:


ARKit for stand count measurement

[Image: stand count measurement on an iPhone]

The largest addition is an augmented-reality-based tool for performing stand counts in the field. A stand count measures how many plants are present within a set distance along a row. If you also know the spacing between rows, you can calculate the crop density of a field.
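
The arithmetic behind that density calculation is simple; here is a minimal sketch in Python (the function name and sample numbers are our own illustration, not the app's actual code):

```python
SQ_FT_PER_ACRE = 43_560

def plants_per_acre(plant_count: int, row_length_ft: float, row_spacing_in: float) -> float:
    """Estimate crop density from a stand count.

    plant_count: plants counted along one row segment
    row_length_ft: length of that segment, in feet
    row_spacing_in: distance between adjacent rows, in inches
    """
    row_spacing_ft = row_spacing_in / 12.0
    # Each foot of row "represents" a strip of field one row-spacing wide.
    area_sq_ft = row_length_ft * row_spacing_ft
    return plant_count / area_sq_ft * SQ_FT_PER_ACRE

# e.g. 31 plants along 17.4 ft of a 30-inch row is roughly 31,000 plants/acre
print(round(plants_per_acre(31, 17.4, 30.0)))
```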

Beyond simple density measurements, if you can measure the spacing between individual plants in a row, you can learn even more about a field. Uneven plant spacing (indicated by a high standard deviation in the individual distance measurements) can signal a significantly lower yield, whether caused by problems during planting or by crop damage from frost or hail. Catching issues early in the season might allow a farmer to quickly re-plant, and an accurate assessment after storm damage can make sure insurance policies pay the correct amount for lost yield.
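
To see why the standard deviation matters and not just the average, consider two rows with identical mean spacing but very different uniformity. A quick sketch using Python's standard library (the numbers are illustrative):

```python
import statistics

def spacing_stats(spacings_in: list) -> dict:
    """Summarize plant-to-plant spacings (in inches) within a row."""
    return {
        "mean": statistics.mean(spacings_in),
        # A high standard deviation flags uneven emergence, which can
        # indicate planter problems or frost/hail damage.
        "stdev": statistics.stdev(spacings_in),
    }

even = spacing_stats([6.0, 6.2, 5.8, 6.1, 5.9])    # uniform stand
uneven = spacing_stats([2.0, 11.5, 4.0, 9.5, 3.0])  # gappy, doubled-up stand
# Both rows average ~6 inches per plant, but the second row's much larger
# standard deviation reveals the spacing problem the mean hides.
```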

At present, stand counts are gathered by hand using a labor-intensive process: going to a field, laying out a tape measure, reading off the inch marker for each plant, writing down these values, and later transcribing them into a spreadsheet that calculates the overall statistics. To save time, many people simplify this process and only count the number of plants within a set distance, losing the individual spacing statistics.

We found that this process could be simplified and automated using the latest augmented reality technologies on mobile devices. In a series of experiments with Apple's ARKit, we found we could identify the ground in a field reliably enough to perform stand count measurements whose average plant distances match hand measurements to within a third of an inch.

To do this, we use ARKit to identify the position of the ground relative to the device's camera in three-dimensional space. The user then aims a focus square along that virtual ground plane until it surrounds a plant. Tapping on the screen causes a line to be projected from the center of the screen to the virtual ground plane, and the base of the plant is labeled at that intersection point in 3-D. That point is tracked as the device and its camera move, and ideally it remains aligned with the base of the plant.
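
The geometric core of that tap interaction is a ray-plane intersection: cast a ray from the camera through the tapped screen point and find where it meets the detected ground plane. ARKit performs this via its hit-testing APIs; the underlying math can be sketched in a few lines of plain Python (names and numbers here are our own illustration):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def ray_plane_intersection(ray_origin, ray_dir, plane_point, plane_normal):
    """Return the 3-D point where a ray meets a plane, or None if it misses.

    ray_origin: camera position in world space
    ray_dir: direction from the camera through the tapped screen point
    plane_point, plane_normal: the detected ground plane
    """
    denom = dot(plane_normal, ray_dir)
    if abs(denom) < 1e-6:   # ray is parallel to the ground plane
        return None
    diff = tuple(p - o for p, o in zip(plane_point, ray_origin))
    t = dot(plane_normal, diff) / denom
    if t < 0:               # intersection is behind the camera
        return None
    return tuple(o + t * d for o, d in zip(ray_origin, ray_dir))

# Camera held 1.5 m above flat ground, looking forward and down at 45 degrees:
camera = (0.0, 1.5, 0.0)
direction = (0.0, -1.0 / math.sqrt(2.0), -1.0 / math.sqrt(2.0))
hit = ray_plane_intersection(camera, direction, (0.0, 0.0, 0.0), (0.0, 1.0, 0.0))
# The labeled point lands on the ground, 1.5 m in front of the camera.
```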

While that all sounds complicated, it's handled automatically: all the user has to do is point the phone's camera at the base of each plant in a row and tap. Mistaken measurements can be undone easily, and all measurements are displayed as a 3-D overlay on the real world, so the user can see what has been measured down a row.

When a sufficient number of measurements have been taken, pressing a button brings up all the statistics for the stand count, calculated automatically. No data entry is required, and all of these values can be emailed to anyone as a comma-separated value (CSV) file for later analysis.
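
Producing that shareable file from the captured spacings is a small amount of code; a minimal sketch using Python's standard library (the column names and layout are our own illustration, not the app's actual export format):

```python
import csv
import io
import statistics

def stand_count_csv(spacings_in):
    """Render per-plant spacings (inches) plus summary statistics as CSV text."""
    buf = io.StringIO()
    writer = csv.writer(buf, lineterminator="\n")
    writer.writerow(["plant", "spacing_in"])
    for i, spacing in enumerate(spacings_in, start=1):
        writer.writerow([i, spacing])
    # Append the automatically calculated summary rows.
    writer.writerow(["mean", round(statistics.mean(spacings_in), 2)])
    writer.writerow(["stdev", round(statistics.stdev(spacings_in), 2)])
    return buf.getvalue()

print(stand_count_csv([6.0, 6.2, 5.8, 6.1]))
```

The resulting text can be attached to an email and opened directly in any spreadsheet application.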

We feel this could be a significant time saver for farmers, insurance agents, and others, and we wanted to get it into their hands as soon as we could. That's why we're launching Pocket Agronomist now: so that people can put it to use throughout the remainder of the U.S. growing season.

Enhancing disease detection and expanding crop selection

Pocket Agronomist still features realtime disease detection using trained convolutional neural networks, and we have continued to improve this functionality since we unveiled it last year. However, we are currently labeling this capability as "experimental" while we assess its performance in the field and work to gather training data for some less common corn diseases.

Data collection is another reason we are launching the product now. By default, when Pocket Agronomist is used to diagnose a disease case in the wild, an image of that disease is uploaded to our repository to be used in training our neural networks. You can opt out of this at any point via a simple switch. Images are anonymized, but are stamped with location data so that we can see what region they were taken from (which can aid in separating similar strains of crop diseases). This location data can also be selectively disabled.

A number of people helped us collect data throughout the last growing season, and more have offered to do so now that the product is formally released. We hope to build on those efforts throughout this growing season to increase our disease recognition accuracy and fully validate the use of this application for diagnosis in the field.

To further help our data collection efforts, we've added a tab within the application where you can take and upload images of diseases appearing in crops beyond the currently supported corn diseases. We plan to add a series of additional crops over time as we build out our training datasets.

We're all extremely excited to see how people use this in the field, and to grow its capabilities throughout this season. Feel free to download it from the U.S. App Store today and give it a try.