
Analyzing Crop Yields by Drone

Posted 10 years ago

Over the summer, I had the opportunity to work with a Phantom 2 Vision+ drone from DJI. After a few exciting trial flights, I decided to investigate a real-world application: fly over one of the abundant Illinois farm fields, record bird's-eye images of the crops below, and then use this data to create a simple function to estimate the crop yield for that area.


Recently, I drove out to a farm with a fellow Wolfram employee who also happens to be an expert drone operator. We flew the drone over a stretch of soy fields and took pictures and shot video along the way. Here is a small sample video:

Drone flying over soy field (full video): https://vimeo.com/136947441


As you can see in the video, most areas of this field are in really good shape (at least when judged from this altitude). But there are some areas near a drainage ditch where the soy is not growing very well. This can happen when the soil stays too moist for too long, causing the root system to lack oxygen. As a result, the soy plants near the drainage area are much smaller or missing entirely.

Let's look at one particular example of this situation. We import the data from the SD card that was used as storage on the drone:

crop = Import["F:\\DCIM\\101MEDIA\\DJI01951.JPG"]

[image: the imported drone photo]

The first thing to notice is the large amount of barrel distortion in this image (note how the road looks highly curved). This is due to the drone camera having a very wide-angle lens. There is no built-in function in the Wolfram Language to simply correct for barrel distortion (yet), so I am going to focus on a center part of the image where the distortion is not too bad. The ideal situation here would be to fly very high with a lens that has a narrow field of view.
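That said, here is a minimal sketch of what a radial undistortion could look like using ImageTransformation; the coefficient k below is a made-up value that would need proper calibration for this particular camera:

k = 0.15; (* hypothetical distortion coefficient; needs calibration *)
ImageTransformation[crop, Function[p, p/(1 + k p.p)],
 DataRange -> {{-1, 1}, {-1, 1}}, Padding -> "Fixed"]

With DataRange set this way, positions run from -1 to 1 with the origin at the image center, so each output pixel samples the distorted image a bit closer to the center, stretching the compressed periphery back out.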

So first, to get a smaller center part of the image, let's take a look at a subimage about 1/3 × 1/3 the size of the original, located a little to the right of center:

{columns, rows} = ImageDimensions[crop];
{rowLow, rowHigh} = {Round[rows/3], Round[2 rows/3]};
{columnLow, columnHigh} = {Round[columns/2], Round[5 columns/6]};
HighlightImage[crop, Rectangle[{columnLow, rowLow}, {columnHigh, rowHigh}], Method -> "Brightness"]

[image: the photo with the selected region highlighted]

Using this region we can now get a sample of the crop that contains both very healthy plants (on the left) and very small or missing plants (on the right):

cropSample = ImageTake[crop, {rowLow, rowHigh}, {columnLow, columnHigh}]

[image: the extracted crop sample]

So now the question becomes: how do I measure what fraction of the harvest is lost in this sample? A simple approach is to turn the image into a black-and-white image, where white indicates a bright area (like the green plants) and black indicates the soil (dark in the original image):

binaryCrop1 = Binarize[cropSample]

[image: the binarized crop sample]

That sort of works, but the original sample still has bright areas from visible rocks and dead leaves, so let's see if we can refine this approach a little. Instead of using a plain Binarize, we can look at the dominant colors in this image, which we expect to be greens and blacks:

dominantColors = DominantColors[cropSample, 10]

[output: the ten dominant color swatches]

The result here is more or less as we would expect, with some additional grays (perhaps from dead leaves and rocks).

Now let's take a look at the first dominant color, which is a good match for a healthy soy plant in this image. Using this color, we can calculate the color distance for each pixel, giving us an image where dark values indicate close matches and bright values indicate poor matches:

ColorDistance[cropSample, First[dominantColors]]

[image: the color-distance map]

Again we can binarize this image. In this case we also need to negate the black/white colors so that the colors in the binarized image have the same meaning as before (white = soy, black = not soy):

binaryCrop2 = ColorDistance[cropSample, First[dominantColors]] // Binarize // ColorNegate

[image: the binarized and negated color-distance map]

This result actually looks quite nice: areas with no soy plants now have black pixel values (0) and areas with soy plants have white pixel values (1). So let's count the white pixels:

soyPixels = ImageMeasurements[binaryCrop2, "Total"]
(* Gives: 954,214.0 *)

And let's also count the total number of pixels in the image:

totalPixels = Times @@ ImageMeasurements[binaryCrop2, "Dimensions"]
(* Gives: 1,603,814 *)

And now we have our crude estimate for the fraction of soy plants in this image, about 60%:

Quantity[100 soyPixels/totalPixels, "Percent"]
(* Gives: 59.4966% *)
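
As a quick sanity check, since the binarized image only has pixel values 0 and 1, its mean pixel value gives the same fraction directly:

ImageMeasurements[binaryCrop2, "Mean"]
(* Gives: 0.594966 *)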

Of course this is a very crude estimate, but perhaps not too far off from the actual yield when this particular spot is harvested. I actually expect it to be a slight overestimate, since smaller plants will likely produce fewer soybeans (or none at all) compared to a fully grown plant. But I am not a crop analyst, so we'll have to go with 60% for now.
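One possible refinement, and this is my own speculation rather than something validated against harvest data: instead of a hard threshold, use the continuous color-distance values as a crude "vigor" weight, so that borderline pixels (small or stressed plants) count for less than strong matches:

weightedFraction = ImageMeasurements[
  ColorNegate[ColorDistance[cropSample, First[dominantColors]]], "Mean"]

Since ColorNegate maps close matches toward 1 and poor matches toward 0, the mean of this image is a softer coverage estimate than the hard 0/1 count.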

Finally, an analysis like this can easily be done on the fly (or "on the drone" in this case): an embedded system like a Raspberry Pi or similar device can process the data as it comes in, and even direct the drone to the areas of highest interest (those with the most stressed plants).
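As a minimal sketch of what such a per-frame computation could look like, here are the steps above packaged into a single function; it keeps the assumption that the first dominant color in a frame is healthy soy, which may not hold for every field or altitude:

soyFraction[img_Image] := Module[{bin},
  bin = ColorNegate[Binarize[ColorDistance[img, First[DominantColors[img, 10]]]]];
  ImageMeasurements[bin, "Total"]/(Times @@ ImageMeasurements[bin, "Dimensions"])]

soyFraction[cropSample]
(* Gives: 0.594966, the same fraction as above *)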

POSTED BY: Arnoud Buzing
5 Replies

Another interesting article on the subject:

Artificial intelligence + nanosatellites + corn: This startup uses machine learning and satellite imagery to predict crop yields


POSTED BY: Vitaliy Kaurov

This looks like a very promising direction in agriculture. I just saw this article:

Six Ways Drones Are Revolutionizing Agriculture

and it mentions the following summary:

  1. Soil and field analysis: Drones can be instrumental at the start of the crop cycle. They produce precise 3-D maps for early soil analysis, useful in planning seed planting patterns. After planting, drone-driven soil analysis provides data for irrigation and nitrogen-level management.

  2. Planting: Startups have created drone-planting systems that achieve an uptake rate of 75 percent and decrease planting costs by 85 percent. These systems shoot pods with seeds and plant nutrients into the soil, providing the plant all the nutrients necessary to sustain life.

  3. Crop spraying: Distance-measuring equipment—ultrasonic echoing and lasers such as those used in the light-detection and ranging, or LiDAR, method—enables a drone to adjust altitude as the topography and geography vary, and thus avoid collisions. Consequently, drones can scan the ground and spray the correct amount of liquid, modulating distance from the ground and spraying in real time for even coverage. The result: increased efficiency with a reduction in the amount of chemicals penetrating into groundwater. In fact, experts estimate that aerial spraying can be completed up to five times faster with drones than with traditional machinery.

  4. Crop monitoring: Vast fields and low efficiency in crop monitoring together create farming’s largest obstacle. Monitoring challenges are exacerbated by increasingly unpredictable weather conditions, which drive risk and field maintenance costs. Previously, satellite imagery offered the most advanced form of monitoring. But there were drawbacks. Images had to be ordered in advance, could be taken only once a day, and were imprecise. Further, services were extremely costly and the images’ quality typically suffered on certain days. Today, time-series animations can show the precise development of a crop and reveal production inefficiencies, enabling better crop management.

  5. Irrigation: Drones with hyperspectral, multispectral, or thermal sensors can identify which parts of a field are dry or need improvements. Additionally, once the crop is growing, drones allow the calculation of the vegetation index (see the sketch after this list), which describes the relative density and health of the crop, and show the heat signature, the amount of energy or heat the crop emits.

  6. Health assessment: It’s essential to assess crop health and spot bacterial or fungal infections on trees. By scanning a crop using both visible and near-infrared light, drone-carried devices can identify which plants reflect different amounts of green light and NIR light. This information can produce multispectral images that track changes in plants and indicate their health. A speedy response can save an entire orchard. In addition, as soon as a sickness is discovered, farmers can apply and monitor remedies more precisely. These two possibilities increase a plant’s ability to overcome disease. And in the case of crop failure, the farmer will be able to document losses more efficiently for insurance claims.
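
To make item 5 concrete: a common vegetation index is NDVI, computed per pixel as (NIR - Red)/(NIR + Red). Here is a hypothetical Wolfram Language sketch, assuming a four-channel multispectral image with red in channel 1 and near-infrared in channel 4 (the actual band order varies by sensor):

ndvi[img_Image] := Module[{data, red, nir},
  data = ImageData[img];
  red = data[[All, All, 1]];  (* assumed red band *)
  nir = data[[All, All, 4]];  (* assumed near-infrared band *)
  Image[Rescale[(nir - red)/(nir + red + $MachineEpsilon), {-1, 1}]]]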

See more related content in the PwC global report on the commercial applications of drone technology.

POSTED BY: Vitaliy Kaurov
Posted 10 years ago

Fantastic stuff, Arnoud!

I played around with a similar concept using Landsat data (http://landsat.usgs.gov/) and the EarthExplorer app.

Mathematica is such a wonderful tool!

More details on the use of Landsat images:

http://www.nass.usda.gov/EducationandOutreach/Reports,PresentationsandConferences/Presentations/BoryanSeffrinJohnsonFASSeminar06.pdf

POSTED BY: Diego Zviovich

Is there any way to estimate the total bushels from the area, rather than a percentage?

POSTED BY: Roman Kopytko

Yes, but you need the geo position and elevation of the drone/camera, as well as the camera's field of view, to compute the area that's visible in the image.

And an estimate of the "normal" yield in bushels per acre.
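
For a back-of-the-envelope version, here is a sketch where every number is an assumption for illustration (flight altitude, field of view, aspect ratio, and typical per-acre yield), not a measurement from this flight:

altitude = Quantity[50., "Meters"];            (* assumed flight altitude *)
groundWidth = 2 altitude Tan[85. Degree/2];    (* assumed 85-degree horizontal field of view *)
groundHeight = groundWidth 3/4;                (* assumed 4:3 aspect ratio *)
visibleArea = UnitConvert[groundWidth groundHeight, "Acres"];
normalYield = 55;                              (* assumed typical soy yield in bushels per acre *)
0.6 normalYield QuantityMagnitude[visibleArea] (* scaled by the 60% coverage estimate *)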

POSTED BY: Arnoud Buzing