Futuristic Farming

With eyes in the sky and boots on the ground, farmers have more data than ever with which to make decisions.

The low hum of a drone buzzing in the air. Fixed-wing aircraft gliding over the clouds. Satellites orbiting Earth. The future, as laid out in a sci-fi novel, is now. And agriculture, with its ever-increasing need to manage crops as efficiently as possible, embraces this powerful technology for what it is—another tool in the toolbox.

While those unmanned aerial vehicles (UAVs), satellites and airplanes have the power to get up high, it’s the cutting-edge images they capture that really assist on-the-ground, people-powered decisions, offering increasingly precise data throughout the crop cycle. From plant population, weeds and soil compaction to nitrogen deficiencies, over-irrigation and pests, the powerful snapshots not only help pinpoint where those issues lie in a field, but why, and what solutions are possible.

What the Images Can Tell Us 

Each plant in a field is, in essence, its own sensor. Its unique surface and texture provide a sort of data signature in the form of reflected light. Multispectral imagery captures those reflections and, in turn, can provide insight into the health of the plant.

So, when you view a normalized difference vegetation index (NDVI) or a normalized difference red edge index (NDRE), it’s simply a calculation of visible and near-infrared light reflected by that vegetation. Or, says Darren Goebel, agronomist and AGCO’s director of global commercial crop care, “it’s the intensity of photosynthesis occurring in the plants.”
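For readers curious about the arithmetic, the standard formulas are NDVI = (NIR - red) / (NIR + red) and NDRE = (NIR - red edge) / (NIR + red edge). A minimal sketch in Python, using NumPy and made-up reflectance values purely for illustration:

    import numpy as np

    def ndvi(nir, red):
        # Normalized difference vegetation index: (NIR - red) / (NIR + red)
        return (nir - red) / (nir + red)

    def ndre(nir, red_edge):
        # Normalized difference red edge index: (NIR - red edge) / (NIR + red edge)
        return (nir - red_edge) / (nir + red_edge)

    # Illustrative per-pixel reflectance values between 0 and 1 (not real sensor
    # data): a vigorous plant, a stressed plant and bare soil.
    nir = np.array([0.55, 0.40, 0.25])
    red = np.array([0.05, 0.15, 0.22])
    print(ndvi(nir, red))  # roughly [0.83, 0.45, 0.06]; higher means more photosynthesis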

More specifically, the red band used in NDVI is most sensitive at low chlorophyll levels, while the red edge band responds across a wider range of chlorophyll concentrations. Beyond that, the two types of images do essentially the same thing: reflect photosynthetic activity.

In practice, then, red areas in NDVI images could indicate a dead crop, bare soils or even a building. Conversely, adds Goebel, “green areas in an image may indicate a very healthy growing crop, and the darker the green, the healthier the crop.”
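In code, that color interpretation amounts to a set of thresholds applied to each pixel's index value. A hypothetical binning; the cutoffs below are illustrative only, since real breakpoints vary by crop, growth stage and sensor:

    def classify_ndvi(value):
        # Cutoffs are placeholders for illustration, not agronomic standards.
        if value < 0.2:
            return "red: dead crop, bare soil or even a building"
        elif value < 0.5:
            return "yellow-green: thin or stressed vegetation"
        else:
            return "dark green: very healthy growing crop"

    print(classify_ndvi(0.83))  # dark green: very healthy growing crop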

Improving Imagery Through Research

Think of imagery research as a bit of an agricultural arms race. “Two years ago, the whole industry was concerned about the UAVs and what they could and couldn’t do,” from limited battery life to FAA regulations, says Terry Griffin, assistant professor, department of ag economics at Kansas State University and faculty adviser to the Kansas Ag Research and Technology Association (KARTA), a group of producers who share their technology and on-farm experimentation. “Everyone was ignoring what UAVs actually did—take images—and they were more concerned about the hardware.”

But, he adds, that interest in the hardware has actually driven the research toward better imagery. These days you’d be hard-pressed to find a university or equipment company not involved with some type of imagery or UAV research.

Many producers and agronomists are increasingly using UAVs, as well as fixed-wing aircraft and satellites, in farm operations to produce images such as NDVI and NDRE maps.

“Traditional NDVI and multispectral images have been around for many years,” says AGCO’s Goebel. “But there’s a lot of work going into the continued utilization of very specific portions of the visible light spectrum,” he adds, citing the accepted use of red edge (NDRE) in more accurately pinpointing in-field nitrogen deficiencies.

“Scientists are still working to determine which wavelengths can tell specific weeds apart,” says Dennis Bowman, a crop systems educator at the University of Illinois Extension. “You can definitely see a difference in the reflectance of waterhemp versus soybeans, but at a practical, commercial level, we still can’t tell waterhemp from velvetleaf.”

At Mississippi State University, researchers recently published a study that deployed UAVs over corn plots to assess different drydown rates once corn hits the black layer and starts senescing. Jason Ward, who joined AGCO this past fall as the Global ATS precision farming training manager, was involved in the study as an assistant Extension professor focusing on precision ag research.

“We wanted to see what the variability was around multi-hybrid planting,” he says. The researchers found significant differences in drydown rates across hybrids. “What that means to me,” Ward explains, “is the difference was large enough that it could impact performance of the combine and could impact actual yield estimation.”

This year marks the first time AGCO’s Crop Tour will fly drones over fields to collect imagery, using reports and analytics from ag insight company Aglytix to “provide head-to-head evaluation of the planters and different production methods and tools,” says Ward. “Say we’re looking at output from our SeedSense® 20/20 [monitor], and we see the singulation came out nicely, downforce map was consistent, no big issues in the field area, but we still have a stand problem. Maybe the seed just didn’t establish right.” Hopefully, he says, “using the imagery and technology for context, you’ve been able to cross some things off the list so you can drill down to your most actionable piece of the problem.”

As Ward explains, use of this technology may not allow a fix until the next year. Yet, based on the data layer, you can start asking yourself what you need to do differently, how you should manage weeds or whether you should change your fertilizer package. “And you may be able to budget ahead of time for some of those things,” he says.

Boots on the Ground

“A lot of companies talk about how aerial images are good indicators of what your yield map looks like,” explains Goebel. “But as an agronomist consulting with growers, I actually don’t want those images to look like my yield map,” he says, explaining he needs multiple factors reflected in those photos to help make better decisions. “I want to see what’s going on in the field—nitrogen management, weeds, insects—and use the imagery as a scouting tool to help effect change.”

Even the most high-tech layers of data aren’t a substitute for scouting. “You’re still going to have to walk to those spots, maybe take a soil or tissue sample, and then make a decision about what to do,” says Goebel.

“With a nitrogen deficiency, you’ll have to go out and very likely variable-rate-apply nitrogen. I can take that NDVI image, pull it into software and create a prescription nitrogen map, and then load it back into the sprayer to apply nitrogen or urea in spots that need it.”
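Under the hood, that image-to-prescription step can be pictured as simple zoning: cells with lower index values get more nitrogen. A rough sketch; the grid, thresholds and rates here are hypothetical, and commercial software adds the file formats, units and equipment checks a real sprayer needs:

    import numpy as np

    # Hypothetical 4x4 grid of NDVI values, one cell per management zone.
    ndvi_grid = np.array([
        [0.75, 0.72, 0.40, 0.35],
        [0.70, 0.55, 0.38, 0.30],
        [0.68, 0.50, 0.45, 0.60],
        [0.65, 0.62, 0.58, 0.70],
    ])

    # Illustrative rule of thumb: lower NDVI suggests a nitrogen shortfall.
    # Rates (lbs N per acre) are placeholders, not agronomic recommendations.
    rx_map = np.where(ndvi_grid < 0.40, 60,       # weakest zones get the most N
             np.where(ndvi_grid < 0.55, 30, 0))   # healthy zones get none extra

    print(rx_map)  # the prescription map loaded back into the sprayer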

The whole goal of advanced imagery, of these futuristic drones hovering over a field of beans or corn, is to gather data and context and to provide benchmarks that can help farmers reduce costs and maximize efficiency and returns. Still, producers need to interpret that data correctly and make smarter decisions based on it. Because, points out Ward, “in precision ag, we’ve been really good at generating large volumes of data, but not great about making it completely actionable … yet.”

So, while producers and researchers continue to leverage imagery, hardware and software to identify in-field issues and increase productivity across operations, Ward acknowledges there’s plenty of on-farm information that needs to be applied toward those analytics and algorithms. “Lots of local information comes from a farmer being there, knowing his field, his equipment, his operators. And that,” he points out, “frankly, doesn’t fit well into a database.

“I don’t think,” he concludes, “that we’ll ever be able to take the farmer’s brain out of the equation … and I don’t think we’d want to.”

Article written by Claire Vath, AGCO
