By Treena Hein
Efficiency, ease-of-use, rapid task completion – these are some of the many reasons there are tens of thousands of drones already in the air over farms across many parts of the world. Drones are becoming hugely popular for crop scouting, spraying crop protection products and doing spot applications of fertilizer as needed. They can even spread seed.
A step up from drones that will roll out on farms in the U.S. this summer is Guardian Agriculture’s SC1 aircraft. Like a drone, it can carry a tank of crop protection product and/or sensors, with no emissions and no soil compaction. It’s far larger than a drone, however, and meant to be operated with automated, pre-programmed flying patterns that align with GPS field maps. Due to its size, aerospace-grade components and AI systems, it’s classified as an electric Vertical Take-Off and Landing (eVTOL) aircraft, and it is the first eVTOL of its kind to have (just recently) received approval for nationwide operation from the U.S. Federal Aviation Administration (FAA).
And by this fall, an ultralight eVTOL, the RECON from RYSE Aero Tech, will also appear on U.S. farms. Like several other ultralight eVTOLs to hit the market later this year or in 2024, it will be used for military, recreational and medical/emergency/disaster response purposes as well.
The design of ultralight aircraft of any type must follow the appropriate FAA guidelines (Part 103), which restrict these single-occupant aircraft in aspects such as weight, speed, altitude and where they can be flown (no urban areas or over gatherings of people). The RECON has a range of up to 40 km (25 miles) per charge (about 25 minutes), a maximum speed of 101 kph (63 mph) and a cost of USD$150,000. It weighs about 285 pounds, and can put down on land or water.
Most of the first RECON pre-purchasers are farmers – about 200 of them already. “From the start, we saw the potential for farm use to replace ATVs and trucks for various on-farm and ranching tasks,” says Mick Kowitz, RYSE founder and CEO. “Our focus is to get the first 100 RECONs out to farmers and gather a lot of feedback about their uses. As with anything new, when something is out there in actual use, people come up with all kinds of new uses.”
RECON on the farm
Crop and spring field condition scouting is one of the most obvious uses for the RECON. But Kowitz says many farmers with whom RYSE has spoken put repairing farm equipment in the field at the top of the list.
That is, when the tractor breaks down, farmers could use their cell phone to call someone back at the farm shop and ask them to fly out with a repair kit. (It’s expected, however, that automated operation will eventually be allowed for ultralight eVTOLs, as it already is for smaller eVTOLs like the SC1. So, if your tractor breaks down in the field, you could ‘call the RECON’ – with a repair kit or specific part having been placed on board by someone else – do the repair and then send the RECON back to home base.)
“A lot of farmers hate using ATVs,” says Kowitz. “They break down, they are gas hogs and, if you rent land, you may have to drive quite a way, even across other farmers’ land, to get to your broken piece of equipment or whatever other task you need to do. Many farmers have said to us that if the RECON can save them time by going straight as the crow flies, it’s worth it. They’ve been excited about creating landing spots in their fields, which means they could get rid of some of their field roads along the edge, and after a few years when the compaction reduced, they could put those strips into production.”
However, to make this use case really effective – so that they could land right in a field, provided the crops weren’t too mature – farmers suggested to RYSE that they add stilts to the RECON. RYSE obliged, adding removable peg legs about 18 inches long that lock onto the base of the six outrigger floats. With these pegs, the RECON typically sinks into the soil only about 1.5 inches. (And in case you’re wondering, longer pegs aren’t feasible because the RECON landing is not completely straight down but involves a tiny bit of skid-forward motion.)
Returning to crop scouting for a moment: this of course can be done by drones. Like a drone, the RECON has cameras (German-made HD infrared cameras) to take footage or still pictures while you are hovering, and a GPS system to mark points for future inspection and action. But the RECON gives you that big-picture view and, with the stilts, lets you disembark.
And for longer trips to far-off fields, or for tagging new calves in herds far from the farm base, a farmer could set up a few solar-powered charging stations around the farm. That is, the farmer lands at the station, swaps out spent batteries for fresh ones, and by the time of return, the first set of batteries is charged up again.
Crop focus
Through the focus groups and through other avenues, RYSE has received many other use ideas. “We attended farm shows last year (Farm Progress, Sunbelt Ag, and Ohio State's Farm Science Review) where we were able to speak directly to farmers from all around the U.S.,” says Kowitz. “We were also able to organize focus groups with farmers through extension agents throughout states in the Midwest. Additionally, we have been speaking to prospective buyers via incoming emails and calls to us. These folks were all able to identify many areas where they could use the RECON.”
With regard to crop production, many farmers have asked RYSE about adding spray tanks, but Kowitz explains that ultralight regulations don’t presently allow that. “We expect that some farmers who want to use the RECON to spray will work with their local municipalities and seek an FAA exception,” he says. “That’s how things stand in the U.S. In other countries it’s different.”
Other uses centered on livestock. “This unit would make things so much easier on a cattle farm by locating missing cattle, locating cattle that need tags and shots, and finding baby calves that the heifer hid too well,” said one farmer.
“Calving season can be very muddy on a four-wheeler,” said another. “I like that this unit can make that job a lot easier.” Still another noted that “I could use this for checking fence lines, finding the herd of feral hogs creating havoc, among many other uses to make things more efficient at the farm.”
Kowitz concludes, “I think each farmer is going to use the RECON in different ways, depending on their type and size of operation, personal preference, other transportation options and so on. I think most farmers will have
three or four main uses, but we’re very excited about uses that haven’t been identified yet.”
Safety aspects of the RECON
Safety features of ultralight eVTOLs include both hardware and software. In terms of hardware, each of the six rotor systems is independent in terms of wiring and battery packs; there is no single point of failure or all-in-one centralized battery hub. For water landings, there are six independent outrigger floats and two fuselage floats.
In terms of software, a ‘Simplified Vehicle Operations System’ ensures a low learning curve and simple flight controls for the pilot. Risk of a hard landing is eliminated with advanced redundant AI with auto take-off and landing features. Automated data collection promotes self-learning and improvement of the AI over time. The operator platform is simple, consisting of a tablet main screen with a joystick on each side (and if you let go, the RECON just hovers). The AI system works to keep the machine steady when hovering, even in winds as high as 25 mph. An optical LIDAR provides laser-based obstacle avoidance to assist the operator. ●
Crop scouting is just one use for the RECON on the farm. Photo: RYSE
The RECON has cameras to take footage or still pictures while hovering, and a GPS system to mark points for future inspection and action. Photo: RYSE
By this fall, an ultralight eVTOL, the RECON from RYSE Aero Tech, will appear on U.S. farms. Photo: RYSE
Our focus is to get the first 100 RECONs out to farmers and gather a lot of feedback about their uses.
By Janet Kanters
Researchers have developed a program that can generate planting and management recommendations for growers based on weather and economic variables, and that can be tailored to the producer’s specific growing sites.
The new predictive technology was pioneered by a team of researchers in the Montana State University (MSU) College of Agriculture and Norm Asbjornson College of Engineering, and outlined in a February edition of the journal Agriculture.
The program, called On-Farm Precision Experimentation, or OFPE, was developed by Bruce Maxwell, a professor in the Department of Land Resources and Environmental Sciences, along with alumnus Paul Hegedus and graduate students Sasha Loewen and Hannah Duff. The team collaborated with MSU computer scientists, led by professor John Sheppard of MSU’s Gianforte School of Computing, along with graduate students Giorgio Morales and Amy Peerlinck.
Since 2015, the team has been developing and fine-tuning predictive
technology that can synthesize past farm, weather and economic data to model the likelihood of success for different management approaches, such as date and amount of fertilizer application, seeding rate or planting date. Based on those variables, the OFPE program generates suggestions for seeding rate and fertilizer applications that are most likely to result in the highest return on investment for a producer’s goals, from overall yield to grain protein content.
“This project combines not only data from farms, but also data that we collect from satellite sources, and one of the philosophies behind that is that we don’t want to generate another cost for farmers,” said Hegedus, the study’s primary author.
Since most modern agricultural equipment automatically tracks data related to seeding rate and fertilizer application, the OFPE program can analyze years’ worth of past data to draw connections between weather patterns, management approaches and final yield and profit. Combining that small-scale data with satellite data for past weather conditions allows the modeling program to learn the relationships between large-scale weather trends and production data. That predictive capability is what makes OFPE unique, said Hegedus.
“We use those models to take it one step further and simulate what their net return and yield outcomes might be under different conditions,” he said. “That could mean different management scenarios like fertilizer rates. Incorporating past data for those variables helps make predictions easier. For instance, if we think weather for 2023 might be like weather from 2005, we can plug in weather data from 2005 to help dial in what their yield might be under those weather conditions.”
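For readers who want a feel for how such a simulation might look in practice, here is a minimal, illustrative sketch (not the MSU team’s OFPE code): it fits a simple yield-response model to hypothetical past field data, then estimates net return for candidate nitrogen rates under an analog weather year. The file name, column names, prices and model choice are all assumptions made for the example.

```python
# Illustrative sketch only -- not the MSU OFPE implementation. It shows the general
# idea of fitting a yield-response model to past site-specific data and simulating
# net return for candidate nitrogen rates under an analog weather year.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Past data: one row per grid cell per year (yield monitor + as-applied + weather).
history = pd.read_csv("field_history.csv")   # hypothetical file and columns
features = ["n_rate_lbs_ac", "seed_rate", "precip_mm", "gdd", "prev_yield"]
model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(history[features], history["yield_bu_ac"])

# Simulate this season's outcomes by re-using weather from an analog year (e.g. 2005).
analog = history[history["year"] == 2005][["precip_mm", "gdd"]].mean()
grain_price, n_cost = 6.50, 0.55             # $/bu and $/lb N -- assumed values

def net_return(n_rate, cell):
    x = pd.DataFrame([{**cell, **analog.to_dict(), "n_rate_lbs_ac": n_rate}])
    predicted_yield = model.predict(x[features])[0]
    return predicted_yield * grain_price - n_rate * n_cost

# Compare candidate rates for one grid cell and pick the most profitable.
cell = {"seed_rate": 32000, "prev_yield": 55}
best = max(range(0, 161, 20), key=lambda r: net_return(r, cell))
print("Most profitable N rate for this cell:", best, "lbs/ac")
```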
Over time, the program will get better at making predictions, said Maxwell, because it will have more years’ worth of data to draw from. The ultimate goal is to create a tool where farmers can plug in their own data and receive a series of recommendations, each accompanied by a probability of success. Maxwell likened it to helping farmers become better gamblers, analyzing the odds of success for different possible approaches.
“Most agricultural research has been focused on a deterministic approach, finding the average that works the best across all scenarios,” said Maxwell. “We’ve changed that to the approach of identifying the chance you’ll get your desired outcome, based on the decisions you make. This ability to extract data that’s now so available means that we can draw on that to perfect these predictive models over time.”
Maxwell’s other graduate students have studied additional aspects of OFPE’s potential that may appeal to other types of growers. Duff’s work focused on the potential for ecological benefits if consistently unproductive plots of land are reverted to a natural state, populated by native plants and insects. Those areas, called ecological refuges, showed potential for higher yield due to the increased biodiversity they provide, and OFPE can simulate net
returns and results if a farmer were to create an ecological refuge on their own site. Loewen’s graduate studies applied a similar approach to organic systems.
Through the collaborations with MSU’s computing school, the OFPE project has also furthered the modeling approach by introducing advanced methods in deep learning, said Sheppard.
“By utilizing the rich data sets collected from on-farm experiments, we are able to learn not only what the likely results of different fertilizer or seeding rates might be but also quantify the uncertainty and explain why the predictions are made,” he said.
Sheppard’s graduate students are developing the primary modeling methods as well as new approaches to designing experiments, combined with optimizing the inputs to the fields at a site-specific level, said Sheppard. Morales’ work is focused on developing learning models to help farmers understand the impacts of their input decisions, and Peerlinck’s work incorporates the combination of economic and environmental objectives in the technology through a process known as “multi-objective optimization.”
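As a rough illustration of what such a trade-off looks like, the sketch below sweeps a weight between a profit objective and a reduced-nitrogen objective over a set of candidate rates. It is a generic weighted-sum example under an assumed yield-response curve and assumed prices, not the MSU implementation.

```python
# Generic weighted-sum multi-objective sketch: balance net return against total
# nitrogen applied. The response curve and prices are assumptions for illustration.
import numpy as np

n_rates = np.arange(0, 161, 5.0)                        # candidate N rates, lbs/ac
yield_bu = 60 + 0.45 * n_rates - 0.0012 * n_rates**2    # assumed yield-response curve
net_return = yield_bu * 6.50 - n_rates * 0.55           # $/ac at assumed prices

# Normalize each objective to [0, 1] so the weights are comparable;
# for the environmental objective, applying less nitrogen scores higher.
econ = (net_return - net_return.min()) / np.ptp(net_return)
envi = 1.0 - (n_rates - n_rates.min()) / np.ptp(n_rates)

# Sweep the weight between the two objectives to trace the trade-off.
for w in (0.0, 0.25, 0.5, 0.75, 1.0):
    best = n_rates[np.argmax(w * econ + (1 - w) * envi)]
    print(f"weight on profit {w:.2f} -> recommended N rate {best:.0f} lbs/ac")
```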
So far, OFPE’s recommendations in conventional systems have been highly effective, said Hegedus. His own doctoral work compared the effectiveness of OFPE’s recommendations to previous approaches taken by farmers without the technology’s recommendations. And because OFPE can generate specific recommendations for different areas of a single farm, suggested approaches are often more precise and efficient.
“We found that in 100 percent of the fields across all weather conditions, our site-specific approaches would be more profitable,” he said. “What’s more, in half of those scenarios, the producer would also be applying less nitrogen fertilizer, which also saves them money.”
Maxwell said that the current application of OFPE requires producers to provide data to the MSU team, who can then set up an on-farm experiment to generate recommendations through the predictive program. He hopes that ultimately, that process can be automated so that producers can use it on their own to select between possible approaches. He also noted that OFPE has never been intended to make decisions for producers. Instead, it is designed to provide them with more accurate, detailed information to make those decisions themselves. ●
Bruce Maxwell. Photo: Kelly Gorham, MSU
John Sheppard. Photo: Adrian Sanchez-Gonzalez, MSU
OFPE can generate planting and management recommendations for growers based on weather and economic variables and can be tailored to the producer’s specific growing sites.
This ability to extract data that’s now so available means that we can draw on that to perfect these predictive models over time.
Guardian Agriculture, a developer of Electric Vertical Take-Off and Landing (eVTOL) systems for commercial-scale sustainable farming, has received approval from the U.S. Federal Aviation Administration (FAA) to operate its aircraft nationwide.
“We designed our system to meet the needs of commercial agriculture,” said Guardian Agriculture founder and CEO Adam Bercu. “Solving this real-world pain point is the right first step for eVTOL adoption at large.”
The Guardian SC1 platform – which already has more than USD$100 million in customer orders – is an autonomous, electric, aerial crop protection system designed specifically for large-scale agriculture.
The Guardian SC1 can carry 200-pound payloads and address a wide range of spray volumes and application needs for growers. With four six-foot propellers and a 15-foot aircraft width, the SC1 efficiently covers 40 acres per hour of full-field crop protection. Combining an autonomous aircraft, a ground station supercharger, and software generating domestically stored data, the Guardian SC1 offers on-target application to fields when and where necessary.
With this approval, Guardian Agriculture expects to be the first eVTOL manufacturer with systems operating at scale across the U.S., and the first to generate thousands of hours of commercial flight time.
Guardian Agriculture will begin commercial operations in support of its Wilbur-Ellis customer in California in the coming months as it continues to ramp up its production capacity. ●
Japanese researchers have developed a four-wheeled robot with a two-axis orthogonal mechanism to maintain plants grown under the farming method of synecoculture.
Synecoculture involves growing mixed plant species together in high density. It is an approach developed by Sony Computer Science Laboratories, Inc. (Sony CSL) and advocated by Dr. Masatoshi Funabashi, senior researcher at Sony CSL, in which ecosystems are artificially created for cultivating a rich variety of crops while also enriching local biodiversity. It can be a complex operation, since varying species with different growing seasons and growing speeds are planted on the same land.
According to Tokyo-based Waseda University, while the operational issues present with synecoculture can be addressed by using an agricultural robot, most existing robots can only automate one of the three tasks (sowing, pruning, harvesting) in a simple farmland environment, thus falling short of the literacy and decision-making skills required of them to perform synecoculture. Moreover, the robots may make unnecessary contact with the plants and damage them, affecting their growth and the harvest.
A group of researchers led by Takuya Otani, an assistant professor at Waseda University, in collaboration with Sustainergy Company and Sony CSL, have designed a new robot that can perform synecoculture effectively. The robot is called SynRobo, with “syn” conveying the meaning of “together with” humans. It manages a variety of mixed plants grown in
the shade of solar panels, an otherwise unutilized space. An article describing their research was published in Volume 13, Issue 1 of Agriculture, on 21 December 2022. This article has been co-authored by Professor Atsuo Takanishi, also from Waseda University, other researchers of Sony CSL, and students from Waseda University.
According to Otani, the robot has a four-wheel mechanism that enables movement on uneven land and a robotic arm that expands and contracts to help overcome obstacles. The robot can move on slopes and avoid small steps.
“The system also utilizes a 360-degree camera to recognize and manoeuvre its surroundings,” said Otani. “In addition, it is loaded with various farming tools – anchors (for punching holes), pruning scissors
and harvesting setups. The robot adjusts its position using the robotic arm and an orthogonal axes table that can move horizontally.”
Besides these features, the researchers also invented techniques for efficient seeding. They coated seeds from different plants with soil to make equally-sized balls. This made their shape and size consistent, so the robot could easily sow seeds from multiple plants. Furthermore, a human-controlled manoeuvring system was developed to facilitate the robot’s functionality. The system helps it operate tools, implement automatic sowing and switch tasks.
The new robot could successfully sow, prune, and harvest in dense vegetation, making minimal contact with the environment during the tasks because of its small and
flexible body. In addition, the new manoeuvring system enabled the robot to avoid obstacles 50 percent better while reducing its operating time by 49 percent, compared to a simple controller.
“This research has developed an agricultural robot that works in environments where multiple species of plants grow in dense mixtures,” Otani said. “It can be widely used in general agriculture as well as synecoculture – only the tools need to be changed when working with different plants.”
Otani added the robot will contribute to improving the yield per unit area and increase farming efficiency. Moreover, its agricultural operation data will help automate the manoeuvring system. As a result, robots could assist agriculture in a plethora of environments. “In fact,
Sustainergy Company is currently preparing to commercialize this innovation in abandoned fields in Japan and desertified areas in Kenya, among other places.” ●
Researchers have developed a small and flexible agricultural robot for synecoculture farming. It has a four-wheel mechanism, two axes stand, robotic arm, camera unit, manoeuvring system and farming tools. Photo: Waseda University
Researchers of the Light Robots group at Tampere University in Finland have developed the first passively flying robot equipped with artificial muscle – an “artificial fairy” that could be utilized in pollination.
Hao Zeng, Academy Research Fellow and the group leader, and Jianfeng Yang, a doctoral researcher, have come up with a new design for their project called FAIRY – Flying Aero-robots based on Light Responsive Materials Assembly. They have developed a polymer-assembly robot that flies by wind and is controlled by light.
The artificial fairy developed by Zeng and Yang has several biomimetic features. Because of its high-porosity (0.95), lightweight (1.2 mg) structure, it can easily float in the air, directed by the wind. What is more, a stable separated vortex ring generation enables long-distance wind-assisted travelling.
“The fairy can be powered and controlled by a light source, such as a laser beam or LED,” Zeng said. This means that light can be used to change the shape of the tiny dandelion seed-like structure. The fairy can be adapted manually to wind direction and force by changing its shape. A light beam can also be used to control the take-off and landing actions of this polymer assembly.
Next, the researchers will focus on improving the material sensitivity to enable the operation of the device in sunlight. In addition, they will up-scale the structure so that it can carry micro-electronic devices such as GPS and sensors as well as biochemical compounds.
According to Zeng, there is potential for even more significant applications. “It sounds like science fiction, but the proof-of-concept experiments included in our research show that the robot we have developed provides an important step towards realistic applications suitable for artificial pollination,” he said.
In the future, millions of artificial fairies carrying pollen could be dispersed freely by natural winds and then steered by light toward specific areas awaiting pollination.
However, many problems need to be solved first. For example, how to control the landing spot in a precise way? How to reuse the devices and make them biodegradable? These issues require close collaboration with materials scientists and people working on microrobotics.
The FAIRY project started in September 2021 and will last until August 2026. It is funded by the Academy of Finland. The flying robot is researched in cooperation with
Dr. Wenqi Hu from Max Planck Institute for Intelligent Systems (Germany) and Dr. Hang Zhang from Aalto University. ●
For their artificial fairy, Hao Zeng and Jianfeng Yang got inspired by dandelion seeds. Photo: Jianfeng Yang, Tampere University.
Modern cameras and sensors, together with image processing algorithms and artificial intelligence (AI), are ushering in a new era of precision agriculture and plant breeding. In the near future, farmers and scientists will be able to quantify various plant traits by simply pointing special imaging devices at plants.
However, some obstacles must be overcome before these visions become a reality. A major issue faced during image-sensing is the difficulty of combining data from the same plant gathered from multiple image sensors, also known as multispectral or multimodal imaging. Different sensors are optimized for different frequency ranges and provide useful information about the plant. Unfortunately, the process of combining plant images acquired using multiple sensors, called 'registration,' can be notoriously complex.
Registration is even more complex when involving three-dimensional (3D) multispectral images of plants at close range. To properly align close-up images taken from different cameras, it is necessary to develop computational algorithms that can effectively address geometric distortions. In addition, algorithms that perform registration for close-range images are more susceptible to errors caused by uneven illumination. This situation is commonly faced in the presence of leaf shadows, as well as light reflection and scattering in dense canopies.
Against this backdrop, a research team including Professor Haiyan Cen from Zhejiang University, China, recently proposed a new approach for generating high-quality point clouds of plants by fusing depth images and snapshot spectral images. As explained in their paper, which was published in Plant Phenomics, the researchers employed a three-step image registration process which was combined with a novel artificial intelligence (AI)-based technique to correct for illumination effects.
"Our study shows that it is promising to use stereo references to correct plant spectra and generate high-precision, 3D, multispectral point clouds of plants," noted Prof. Cen.
The experimental setup consisted of a lifting platform which held a rotating stage at a preset distance from two cameras on a tripod: an RGB (red, green and blue)-depth camera and a snapshot multispectral camera. In each experiment, the researchers placed a plant on the stage, rotated the plant, and photographed it from 15 different angles.
They also took images of a flat surface containing Teflon hemispheres at various positions. The images of these hemispheres served as reference data for a reflectance correction method, which the team implemented using an artificial neural network.
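A minimal sketch of the idea of reference-based reflectance correction with a small neural network follows. It is not the authors’ code; the input features, file names and network size are assumptions made for illustration.

```python
# Hedged sketch: learn a mapping from measured spectra (plus simple geometry
# features) on the Teflon reference hemispheres to their known reflectance,
# then apply that correction to plant points. Not the authors' implementation.
import numpy as np
from sklearn.neural_network import MLPRegressor

# measured: (n_points, n_bands) spectra sampled on the reference hemispheres
# geometry: (n_points, 3) e.g. surface-normal angle, depth, distance to light
# target:   (n_points, n_bands) known Teflon reflectance at those points
measured = np.load("hemisphere_spectra.npy")     # hypothetical files
geometry = np.load("hemisphere_geometry.npy")
target = np.load("teflon_reference.npy")

X = np.hstack([measured, geometry])
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
net.fit(X, target)

# Apply the learned correction to plant points with the same feature layout.
plant_spectra = np.load("plant_spectra.npy")
plant_geometry = np.load("plant_geometry.npy")
corrected = net.predict(np.hstack([plant_spectra, plant_geometry]))
```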
For registration, the team first used image processing to extract the plant structure from the overall images, remove noise, and balance brightness. Then, they performed coarse registration using Speeded-Up Robust Features (SURF) – a method that can identify important image features that are mostly unaffected by changes in scale, illumination and rotation.
Finally, the researchers performed fine registration using a method known as "Demons." This approach is based on finding mathematical operators that can optimally 'deform' one image to match it with another.
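To make the two-stage idea concrete, here is a generic sketch of coarse feature-based alignment followed by Demons fine registration, using off-the-shelf OpenCV and SimpleITK routines. It illustrates the general technique rather than the authors’ implementation; file names and parameter values are assumptions.

```python
# Generic two-stage 2D registration sketch: SURF + RANSAC homography for coarse
# alignment, then Demons deformable registration for fine alignment.
import cv2
import numpy as np
import SimpleITK as sitk

depth_band = cv2.imread("depth_gray.png", cv2.IMREAD_GRAYSCALE)        # reference image
spectral_band = cv2.imread("spectral_band.png", cv2.IMREAD_GRAYSCALE)  # image to align

# Coarse registration: SURF keypoints + RANSAC homography
# (SURF requires the opencv-contrib "nonfree" build).
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
kp1, des1 = surf.detectAndCompute(depth_band, None)
kp2, des2 = surf.detectAndCompute(spectral_band, None)
matches = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True).match(des2, des1)
src = np.float32([kp2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
coarse = cv2.warpPerspective(spectral_band, H, depth_band.shape[::-1])

# Fine registration: Demons deformable alignment of the coarsely aligned image.
fixed = sitk.GetImageFromArray(depth_band.astype(np.float32))
moving = sitk.GetImageFromArray(coarse.astype(np.float32))
demons = sitk.DemonsRegistrationFilter()
demons.SetNumberOfIterations(50)
demons.SetStandardDeviations(1.0)
displacement = demons.Execute(fixed, moving)
registered = sitk.Resample(moving, fixed, sitk.DisplacementFieldTransform(displacement))
```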
These experiments showed that the proposed registration method significantly outperformed conventional approaches. Moreover, the proposed reflectance correction technique produced remarkable results, as Prof. Cen highlights: "We recommended using our correction method for plants in growth stages with low canopy structural complexity and flattened and broad leaves." The study also highlighted a few potential areas of improvement to make the proposed approach even more powerful.
Satisfied with the results, Prof. Cen concludes, “Overall, our method can be used to obtain accurate, 3D, multispectral point cloud models of plants in a controlled environment. The models can be generated successively without varying the illumination condition.”
In the future, techniques such as this one will help scientists, farmers and plant breeders easily integrate data from different cameras into one consistent format. This could not only help them visualize important plant traits, but also feed these data to emerging AI-based software to simplify or even fully automate analyses. ●