The first support for our research in precision agriculture came in 2003, when CS-LLC was awarded a Phase I SBIR to demonstrate the feasibility of using a lightweight autonomous vehicle for weed control. Under this contract and a one-year extension we (1) demonstrated the ability of an autonomous vehicle to navigate fields of row crops, (2) developed a hierarchical algorithm for applying herbicide between the rows and to selected weeds in the seedline during the first few weeks following crop emergence, (3) designed and fabricated a herbicide applicator end effector that can point and fire small quantities of concentrated herbicide directly onto selected leaves of targeted weeds, and (4) conducted a dosage-response study demonstrating that selected weeds can be effectively controlled by applying systemic herbicides to only small portions of the weed.
Despite the successful completion of Phase I, there was insufficient interest in a means of selectively applying herbicide to weeds in the presence of crop to justify a Phase II award. Following the USDA SBIR, CS-LLC obtained a Level One grant from the Kentucky Science and Technology Corporation (KSTC) to pursue a utility patent covering the applicator and key innovations in the autonomous platform. With the patent application in progress, we began to search for support for continued development of the various technologies required to commercialize a leaf-specific herbicide application system. Development of the platform itself continued through internal R&D.
The machine vision system developed under the USDA SBIR was relatively simple, as it was limited to early crop growth when the rows are distinct (open canopy). We learned that there was great interest in methods for automated plant identification in the United Kingdom, due to strict regulations supporting biodiversity and sustainability imposed by the U.K. government. Then, as now, if a herbicide is detected in the human water supply at any measurable level, that herbicide is permanently banned. Much of the research effort in Europe by agro-chemical companies such as Bayer Crop Science and Syngenta is therefore devoted to developing new herbicides.
CS-LLC began working with the University of Reading on the development of machine vision systems for detecting weeds in grain crops. While the specific weeds are different, the library of machine vision algorithms developed as part of this research is directly applicable to our leaf-specific weed management system. The U.K. system, named eyeWeed, can detect black grass in wheat fields at any growth stage and generate weed density maps for spraying prescriptions and other field applications.
A machine vision system that can detect individual black grass weeds in a grain crop such as wheat had not previously been considered possible. While drones and other remote sensing platforms have demonstrated some capability to detect heavy infestations, their ability to detect individual weed plants is limited by the resolution of the cameras they carry and/or the distance between the camera and the weeds. The eyeWeed system offers 50 times the resolution and can not only distinguish grass weeds within grain fields but can even count the number of grains in the crop seed heads. The value of the eyeWeed system is best demonstrated by an example. The two images below are weed density maps: one generated by manual field scouting requiring 75 labor hours, and the much higher resolution eyeWeed map, for which the data was collected, processed, and used to generate the map in around 3.5 hours.
Most recently, the USDA awarded CS-LLC a Phase I SBIR to leverage the technology developed over the past ten years in Europe into a commercially viable solution for stopping the proliferation of herbicide-resistant weeds in commercial agriculture in the United States. Our primary target in this effort is herbicide-resistant weeds affecting soybean farmers. It is our position that a successful solution to this problem will benefit soybean growers and will demonstrate the technology of leaf-specific herbicide application in general.
Over the last decade, weeds have become a resurgent agricultural problem, causing yield losses and becoming progressively more difficult and expensive to control via post-emergent broadcast spraying. Existing technologies for variable rate application of herbicide do little to mitigate the problem.
Concurrent Solutions, LLC is developing EyeRate, a next-generation, real-time variable rate application system that optimizes both weed control and herbicide use. We are leveraging our expertise in machine vision and algorithmic design to formulate precise prescriptions in real time that minimize herbicide use (saving on herbicide costs) and maximize weed mortality (reducing the spread of herbicide-resistant traits).
Building on our Phase I results, we are preparing to address five major technical objectives in our Phase II effort: (1) collect bulk field data to “train” our machine vision system, (2) adapt our plant-characteristics ID algorithms to new camera positions, (3) calculate variable rate prescriptions based on dosage studies that account for weed sizes and species, (4) develop precision control of existing spray nozzles to enable real-time spray modulation, and (5) build an integrated prototype and field test it extensively to demonstrate its commercial viability.
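Objective (3) amounts to a lookup from weed species and size to a per-weed dose. A minimal sketch of that idea follows; the species names, ED50 values, scaling exponent, and safety factor are all illustrative placeholders, not results from our dosage studies.

```python
# Hypothetical dose table: species -> (ED50 in ug for a reference-size
# seedling, size-scaling exponent). Values are placeholders.
DOSE_TABLE = {
    "waterhemp": (4.0, 0.8),
    "palmer_amaranth": (5.5, 0.9),
}

def prescribe_dose(species, leaf_area_cm2, ref_area_cm2=10.0, safety=2.0):
    """Return a herbicide dose (ug) for one weed, scaled by its size.

    Dose grows sublinearly with leaf area (exponent < 1), and a safety
    factor pushes the dose well past the ED50 to maximize mortality.
    """
    ed50, exponent = DOSE_TABLE[species]
    return safety * ed50 * (leaf_area_cm2 / ref_area_cm2) ** exponent

# A reference-size waterhemp seedling gets twice its (assumed) ED50:
print(round(prescribe_dose("waterhemp", 10.0), 2))  # prints 8.0
```

In a real prescription engine this table would be replaced by fitted dose-response curves per species, with the nozzle controller quantizing the continuous dose to achievable spray pulses.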
Our overarching goal is to build a practical, affordable, near-term product that farmers can start using immediately to save costs and improve yields. A successful Phase II will lead to a prototype system that could be brought to market within two years after project completion. Our initial market will be custom application contractors, to be followed by farm co-ops and large farm owners, corporate and family.
The Kentucky Soybean Promotion Board (KSPB) awarded CS-LLC a series of three grants totaling $33,000 for the development of a gantry system and a field image collection study to conduct preliminary experiments on splatter/runoff, platform motion, herbicide targeting, and wind effects.
Weed control in field vegetables in the UK is increasingly challenging due to the loss of herbicide actives and demands by policy makers and consumers for lower pesticide use. Research at Reading, in conjunction with Concurrent Solutions LLC in the USA, is developing a robotic weeder for field vegetables in the UK, using image analysis to locate weed leaves and a novel applicator to apply droplets of herbicide to these leaves. No chemical is applied to the crop and none directly to the soil.
In glasshouse trials, the efficacy of applying one droplet of herbicide per weed was determined. Dose-response relationships for control of Stellaria media (L.) Vill. with glyphosate and of Chenopodium album L. with glufosinate-ammonium showed ED50s of 3.0 and 4.4 μg per seedling, compared to the calculated manufacturers’ recommended doses of 48.8 and 21.9 μg, respectively, for weed seedlings of the sizes treated. The question remains: is this efficacy reproducible in the field?
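ED50s such as those above are typically read off a fitted dose-response curve. The sketch below uses a two-parameter log-logistic model, a common choice for this kind of analysis (the slope parameter b is an illustrative placeholder, not a fitted value), and also computes the dose reductions implied by the quoted ED50s versus the recommended doses.

```python
def mortality(dose_ug, ed50_ug, b=2.0):
    """Fraction of seedlings controlled at a given dose (log-logistic model)."""
    return 1.0 / (1.0 + (ed50_ug / dose_ug) ** b)

# At the ED50 itself, half the seedlings are controlled, by definition:
print(mortality(3.0, ed50_ug=3.0))  # prints 0.5

# Dose reductions implied by the glasshouse ED50s versus the calculated
# recommended doses quoted in the text (glyphosate, glufosinate-ammonium):
for ed50, rec in [(3.0, 48.8), (4.4, 21.9)]:
    print(f"{100 * (1 - ed50 / rec):.0f}% less active ingredient")
# prints 94% less active ingredient
# prints 80% less active ingredient
```

The 94% figure for glyphosate is consistent with the magnitude of reduction reported in the field trial summary below.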
Droplets of glyphosate were applied manually to the naturally occurring weed population in a transplanted cabbage crop in summer 2016. The efficacy of the droplet applications in controlling weeds and preventing crop yield loss was assessed in comparison to weed-free (hand-weeded) and weedy controls. Reductions in herbicide use were compared with use of the pre-emergence herbicide pendimethalin and inter-row glyphosate sprays.
Droplet applications 3, 5, and 7 weeks after transplanting reduced residual weed biomass at harvest by 92% compared to the weedy control and gave a crop yield that did not differ significantly from the weed-free control. At the same time, the total amount of herbicide active ingredient applied was 94% lower than the recommended rate for pendimethalin. (Summary from an article submitted to Aspects of Applied Biology.)
The FogCutter is a method and device for extending the viewing range in fog by limiting the spectral bandpass to one or more of the absorption bands of water. Since the camera views the scene in a spectral region in which light is blocked (attenuated) by water, the fog droplets become dark. This is similar to viewing a person standing next to a car at night: when the headlights are on, you cannot see the person; however, when the lights are turned off, the person next to the car becomes visible.
This works because the bandpass filter blocks all wavelengths of light except for the light in one of the water absorption bands (see labels 208, 212, 216, and 220 in the figure on the right above). So rather than contributing to the noise (light pollution) in the scene, the fog droplets go dark. In addition to the optical filter, the FogCutter method includes an electronic gain and contrast enhancement specifically tuned to match the camera CCD (focal plane).
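The electronic step can be pictured as a linear gain followed by a contrast stretch of the dim, narrow-band image. Below is a minimal sketch of that idea using NumPy; the gain value and clip percentiles are illustrative, not FogCutter's CCD-tuned parameters.

```python
import numpy as np

def enhance(frame, gain=4.0, lo_pct=2.0, hi_pct=98.0):
    """Apply electronic gain, then stretch the remaining range to 0..255."""
    boosted = frame.astype(np.float64) * gain            # electronic gain
    lo, hi = np.percentile(boosted, [lo_pct, hi_pct])    # clip points
    stretched = (boosted - lo) / max(hi - lo, 1e-9)      # contrast stretch
    return np.clip(stretched * 255.0, 0, 255).astype(np.uint8)

# A dim, low-contrast frame, as seen through a narrow water-absorption band:
frame = np.array([[10, 12], [14, 20]], dtype=np.uint8)
out = enhance(frame)
print(out.min(), out.max())  # prints 0 255
```

After enhancement, the few bright pixels that pass the filter span the full display range, which is what makes the darkened fog usable as image contrast.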
This project has moved beyond basic research. Concurrent Solutions has been awarded a patent on the concept, and we are building a demonstration prototype. In the figure below, the current prototype camera is shown on the left, and a lower-resolution side-by-side comparison of a scene with and without the FogCutter filter and image-enhancement software package is shown on the right.