Rural wireless is a foundation of digital and precision agriculture. ARA enables the research, development, and piloting of affordable, high-capacity, low-latency rural wireless solutions, thereby enabling transformative agriculture research and practice. If you are interested in using ARA and the Iowa State University agriculture facilities (e.g., research and teaching farms around the City of Ames) for research, education, and/or innovation in digital and precision agriculture, please contact us at ag@arawireless.org. In what follows, we elaborate on three examples of agriculture research activities enabled by the ARA deployment in central Iowa. (Do you have other use cases we should feature here? Please contact us or share them with the ara-users Google group!)
Real-Time, High-Throughput Phenotyping
When deploying field robots for high-throughput plant phenotyping and precision agriculture in general, remote monitoring and management of such a fleet of autonomous robots is critical to ensure their safe and optimized operation. Currently, researchers typically rely on local radio communications with severe bandwidth limitations, resulting in seriously compromised solutions. Affordable, high-capacity rural wireless networks enabled by ARA technologies will solve the problem of transferring the large amounts of data collected in the field by robots that are often equipped with multiple high-resolution imaging sensors. For example, as part of the NSF PhenoNet project, Dr. Lie Tang (ISU Plant Science Institute) has developed a fleet of PhenoBots for high-throughput plant phenotyping. As shown in the figure below, each PhenoBot is equipped with up to 8 stereoscopic cameras (16 cameras in total) that generate 800 MB/s, or roughly 3 TB/hour, of image data when capturing detailed plant phenotypic data at 10 frames/second on-the-go. Having the capacity to transfer this huge amount of data in near real time will be highly beneficial to the streamlined processing of field-based high-throughput plant phenotyping, for instance, enabling the use of high-performance cloud computing to run sophisticated data processing algorithms in real time. Consequently, it will substantially reduce the onboard computational requirements of the robots, making them more feasible and economically viable. If you are interested in learning more about PhenoBots, please feel free to contact Dr. Lie Tang. (Example demo | slides)
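The quoted PhenoBot figures are mutually consistent, as a quick back-of-the-envelope check shows. The per-frame size below is derived from the quoted totals, not from a published PhenoBot specification, so treat it as illustrative:

```python
# Sanity check on the PhenoBot data-rate figures quoted above.
CAMERAS = 16            # 8 stereoscopic units, 16 individual cameras
FPS = 10                # frames per second per camera
TOTAL_RATE_MB_S = 800   # quoted aggregate rate in MB/s

frames_per_second = CAMERAS * FPS                   # 160 frames/s overall
mb_per_frame = TOTAL_RATE_MB_S / frames_per_second  # implied size per frame

# Convert the sustained rate to a per-hour volume (1 TB = 1e6 MB here).
hourly_tb = TOTAL_RATE_MB_S * 3600 / 1e6

print(f"{mb_per_frame:.1f} MB/frame, {hourly_tb:.2f} TB/hour")
```

At 800 MB/s this works out to about 2.9 TB per hour, matching the "3 TB/hour" figure in the text.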
Agriculture Automation
One high-impact application of Ultra-Reliable, Low-Latency Communication (URLLC) in rural communities is connected and automated ag vehicles (e.g., seeders, sprayers, and combines) that leverage real-time data analytics and automation to enable precision agriculture (see figure below). In large commercial farms, tens or even hundreds of ag vehicles may operate simultaneously. With high-throughput URLLC enabled by ARA, each ag vehicle can stream its high-volume sensing data (e.g., high-resolution multispectral 3D imagery of plants and soil) to edge/cloud computing infrastructures for real-time data analytics and decision making, and then use the decision feedback from the edge/cloud to perform tasks such as precision weed control in real time. High-throughput URLLC can also enable remote AR/VR-based supervisory control of automated ag vehicles by farmers in their farm offices! With its extensive ag farm deployment and in collaboration with John Deere and rural communities, ARA enables, for the first time, the research and development of connected and automated ag vehicle applications such as those led by Dr. Matthew Darr (ISU Digital Agriculture Innovation Team). If you are interested in agriculture automation, please feel free to contact Dr. Matthew Darr. (Example demo | slides)
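To see why low latency matters for closed-loop tasks like precision weed control, consider how far a moving sprayer travels while waiting for a decision from the edge/cloud. The speeds and latencies below are illustrative assumptions for this sketch, not ARA or John Deere specifications:

```python
# Distance an ag vehicle travels during one sense-decide-actuate
# round trip. Illustrative numbers only.

def travel_cm(speed_m_s: float, latency_ms: float) -> float:
    """Distance in cm covered during a control round trip of latency_ms."""
    return speed_m_s * (latency_ms / 1000.0) * 100.0

# A sprayer at a typical field speed of ~4 m/s (assumed):
for latency_ms in (10, 50, 200):
    print(f"{latency_ms:4d} ms -> {travel_cm(4.0, latency_ms):.0f} cm")
```

At 4 m/s, a 200 ms loop lets the vehicle drift most of a meter past the target, while a 10 ms URLLC loop keeps the error to a few centimeters, which is why offloading real-time decisions to the edge/cloud hinges on low-latency links.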
Video-based Precision Livestock Farming
Video-based data analytics is a key enabler of precision livestock farming. By automatically monitoring individual animal behavior, it enables the transition from in-person human observation to automated visual estimation of individual animal welfare through the recognition of animal behavior patterns. For instance, Dr. Joshua Peschel (ISU) studies machine learning approaches to automatically recognize individual animals, specifically their consumption rates (food and water) and physical state (standing, lying, etc.). Here, custom LaView camera systems serve as the sensing nodes at each pen. Each system consists of a set of eight 4K color night-vision-capable cameras that permit consistent recording regardless of environmental conditions (e.g., lights on or off; sun up or down). ARA enables the connectivity between the high-resolution cameras deployed in the ISU livestock research farms (e.g., the Beef Nutrition Research Farm shown below) and edge/cloud computing facilities for real-time data analytics. If you are interested in learning more about image/video-based precision livestock farming (e.g., livestock health monitoring), please feel free to contact Dr. Joshua Peschel. (Example demo)