Mario Miranda Sazo, Cornell Cooperative Extension, Lake Ontario Fruit Team
Michael Basedow, Cornell Cooperative Extension, Eastern New York Commercial Horticulture Program
This winter we were interested in the technological capabilities and solutions offered by three vision system companies for precision crop load management in apples, in 2023 and beyond. Our main goal was to learn how these technologies can help NY growers evaluate their fruit thinning decisions and yield estimations in high density orchards. We were fortunate to conduct a virtual discussion with four Ag-tech entrepreneurs well-versed in digital Ag, who shared their experiences and knowledge during the statewide virtual apple fruit conference on March 3, 2023.
Here we summarize the responses provided by Dr. Dave Brown and Dr. Patrick Plonski, both from Pometa (formerly known as Farm Vision), Jenny Lemieux from VIVID Technologies, and Charles Wu from Orchard Robotics. The article closes with a brief overview of the main light sensors and imaging technologies available today.
CCE: What are the main applications we can use in 2023? In the next two years?
Pometa: Crop load management (BETA blossom cluster counting, fruitlet counting, growth and predicted abscission, and the Fruit Growth Rate Model); Irrigation (fruit growth rates); Harvest (fruit color, size, and growth; hand scans or ATV mapping (> 1 inch); harvest forecast by bins/acre and size distribution); Post-harvest (bin scanning); Weather services (frost, heat, and dew alerts; station-specific forecasts).
VIVID Technology: In 2023 blossom counts, fruitlet and fruit sizes, counts to help with thinning and yield prediction, BETA Fruit Growth Rate Model; in 2024-2025 disease detection and pruning insights.
Orchard Robotics: In 2023 bud counting, blossom counting, counting and sizing of early-stage fruitlets all the way up until harvest, the size distribution model for precision thinning; in the future we will be looking at expanding into disease detection and early-warning of fire blight.
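For context, the Fruit Growth Rate Model mentioned by all three companies predicts fruitlet abscission from repeated diameter measurements: fruitlets growing at less than roughly 50% of the rate of the fastest-growing fruitlets are predicted to drop. The sketch below illustrates that core logic only; the function name, the 15% "fastest fruitlets" cutoff, and the sample diameters are illustrative assumptions, not any vendor's implementation.

```python
def predict_fruit_set(d1, d2, threshold=0.50):
    """Predict fruit set from fruitlet diameters (mm) measured ~3-4 days apart.

    d1, d2: diameters of the same fruitlets at the first and second measurement.
    Fruitlets growing slower than `threshold` times the benchmark rate of the
    fastest-growing fruitlets are predicted to abscise.
    """
    rates = [b - a for a, b in zip(d1, d2)]  # growth per fruitlet (mm)
    # Benchmark: mean growth rate of the fastest-growing fruitlets
    # (here, the top 15% by rate -- an illustrative choice).
    fast = sorted(rates, reverse=True)
    n_top = max(1, round(0.15 * len(fast)))
    benchmark = sum(fast[:n_top]) / n_top
    persisting = sum(r >= threshold * benchmark for r in rates)
    return persisting / len(rates)  # predicted fruit set fraction
```

For example, if three fruitlets grow 3 mm over the interval and a fourth grows only 1 mm, the slow fruitlet falls below half the benchmark rate and is predicted to abscise, giving a predicted fruit set of 0.75.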
CCE: How accurate have your numbers been?
Pometa: Final crop load is as accurate as hand measurements with the fruit growth rate model; yield estimations for harvest have been within +/- 5% (within ~3 weeks of harvest).
VIVID Technology: 90% accuracy, with variation between farms and varieties.
Orchard Robotics: For full block yields we have demonstrated 93% accuracy. For fruitlets, we are within +/- 10% sizing accuracy at the earliest growth stages, and this accuracy improves throughout the season.
CCE: How early can fruitlet size be assessed?
Pometa: 5 mm for hand scans; 25 mm for ATV scans.
VIVID Technology: 10mm.
Orchard Robotics: About 5 mm; sizing accuracy is within 10% at 10 mm.
CCE: What is the set up and ground-truthing process?
Pometa: Install the iPhone app. Must use an iPhone 12, 13, or 14 Pro or Pro Max. To reach the tops of trees, mount the phone on a 3′ to 6′ pole. ATV scans require mounting the iPhone ~6 ft off the ground on a fixed pole attached to the front of an ATV; we recommend a Quad Lock motorcycle mount. Install plastic markers (~3 inches) on two trellis posts to define reference row segments. For common training systems, ground truth data are not required. Detailed vertical scans of the reference segments are used to predict occlusion for ATV driving (30 to 60 seconds/scan). For an unfamiliar training system, high-quality ground truth measurements of six individual trees should be collected throughout the season.
VIVID Technology: Our team conducts the initial farm mapping and software set-up. We also provide a mounting system to attach the camera for scanning that can be left on overnight. The amount of ground truthing depends on how much of the orchard is scanned. The person operating the camera can do the ground truthing; it takes only as long as counting a tree and sizing a sample of the fruit on it. For 2023, Vivid Machines will help by providing as much ground truthing as possible as part of the service.
Orchard Robotics: A few minutes to mount the camera to a tractor, gator, or UTV. No additional infrastructure is necessary to start scanning. The system requires a one-time setup of your orchard structure for reference (telling it the name, variety, and location of each block). For accurate absolute data, we highly recommend calibration counts to inform our system’s occlusion models. Calibration counts are done by the grower, and the time varies depending on the number of calibration counts and blocks, but calibrating a single block should not take more than an hour of counting.
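The reason calibration counts matter is that cameras inevitably miss fruit hidden behind leaves and limbs, so raw camera counts understate the true crop load. One simple way to see the idea is a proportional correction: estimate a block-specific detection rate from a few hand-counted calibration trees, then scale the camera's block total by it. This is an illustrative sketch only, not Pometa's or Orchard Robotics' actual occlusion model; all counts below are made up.

```python
def detection_rate(camera_counts, hand_counts):
    """Estimate the fraction of fruit the camera detects,
    from paired camera and hand counts on calibration trees."""
    return sum(camera_counts) / sum(hand_counts)

def corrected_block_count(observed_total, rate):
    """Scale the raw camera count up to an estimated true count."""
    return observed_total / rate

# Three calibration trees: camera saw 80, 92, and 75 apples
# where hand counts found 100, 110, and 95.
rate = detection_rate([80, 92, 75], [100, 110, 95])
estimate = corrected_block_count(24_700, rate)  # raw camera total for the block
```

In practice the correction varies with training system, canopy density, and growth stage, which is why vendors ask for calibration counts per block rather than a single universal factor.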
CCE: What is your pricing structure?
Pometa: $100/acre starting price for a minimum of 100 acres (unlimited use for the season; the per-acre price declines substantially with volume); $1,000/orchard one-time setup cost.
VIVID Technology: $5,000 per year hardware lease and an $80 per acre subscription fee. Talk to us about our 2023 new-customer pricing.
Orchard Robotics: Camera system sold at cost for $10,000, with an option to lease a system for $4,000/year. Free camera upgrades. Software subscription at $96/acre/year. Risk-mitigation pricing for first-year users.
CCE: What is the data collection process and how much of the data collection and processing is automated?
Pometa: Scans are uploaded and processed automatically when the iPhone connects to Wi-Fi. Hand-held scans cover reference segments between two marked trellis posts; growers set up ~10 of these per block. Scanning takes 30-60 seconds depending on fruit size and post distance. Scans are used to measure fruit growth rates and to build occlusion models for ATV scanning. During the fruitlet phase, growers scan reference segments every 3-4 days to predict fruitlet drop. Ideally, growers scan blocks with an ATV mount after fruit set to produce the first harvest forecast, then again 1-2 more times before harvest to dial in that forecast. In bin-scanning mode, post-harvest, growers can pass the iPhone over a bin to obtain size and color distributions for their harvest.
VIVID Technology: The user needs to hit ‘start/stop’ on the recording. Once the sensor is plugged in – all data upload and visualization is automatic. Camera sensor updates are done automatically.
Orchard Robotics: The entire process is automated. Doing an orchard scan is a simple, two-click process: one click to start the scan, and one click to stop it. All of the processing is handled automatically after the scan concludes and does not require any additional work from the grower (other than plugging the camera in at the end of the day to recharge!). After a couple of hours of processing time, growers can view the data either on our provided tablet or on our website. We supply everything you need to start scanning (a tablet to control the camera, the camera system itself, and an external battery and charger).
CCE: How long does it take data to be processed into an actionable report?
Pometa: An hour for the reference segment, overnight for ATV full block scans.
VIVID Technology: Immediate data to growers in the orchard on a tablet/phone as soon as they stop recording. The app provides immediate fruit counts, average fruit size/tree, and tree counts. Once the grower connects to the internet, the data gets uploaded to the cloud, and the predictions display on a dashboard by 9am the next morning. This interactive dashboard provides all of the data insights collected to date throughout the season.
Orchard Robotics: We have a very powerful computer inside the camera system that processes all the data on-device, which means you do not need a fast internet connection to upload tons of data. This also means data can be returned as an actionable report quickly, usually within 2x the scan time (e.g., for a 5-hour scan, you will have the data back within 10 hours, and this is something you can leave running overnight).
CCE: Can I integrate your hardware/software over my existing equipment?
Pometa: Yes. iPhone can be mounted on ATV or gator.
VIVID Technology: Yes, we have built a mounting system that can be adapted for different farm equipment allowing our sensor to be attached to a variety of vehicles.
Orchard Robotics: No additional infrastructure is necessary to start scanning.
CCE: How do we view the data?
Pometa: Web application for data display; iPhone app for data collection.
VIVID Technology: We have an app and a dashboard. The app runs on any phone or tablet. The data is aggregated at a row level at the moment, but individual trees can also be selected. The cloud-based dashboard can be viewed in a website browser and allows growers to get a more extensive view of their orchard. You can filter by date, variety, block etc. The dashboard provides information such as growth curves and size and count distribution.
Orchard Robotics: Scans are run on a tablet app we have developed. This tablet interface lets growers start, stop, and view orchard scans + the status of the camera system. We have a mobile/tablet app to view processed data immediately in the field, as well as a cloud-based website where growers can access, interact with, and export data.
CCE: What do you offer in terms of tech support?
Pometa: Remote support for East Coast and Midwest growers. Targeted in-field training and support for larger Pacific Northwest producers.
VIVID Technology: We provide field staff to scan and collect ground truth points for growers. Field staff are available to email, call, or message for quick answers. More technical support available should product suggestions or more complicated issues arise. Currently, our field staff communicates with the technical team on behalf of growers.
Orchard Robotics: Full on-site support and servicing whenever a grower needs it – just give us a call and we’ll be there!
Short Ag-Tech Summary of Main Light Sensors and Imaging Technologies (available as services through several Ag-tech companies in the United States and Europe today)
Much like any plant growth regulator, insecticide, fungicide, or herbicide, knowing how sensors work (or don't) is crucial to using them effectively. Commercially available sensors, most commonly digital and multispectral, are useful in orchards for determining crop growth stage and assessing general crop health. The following is a brief overview of these technologies.
Digital cameras operate within the visible light range (red, green, blue) and serve as ‘eye extenders’. They are relatively low-cost, readily available, easy to use, and small. While powerful, they provide limited information. They are best for looking at properties over a wide range: greenness, growth, weeds, pests, and visible disease. They have potential integrated pest management (IPM) uses for tree/vine training, crop/canopy management, and disease management.
Multispectral sensors operate by sensing discrete segments of visible and near-infrared (NIR) light and are good for general crop stress detection and indirect problem identification. They can be imprecise and provide limited information, but are budget-flexible, as they are available across a wide price range. They can be used to compute the normalized difference vegetation index (NDVI) for the crop stand, which is a good indicator of general crop stress from multiple causes, such as nutrient deficiency, water stress, weeds, pests, and diseases. Potential IPM uses are tree/vine training, crop/canopy management, tree/vine nutrition, irrigation, and disease management.
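NDVI itself is a simple ratio of band reflectances: (NIR − Red) / (NIR + Red), ranging from −1 to 1, with higher values indicating denser, greener vegetation. A minimal sketch (the reflectance values are illustrative, not from any particular sensor):

```python
def ndvi(nir, red):
    """Normalized difference vegetation index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

# Healthy canopy reflects strongly in NIR and absorbs red light,
# so it scores higher than a stressed canopy (example values).
healthy = ndvi(0.50, 0.08)
stressed = ndvi(0.30, 0.20)
```

Because the index is normalized by total reflectance, it is fairly robust to overall illumination differences, which is part of why it works across sensors in a wide price range.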
Hyperspectral sensors operate across a continuous light range from visible to shortwave infrared (SWIR) wavelengths. Rather than sensing only a few key wavelengths like multispectral sensors, hyperspectral sensors collect hundreds of wavelengths. They have potential applications in direct problem identification and trait quantification. They are currently very expensive, need more commercial development, and require expert interpretation. They can be used for detecting and quantifying specific biotic and abiotic stresses.
Thermal sensors operate using longwave infrared light, with applications in temperature monitoring and in detecting properties that change plant temperature, including water content, water stress, and diseases that impact plant vascular activity. They are available at moderate to high cost; high-resolution units are heavy, and the data they collect can become 'noisy'. Potential IPM uses include soil management, tree/vine nutrition, irrigation, and disease management.
LiDAR (Light Detection and Ranging) operates in a narrow light region, either NIR or SWIR. Applications include measuring plant structure, plant height, and biomass. These systems can be high cost and are generally better suited to aerial rather than ground-based use. They can be used to measure elevation, plant height, leaf volume, and canopy density. Potential IPM uses include site selection, vine training, and crop/canopy management.