Editor's Note: This article originally appeared in the May/June 2025 print edition of Produce Grower under the headline “Advanced robotics in CEA.”

Throughout much of the 20th century, we imagined robots and AI would evolve in lockstep. According to one business writer, “There was a sense that technological ‘minds’ and ‘bodies’ would make their leaps forward together.”
Why this hasn’t happened is best explained by Moravec’s Paradox. To paraphrase: What is easy for a human is hard for a robot, and vice versa.
This makes sense when we consider that human survival through millennia depended more upon perception, cognition and rapid hand-eye coordination than upon math and puzzle-solving. We take for granted how easy it is for us to push aside soft leaves, grasp a soft fruit and judge how much pressure to apply depending on its ripeness. For a robot, perceiving a ripe tomato or pepper fruit among all the leaves and stems is a feat, and deciding what path to steer its arm and end effector — the grasper, scissors or vacuum at the end of the arm — even more so.
The greenhouse is a difficult place. Compared to an industrial building, it is cluttered, dusty and humid, and the quarters are tight. Light is extremely variable due to shadows and shifting sunlight. Varying crops challenge any standardization. A robot must be able to handle plants of different sizes, shapes and cultivation configurations. Many facilities will need modifications to accommodate robotic systems effectively — a big ask.

Advanced robotics will lag behind AI in CEA. Long timelines for commercialization hamper investment, and the challenges are complex. Only a handful of advanced robots are currently available, and most are only for tomatoes. Complete robotization of cultivation will require longer than the 10-year horizon of this RII guide, as it may depend on modifying cultivation systems that have been tested and refined over the last few decades.
Despite this clear-eyed assessment, the AI & Robotics Working Group believes most plant-touching tasks in greenhouse cultivation will see robotization by 2035, though not implemented industry-wide. Exciting and ingenious feats of engineering highlighted below account for this optimism. We focus on commercialized systems, though many more are in research and development.
Key applications and use cases
Many systems available commercially today are multitools; CEA operations can rarely afford, or find the time to learn, multiple robots from different vendors. Mobile robotic platforms consist of a base unit for navigation, power and perception, with swappable task-specific end effectors capable of achieving key applications.
Plant phenotyping: These platforms combine automated imaging systems mounted on rails or carts with computer vision algorithms to continuously track plant development, health status and projected yields. By processing data from multiple imaging and sensing technologies, including thermal cameras, LiDAR systems (Light Detection and Ranging), RGB cameras and environmental monitors, these systems provide actionable insights on key metrics like growth rates and harvest forecasting. The integration of thermal imaging reveals physiological stress responses through temperature patterns, while LiDAR generates precise 3D structural measurements of plant architecture and biomass. Alternatives to robotic platforms include handheld units that collect data as the grower walks through the crop, and aerial drones that gather data weekly.
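The link between canopy temperature and stress mentioned above is often quantified with the Crop Water Stress Index (CWSI). The sketch below shows the standard formula; the baseline temperatures are illustrative values, not crop-specific figures from the article.

```python
# A minimal CWSI sketch: canopy temperature relative to fully
# transpiring (wet) and non-transpiring (dry) reference baselines.
# Baseline values here are illustrative assumptions.

def cwsi(canopy_temp_c, wet_baseline_c, dry_baseline_c):
    """Return 0 for an unstressed canopy, 1 for a fully stressed one."""
    return (canopy_temp_c - wet_baseline_c) / (dry_baseline_c - wet_baseline_c)

# Example: a 26 C canopy between 22 C (wet) and 30 C (dry) baselines
score = cwsi(26.0, 22.0, 30.0)
print(round(score, 2))  # 0.5: moderate stress
```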
Insect and disease scouting: Cart-mounted or rail-based systems scan foliage at multiple heights and angles, using specialized cameras to identify pest infestations and disease symptoms, in some cases before they become visible to human scouts. Detection systems primarily employ high-resolution RGB cameras with macro lenses for close-up pest identification, along with specialized ultraviolet (UV) and polarized light imaging that enhances visibility of insect activity, webbing and early disease lesions. These systems detect subtle color changes, irregular patterns and physical damage associated with specific pests and pathogens. This imaging augments traditional IPM programs by providing continuous monitoring, precise issue mapping and predictive analytics for pest population dynamics.
Pruning and de-leafing robots: Automated leaf removal has become increasingly important for improving light penetration and air circulation in dense crops like tomatoes and cucumbers. Specialized end effectors can detect leaves and safely remove them while avoiding damage to stems and fruit. These systems can operate 24/7 and can significantly reduce labor requirements for this repetitive task. One patented technology brings the plant to the robot, rather than sending the robot to the plant. It conveys aeroponically grown tomato plants through a series of robots — “like a car in a car wash,” the company reports — stripping lower leaves and harvesting ripe fruits before returning them to the cultivation room.
UV-light treatment robots: The use of UV-C light at a 254 nm wavelength for disease control, particularly against powdery mildew, has emerged as a promising robotic application. Autonomous UV treatment can effectively replace chemical fungicides in some applications. UV light panels installed on mobile platforms move through greenhouse rows, illuminating plants on either side. These systems are particularly valuable because they can operate at night, when UV treatments are most effective, and worker exposure is eliminated.

Precision pesticide spraying robots: Robotic pesticide sprayers move autonomously between crop rows, sometimes spraying only the plants designated by a grower or a pest detection robot. They can operate at night to reduce worker exposure. Modern spray robots incorporate drift-reduction technology and can adjust application rates based on plant density and disease pressure. University testing indicates this method can reduce the volume of pesticide applied by up to 42%. Manufacturers report reduced chemical use of 20 to 30% compared to traditional methods. Workers normally assigned to apply pesticides by traditional methods are spared the heat stress associated with wearing personal protective equipment in a hot greenhouse.
Vine lowering robots: This is the most recent development in greenhouse robotics and tackles one of the most strenuous tasks of tomato and cucumber production: the lowering and leaning of vines every seven to 10 days as they grow.

Automated guided vehicles (AGVs): These robots have become increasingly common in larger greenhouse operations, handling the movement of harvested produce, growing media and other materials throughout the facility. The integration of AGVs with harvesting systems has proved particularly valuable. Collection robots can automatically follow human harvesters or robotic harvesting systems, reducing the physical burden on workers and improving operational efficiency.
Harvesting robots: Harvesting remains one of the most economically significant applications for robotics in CEA. As reported by one expert, a majority of robots that have been commercialized today are tomato harvesters. Robots for sweet pepper, strawberry and cucumber are also available. Several mushroom harvesters for CEA operations are available, and greenhouse cut flower rose and gerbera systems are in development. Harvest systems employ computer vision to determine ripeness, reach with robotic arms and pick fruit using end effectors that grasp, cut or vacuum-grip. They collect and store their harvest on the mobile platform, making them autonomous.
Cutting systems have achieved higher success rates for leafy greens due to the crop’s more uniform and accessible nature. These robots typically use vision-guided cutting tools that can harvest entire heads of lettuce or make precise cuts for repeat harvesting of younger greens. Berry harvesting presents different challenges due to the fruit’s delicate nature. Strawberry harvesting robots often use soft grippers with integrated force sensors to handle berries without damage. These systems can achieve picking speeds of 4.6 seconds per berry when operating in optimized conditions.
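The 4.6-second cycle time quoted above translates directly into throughput. The sketch below does that arithmetic; the 80% uptime figure is an illustrative assumption, not a number from the article.

```python
# Rough throughput arithmetic from a per-berry cycle time.
# uptime_fraction is an assumed duty cycle (travel, row changes, faults).

def berries_per_hour(seconds_per_berry, uptime_fraction=1.0):
    """Berries picked per hour at a given cycle time and duty cycle."""
    return 3600.0 / seconds_per_berry * uptime_fraction

print(round(berries_per_hour(4.6)))        # ~783 per hour, nonstop
print(round(berries_per_hour(4.6, 0.8)))   # ~626 per hour at 80% uptime
```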
Harvesting robots still need improvements. A 2024 academic review noted that better performance is needed in all elements, including “... quick location of ripe fruits, a correct separation of fruit and plants and the management of all harvested fruits. Furthermore, the speed of operation of the robot is in general much lower than the human operator speed harvesting a fruit or vegetable.” Another 2024 source reported harvest efficiency rates of 83 to 88% but noted that these figures were derived from simplified conditions, including surrounding leaves being removed, fruits separated manually and plant spacing increased. To their credit, manufacturers will admit that the technology is between a prototype and a completed system.
Sorting and grading systems: While automated sorting has existed for some time, modern robotic systems incorporate advanced computer vision and AI to achieve higher accuracy and consistency. These systems can grade produce based on size, color, ripeness and quality, often exceeding human consistency in quality assessment. Beyond simple sorting, these systems can help identify quality issues arising from growing conditions or improper harvesting techniques, providing valuable feedback for operational improvement.
Greenhouse navigation of advanced robotics
The ability to move safely and precisely through a greenhouse environment is fundamental to robotic operation. Fixed rail systems remain the most reliable solution for many applications, particularly in established greenhouse operations. One commercial harvesting robot demonstrates the effectiveness of using existing heating pipe rails between rows for movement while maintaining the ability to transition to floor operation when needed. Modern rail systems have evolved to incorporate multiple levels for different operations, automatic switching systems for row changes, and integrated power delivery through the rail infrastructure.
For free-moving robots, floor navigation can be aided by painted lines or magnetic strips, conductive wires embedded in the floor, RFID beacons, GPS or computer vision. LiDAR technology is also a very effective navigation tool, creating detailed 3D maps of the environment by measuring distances using laser pulses, helping robots understand their surroundings and avoid obstacles.
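A 2D LiDAR scan is just a list of range readings at known bearing angles; converting it to obstacle coordinates and checking a safety radius is the simplest building block of the obstacle avoidance described above. The sketch below assumes illustrative ranges and a 0.5 m stop threshold, not parameters of any specific commercial robot.

```python
import math

# Minimal sketch: convert a 2D LiDAR scan to robot-frame points
# and flag any return inside an assumed 0.5 m safety radius.

def scan_to_points(ranges, angle_min, angle_step):
    """Convert range readings at evenly spaced bearings to (x, y) points."""
    points = []
    for i, r in enumerate(ranges):
        a = angle_min + i * angle_step
        points.append((r * math.cos(a), r * math.sin(a)))
    return points

def too_close(ranges, stop_distance=0.5):
    """True if any return falls within the safety radius."""
    return any(r < stop_distance for r in ranges)

# Example: three returns swept across the front of the robot
ranges = [2.1, 0.4, 1.8]
pts = scan_to_points(ranges, -math.pi / 4, math.pi / 4)
print(too_close(ranges))  # True: an obstacle 0.4 m away triggers a stop
```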

Specialized end effectors can detect leaves and safely remove them while avoiding damage to stems and fruit.
Photo courtesy of RII and Octiva
Computer vision technology
Vision systems are crucial for most CEA robotic applications but face several challenges unique to the greenhouse environment. Modern systems typically employ RGB-D cameras that combine standard color imaging (red, green and blue) with depth information, operating at high resolutions and frame rates to capture detailed information about plants and their environment. These cameras typically operate with resolutions of 1920x1080 for color imaging and 640x480 for depth sensing, with optimal operating ranges between 0.5 and 4.5 meters.
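The depth channel is what turns a detection into an arm target: a pixel plus its depth reading can be back-projected to a 3D point with the standard pinhole camera model. The intrinsic parameters (fx, fy, cx, cy) below are illustrative assumptions, not values from any camera cited in the article.

```python
# Minimal pinhole back-projection sketch: pixel (u, v) plus depth
# (meters) -> 3D point in the camera frame. Intrinsics are assumed
# illustrative values for a 640x480 depth sensor.

def pixel_to_3d(u, v, depth_m, fx=615.0, fy=615.0, cx=320.0, cy=240.0):
    """Return (x, y, z) in meters for a pixel at the given depth."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# Example: a fruit detected at pixel (400, 260), 1.2 m from the camera
point = pixel_to_3d(400, 260, 1.2)
```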
Computer vision not only aids navigation and the perception of objects for pruning or harvesting; it can also assist cultivation strategy, such as steering vine crop development between vegetative and generative growth, or determining the proper spacing of lettuce to improve harvest yield.

Lighting conditions present a particular challenge in greenhouse environments. Varying sunlight can interfere with computer vision systems, leading many robots to perform better at night or under controlled lighting conditions. To address this, advanced systems employ specialized LED arrays synchronized with camera exposure, strobed lighting to reduce motion blur and near-infrared illumination for night operation. Some commercial systems include sophisticated lighting systems that create consistent conditions for their visual sensors, regardless of ambient light levels.
Plant occlusion represents another significant challenge for vision systems. When plants overlap and obscure each other, robots struggle to identify individual fruits or plant parts. Modern systems address this through several sophisticated approaches:
- Multi-angle imaging using three to four cameras per monitoring station.
- Dynamic positioning allowing robotic arms to move cameras to different viewpoints.
- Active perception systems that can resolve ambiguous situations.
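Multi-angle imaging only helps if detections from different viewpoints can be fused without double-counting the same fruit. The sketch below shows one simple way to do that, merging 3D detections that fall within a small distance of each other; the coordinates and the 5 cm threshold are illustrative assumptions.

```python
# Minimal multi-view fusion sketch: union fruit detections across
# camera viewpoints, treating 3D points closer than min_separation
# meters as the same fruit. All values are illustrative.

def merge_detections(views, min_separation=0.05):
    """Merge per-view lists of (x, y, z) points into unique fruits."""
    merged = []
    for view in views:
        for p in view:
            if all(sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
                   >= min_separation for q in merged):
                merged.append(p)
    return merged

# Two viewpoints see three fruits; one fruit is visible from both angles
view_a = [(0.10, 0.20, 1.00), (0.30, 0.25, 1.10)]
view_b = [(0.11, 0.20, 1.01), (0.50, 0.40, 1.20)]
print(len(merge_detections([view_a, view_b])))  # 3 unique fruits
```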
End effectors and object manipulation
The end effector — the part of the robot that interacts with plants and produce — has evolved into an increasingly sophisticated system. One such gripper, inspired by fish fin movement, uses soft, flexible fingers that can adapt to different fruit shapes while maintaining gentle handling. These systems often incorporate force-feedback mechanisms to prevent damage to delicate produce, or use underactuated designs whose fingers passively conform to the fruit.
Suction-based systems offer another approach to harvesting, using multi-zone suction cups with independent vacuum control for different-sized products. Many modern end effectors combine multiple techniques — for instance, one commercial model uses a combination of visual sensing, soft gripping and precise cutting mechanisms to harvest sweet peppers while avoiding damage to the plant.
End effectors have become equally sophisticated for crop maintenance tasks. As described earlier, UV treatment systems include adjustable UV-C arrays with light intensity monitoring and automatic height adjustment. Precision spraying systems incorporate electrostatic charging and multiple nozzle configurations for optimal coverage while minimizing chemical use.
