
Computer vision system that guides robotic harvesters by detecting, sizing, and orienting mushroom caps in real time


Context
Mushroom farms running robotic arms still depend on human operators to judge which caps are ready to pick and at what angle. ShroomVision replaces that manual step with a camera-based perception system designed for the constraints of the farm floor. Pointed at a tray, it detects every visible cap, traces its exact contour, measures its diameter in centimeters, and calculates the tilt angle at which the gripper should approach each mushroom. No GPU and no cloud connection are required. The system runs on lightweight embedded hardware, processes a full tray in under two seconds, and calibrates its measurements automatically using the laser grid already present in the setup. The output is a structured list sortable by maturity, tilt, or tray zone. The robot picks in the sequence that maximizes yield, acting on contours, sizes, and orientations rather than on approximate bounding boxes.
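To make the sortable output concrete, here is a minimal sketch of what a per-cap record and pick ordering could look like. The field names, maturity threshold, and sample values are illustrative assumptions, not the system's actual schema.

```python
from dataclasses import dataclass

# Hypothetical per-cap record; field names and the 4 cm maturity
# threshold are illustrative, not the system's actual schema.
@dataclass
class CapDetection:
    cap_id: int
    contour: list          # traced boundary as (x, y) pixel points
    diameter_cm: float     # real-world cap diameter
    tilt_deg: float        # gripper approach angle
    tray_zone: str         # e.g. "A1"

    @property
    def mature(self) -> bool:
        # Illustrative rule: caps at or above 4 cm are pickable.
        return self.diameter_cm >= 4.0

caps = [
    CapDetection(1, [(10, 12)], 4.8, 12.0, "A1"),
    CapDetection(2, [(40, 15)], 3.1, 5.0, "A2"),
    CapDetection(3, [(70, 20)], 5.2, 20.0, "B1"),
]

# Sort largest (most mature) first, flattest tilt breaking ties.
pick_order = sorted(caps, key=lambda c: (-c.diameter_cm, c.tilt_deg))
print([c.cap_id for c in pick_order])  # → [3, 1, 2]
```

The same list can be re-keyed on `tilt_deg` or `tray_zone` for the other sort orders mentioned above.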



Defining the Harvesting Problem
The project started from a clear bottleneck: robotic arms waiting for human confirmation before each pick. The goal was to deliver per-cap contour, diameter, and approach angle from a single camera, with no retraining or labeling pipeline.
Embedded Architecture and Calibration Design
The pipeline was architected to run entirely on embedded hardware without GPU acceleration. Spatial calibration relies on the laser grid already installed in the client's setup, eliminating the need for external calibration targets. The output schema was designed to be sortable and directly consumable by the robot controller.
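The calibration idea above can be sketched simply: if the physical pitch of the laser grid is known, the pixel distance between two adjacent grid dots yields a cm-per-pixel scale that converts measured pixel diameters to centimeters. The 2.0 cm pitch and the dot coordinates below are illustrative assumptions.

```python
import math

# Assumed physical spacing between adjacent laser-grid dots (illustrative).
GRID_PITCH_CM = 2.0

def cm_per_pixel(dot_a, dot_b):
    """Scale factor from the pixel distance between two adjacent grid dots."""
    px = math.dist(dot_a, dot_b)
    return GRID_PITCH_CM / px

def diameter_cm(diameter_px, scale):
    """Convert a measured pixel diameter to centimeters."""
    return diameter_px * scale

scale = cm_per_pixel((120.0, 80.0), (160.0, 80.0))  # dots 40 px apart
print(round(diameter_cm(96.0, scale), 2))  # → 4.8
```

Because the grid is fixed in the setup, this scale can be re-derived on every session with no external calibration target.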


Detection, Measurement, and Output Pipeline
The computer vision pipeline was built to detect cap boundaries, compute real-world diameters in centimeters, and estimate tilt angles frame by frame. Processing time was kept under two seconds per tray. The structured output list integrates directly with the robotic arm's pick-order logic.
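One common way to estimate tilt from a traced contour, offered here as a hedged sketch rather than the system's actual method, is foreshortening: a roughly circular cap viewed off-axis projects as an ellipse, so the tilt angle is approximately arccos(minor/major). The axis lengths would come from an ellipse fit to the contour (e.g. OpenCV's `fitEllipse`); the numbers below are illustrative.

```python
import math

def tilt_from_axes(major_px: float, minor_px: float) -> float:
    """Approximate cap tilt (degrees) from fitted ellipse axis lengths.

    A circular cap tilted by angle t projects with minor/major = cos(t),
    so t = arccos(minor/major). Ratio is clamped against fit noise.
    """
    ratio = max(0.0, min(1.0, minor_px / major_px))
    return math.degrees(math.acos(ratio))

# A cap whose projection is foreshortened to ~86.6% of its width
# is tilted by roughly 30 degrees.
print(round(tilt_from_axes(100.0, 86.6), 1))  # → 30.0
```

Under the same fit, the major axis doubles as the pixel diameter fed into the calibration scale.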
Validation and Farm-Floor Deployment
The system was tested against real trays under production lighting conditions. Measurement accuracy was validated against physical references. Calibration stability across sessions was verified using the laser grid. The final build was deployed on the client's embedded hardware with no dependency on external infrastructure.
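The validation described above amounts to an acceptance check: measured diameters of physical reference objects must stay within a tolerance across calibration sessions. The reference size, tolerance band, and readings below are illustrative assumptions, not the project's actual acceptance criteria.

```python
# Hypothetical acceptance check for calibration stability.
REFERENCE_CM = 5.00   # known diameter of a physical reference object
TOLERANCE_CM = 0.10   # illustrative acceptance band

session_readings = [4.97, 5.04, 4.95, 5.06]  # one measurement per session

def stable(readings, reference, tol):
    """True if every session's measurement stays within tolerance."""
    return all(abs(r - reference) <= tol for r in readings)

print(stable(session_readings, REFERENCE_CM, TOLERANCE_CM))  # → True
```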


Computer Vision Engineer
Embedded Systems Developer
Robotics Integration Engineer
Calibration & Metrology Specialist

