Search results
36 results
PhenoRob - Robotics and Phenotyping for Sustainable Crop Production
PhenoRob performs world-leading research in robotics and phenotyping for sustainable crop production. Our vision is to transform crop production by optimizing breeding and farming management through developing and deploying new technologies. PhenoRob addresses a real-world problem with a technology-oriented approach. Our scientists have backgrounds in computer science, geodesy, robotics, plant science, soil science, economics, and environmental science. This interdisciplinary team forms the only DFG-funded Cluster of Excellence focusing on agriculture.
Active Perception
Active perception addresses the question of where to look next to gain the most information. This is an important task in any application where occlusions occur and scenes need to be observed from different angles. Especially for robots operating in complex, unstructured environments, active perception is key to efficient spatio-temporal mapping.
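As a rough illustration of this idea, the sketch below scores a set of candidate viewpoints by how many still-unknown voxels of an occupancy grid they would observe and picks the best one. The voxel grid, the cone-shaped visibility test, and all names are simplifying assumptions made for exposition, not code from any of the projects listed here.

# Minimal next-best-view sketch: score candidate views by expected
# information gain (number of unknown voxels they would observe).
# All names and the cone-based visibility test are illustrative assumptions.
import numpy as np

UNKNOWN, FREE, OCCUPIED = 0, 1, 2

class VoxelGrid:
    """Small occupancy grid over a cubic workspace."""
    def __init__(self, size=20, resolution=0.05):
        self.size = size                      # voxels per axis
        self.resolution = resolution          # metres per voxel
        self.states = np.full((size, size, size), UNKNOWN, dtype=np.uint8)

    def voxel_centers(self):
        idx = np.indices((self.size,) * 3).reshape(3, -1).T
        return (idx + 0.5) * self.resolution

def expected_information_gain(grid, view_origin, view_dir, fov_cos=0.8, max_range=1.0):
    """Count unknown voxels inside a cone-shaped view frustum.

    A real system would ray-cast through the grid and stop at occupied
    voxels; the cone test here only illustrates the scoring idea.
    """
    centers = grid.voxel_centers()
    to_voxel = centers - view_origin
    dist = np.linalg.norm(to_voxel, axis=1)
    in_range = (dist > 1e-6) & (dist < max_range)
    cos_angle = np.zeros_like(dist)
    cos_angle[in_range] = (to_voxel[in_range] @ view_dir) / dist[in_range]
    visible = in_range & (cos_angle > fov_cos)
    unknown = grid.states.reshape(-1) == UNKNOWN
    return int(np.count_nonzero(visible & unknown))

def next_best_view(grid, candidate_views):
    """Pick the candidate view with the highest expected information gain."""
    scores = [expected_information_gain(grid, o, d) for o, d in candidate_views]
    best = int(np.argmax(scores))
    return candidate_views[best], scores[best]

if __name__ == "__main__":
    grid = VoxelGrid()
    # Candidate (origin, unit direction) pairs around the workspace.
    candidates = [
        (np.array([0.5, -0.5, 0.5]), np.array([0.0, 1.0, 0.0])),
        (np.array([-0.5, 0.5, 0.5]), np.array([1.0, 0.0, 0.0])),
        (np.array([0.5, 0.5, 1.5]), np.array([0.0, 0.0, -1.0])),
    ]
    best, gain = next_best_view(grid, candidates)
    print("best view origin:", best[0], "expected gain:", gain)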
AID4Crops - Automation and AI for Monitoring and Decision Making of Horticultural Crops
AID4Crops brings together research on what can be sensed with what should be sensed. To do this, we will develop novel AI algorithms that enable autonomous monitoring (sensing) and management (forecasting and decision making) for horticultural crops. These approaches will be deployed in horticulture because it provides a set of realistic yet challenging environments, with crops grown both indoors (in glasshouses) and outdoors (in orchards); the indoor environment provides greater control over crop growth.
Safe Leaf Manipulation for Accurate Shape and Pose Estimation of Occluded Fruits
DM-OSVP++: One-Shot View Planning Using 3D Diffusion Models for Active RGB-Based Object Reconstruction
Graph-based View Motion Planning for Fruit Detection
Perception for Humanoid Robots
Deep Reinforcement Learning for Next-Best-View Planning in Agricultural Applications
Fruit Mapping with Shape Completion for Autonomous Crop Monitoring
Viewpoint Planning for Fruit Size and Position Estimation
PATHoBot: A Robot for Glasshouse Crop Phenotyping and Intervention
Online Object-Oriented Semantic Mapping and Map Updating
Combining Local and Global Viewpoint Planning for Fruit Coverage
Gradient and Log-based Active Learning for Semantic Segmentation of Crop and Weed for Agricultural Robots
Speeding Up Person Finding Using Hidden Markov Models
GPU-Accelerated Next-Best-View Exploration of Articulated Scenes
People Finding under Visibility Constraints using Graph-Based Motion Prediction
HortiBot: An Adaptive Multi-Arm System for Robotic Horticulture of Sweet Peppers
Horticultural tasks such as pruning and selective harvesting are labor-intensive, and horticultural staff are hard to find. Automating these tasks is challenging due to the semi-structured greenhouse workspaces, changing environmental conditions such as lighting, dense plant growth with many occlusions, and the need for gentle manipulation of non-rigid plant organs. In this work, we present the three-armed system HortiBot, with two arms for manipulation and a third arm as an articulated head for active perception using stereo cameras. Its perception system detects not only peppers but also peduncles and stems in real time, and performs online data association to build a world model of pepper plants. Collision-aware online trajectory generation allows all three arms to safely track their respective targets for observation, grasping, and cutting. We integrated perception and manipulation to perform selective harvesting of peppers and evaluated the system in lab experiments. Using active perception coupled with end-effector force-torque sensing for compliant manipulation, HortiBot achieves high success rates.
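As a loose sketch of the "online data association to build a world model" step mentioned above, the snippet below matches per-frame fruit detections to a persistent set of tracked fruits using nearest-neighbour gating and updates each matched fruit with a running average. The gating threshold, class names, and update rule are illustrative assumptions, not the method used in the paper.

# Illustrative data-association sketch: associate detected fruit centroids
# with a persistent world model. Gating threshold and update rule are assumed.
from dataclasses import dataclass, field
import numpy as np

@dataclass
class TrackedFruit:
    position: np.ndarray          # 3D centroid in the world frame
    observations: int = 1

@dataclass
class WorldModel:
    fruits: list = field(default_factory=list)
    gate: float = 0.05            # association threshold in metres (assumed)

    def update(self, detections):
        """Associate each detected centroid with an existing fruit or add a new one."""
        for det in detections:
            det = np.asarray(det, dtype=float)
            if self.fruits:
                dists = [np.linalg.norm(f.position - det) for f in self.fruits]
                best = int(np.argmin(dists))
                if dists[best] < self.gate:
                    f = self.fruits[best]
                    # Running average keeps the estimate stable under noisy detections.
                    f.position = (f.position * f.observations + det) / (f.observations + 1)
                    f.observations += 1
                    continue
            self.fruits.append(TrackedFruit(position=det))

if __name__ == "__main__":
    model = WorldModel()
    model.update([[0.40, 0.10, 1.20], [0.55, 0.05, 1.10]])   # frame 1
    model.update([[0.41, 0.11, 1.19], [0.80, 0.00, 1.00]])   # frame 2: one match, one new
    for i, f in enumerate(model.fruits):
        print(f"fruit {i}: pos={np.round(f.position, 3)}, seen {f.observations}x")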