Lab Course Humanoid Robots
Robots are versatile systems that provide vast opportunities for active research and a wide range of applications. Humanoid robots, for example, have a human-like body and can therefore act in environments designed for humans: they can, e.g., climb stairs, walk through cluttered environments, and open doors. Mobile robots with a wheeled base are designed to operate on flat ground and perform, e.g., cleaning and service tasks. Robotic arms can grasp and manipulate objects.
Participants will work in groups of 2 or 3 on one of the possible topics.
At the end of the semester, each group will give a presentation and live demonstration of their project, accompanied by oral explanation. The whole presentation should be approximately 10 minutes long. Every group member should present their part of the development in a few sentences/slides. After the presentation, each group will be asked a few questions by the HRL staff members or, preferably, by the other students. Everyone is required to be present and to watch the presentations of the other groups.
Aside from the final demonstration, every group is required to submit a lab report, due on the morning before the demonstration. Please describe the task you had to solve, how you approached the solution, what parts your system consists of, any particular difficulties you encountered, and how to compile and use your software. Please include a sufficient number of illustrations. Apart from the content, there are no formal requirements for this document. One lab report per group is sufficient; it must be pushed to the group's git repository before the lab presentation.
The lab grade will be based on the final presentation and on how well the assigned task was solved (weighted 30/70).
Participants are expected to have Ubuntu Linux installed on their personal computers. The specific requirements for each project are listed below.
The mandatory Introductory Meeting takes place in person (see important dates below).
Semester: WS
Year: 2025
Course Number: MA-INF 4214
Links: BASIS
Course Start Date: 14.10.2025
Course End Date: 02.02.2026
ECTS: 9
Responsible HRL Lecturers:
Important dates:
All interested students have to attend the Introductory Meeting. In the Introductory Meeting, we will present the projects, the schedule, the registration process, and answer your questions.
| 14.10.2025, Tuesday, 10:00-11:00hs, Room: 2.025 | Introductory Meeting (mandatory) [presentation slides] |
| 19.10.2025, Sunday | Registration deadline and topic selection on our website |
| 26.10.2025, Sunday | Registration deadline in BASIS |
| 18.12.2025, Thursday, 10:00-12:00hs | Midterm lab presentation |
| 11.02.2026, Wednesday, 10:00-13:00hs | Lab presentation and deadline for lab documentation |
After the Introductory Meeting, each participant arranges an individual schedule with the respective supervisor.
Registration
The registration is open. Register here.
Report and presentation template
Please use the following template for the written summary:
[Report template]
Please use the following template for midterm and final presentation:
[Presentation template]
Projects:

Open Sesame!: Whole-Body Control for Opening Furniture
Supervisors: Rohit Menon, Shahram Khorshidi
You will learn how to detect furniture handles with AI and use whole-body motion to let an omni-directional mobile manipulator with a five-fingered hand and a humanoid open drawers, doors, and cabinets smoothly.

Robot Butler: Open-Vocabulary Mobile Manipulation
Supervisor: Rohit Menon
You will learn how to make a robot understand everyday language commands and turn them into real pick-and-place actions on an omni-directional mobile manipulator with different sensing modalities.

Autonomous racing
Supervisor: Nils Dengler
This lab will involve fixed meetings in which the basics of the car are explained, with exercise sheets to be completed. This leads to a final project in which the task is to enable a racing car to navigate a given track autonomously.
Find the viewpoint!
Supervisor: Sicong Pan
The task is to find the viewpoint from which a given RGB image was captured, within the context of an eye-in-hand tabletop configuration.

Social force model for crowd simulation
Supervisor: Subham Agrawal
The task is to implement the social force model (SFM) for crowd simulation and to visualize it in a 3D environment.
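To give a flavor of what the SFM involves, here is a minimal 2D sketch in the spirit of the Helbing-Molnár model: each agent is driven toward its goal at a desired speed and repelled exponentially by nearby pedestrians. All parameter values and function names below are illustrative assumptions, not part of the course material.

```python
import math

# Illustrative parameters (assumed, not course-given).
TAU = 0.5        # relaxation time toward desired velocity [s]
A, B = 2.0, 0.3  # repulsion strength and decay range [N, m]
V0 = 1.3         # desired walking speed [m/s]

def social_force(pos, vel, goal, others):
    """Net force on one agent: goal-driving term + pairwise repulsion."""
    # Driving force: relax current velocity toward V0 in goal direction.
    gx, gy = goal[0] - pos[0], goal[1] - pos[1]
    dist = math.hypot(gx, gy) or 1e-9
    fx = (V0 * gx / dist - vel[0]) / TAU
    fy = (V0 * gy / dist - vel[1]) / TAU
    # Repulsive force from every other pedestrian, decaying with distance.
    for ox, oy in others:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy) or 1e-9
        mag = A * math.exp(-d / B)
        fx += mag * dx / d
        fy += mag * dy / d
    return fx, fy

def step(agents, dt=0.1):
    """One explicit Euler step over a list of (pos, vel, goal) agents."""
    new = []
    for i, (pos, vel, goal) in enumerate(agents):
        others = [a[0] for j, a in enumerate(agents) if j != i]
        fx, fy = social_force(pos, vel, goal, others)
        nvel = (vel[0] + fx * dt, vel[1] + fy * dt)
        npos = (pos[0] + nvel[0] * dt, pos[1] + nvel[1] * dt)
        new.append((npos, nvel, goal))
    return new
```

The actual project would add obstacle forces and feed the resulting trajectories into a 3D visualization.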

Robot navigation using LLMs
Supervisor: Xuying Huang
This project aims to use a large language model (LLM) to achieve robot navigation. The robot should be able to understand spoken instructions, process the command using the LLM, and convert it into specific actions (such as moving to the kitchen).
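One common pattern for this kind of pipeline is to prompt the LLM to answer in structured JSON and then parse that reply into a concrete navigation goal. The sketch below shows only the parsing side; the LLM call itself, the location map, and all names are illustrative assumptions.

```python
import json

# Hypothetical map of named locations to 2D waypoints (assumed).
KNOWN_LOCATIONS = {"kitchen": (3.2, 1.5), "door": (0.0, 4.0)}

def parse_llm_reply(reply: str):
    """Turn a structured LLM reply such as
    {"action": "goto", "target": "kitchen"}
    into a ("goto", waypoint) pair; return None if the reply is not
    valid JSON or refers to an unknown location."""
    try:
        cmd = json.loads(reply)
    except json.JSONDecodeError:
        return None
    target = cmd.get("target") if isinstance(cmd, dict) else None
    if isinstance(cmd, dict) and cmd.get("action") == "goto" \
            and target in KNOWN_LOCATIONS:
        return ("goto", KNOWN_LOCATIONS[target])
    return None
```

Constraining the LLM to a small JSON schema keeps the robot-side code simple and makes malformed or out-of-vocabulary replies easy to reject before they reach the motion stack.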

