A project brought to you by RSL - ETH Zurich.
References • Hugging Face • ROS1 • Contributing • Citation
More instructions can be found on the official webpage.
- Recording Setup
- Dataset Explorer
- Benchmarks
You can find Jupyter Notebooks with full instructions in the examples_hugging_face folder.
Requirements: Python>=3.11
Examples Tested with: zarr==3.0.7
Installation Instructions (UV with no preinstalled Python 3.11)
# Install uv
pip3 install uv

# Install Python 3.11 via the deadsnakes PPA (if it is not already available)
sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt update
sudo apt install -y python3.11 python3.11-venv python3.11-distutils

# Create and activate a Python 3.11 virtual environment
cd ~/git/grand_tour_dataset/examples_hugging_face
mkdir .venv; cd .venv
python3.11 -m venv grandtour
source grandtour/bin/activate

# Install the example dependencies and start Jupyter
cd ..; uv pip install -r pyproject.toml
jupyter notebook
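If you just want a quick sanity check outside of the notebooks, below is a minimal sketch of downloading and opening one of the Zarr stores locally. The repository ID, file pattern, and store name are placeholders rather than the actual dataset layout; the examples_hugging_face notebooks remain the authoritative reference for the access pattern.

```python
# Minimal sketch, assuming placeholder names; see the notebooks for the real access pattern.
from huggingface_hub import snapshot_download
import zarr

# Download (or reuse a cached copy of) the dataset files from the Hugging Face Hub.
local_dir = snapshot_download(
    repo_id="leggedrobotics/grand_tour_dataset",  # placeholder repo ID
    repo_type="dataset",
    allow_patterns=["*.zarr/*"],                  # placeholder pattern; adjust to the real layout
)

# Open a Zarr group read-only and list its contents.
root = zarr.open_group(f"{local_dir}/example_mission.zarr", mode="r")  # placeholder store name
print("groups:", list(root.group_keys()))
print("arrays:", list(root.array_keys()))
```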
List of examples:
To access and download the GrandTour dataset rosbags, please follow these steps:
- Register here: Google Form Registration
Option 1 – Command Line Interface (Recommended):
Install the CLI tool and log in:
pip3 install kleinkram
klein login
- You can now explore the CLI using tab-completion or the --help flag.
Download multiple files via Python scripting:
python3 examples_kleinkram/kleinkram_cli_example.py
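The shipped script above is the place to look; purely as an illustration of what scripting the download can look like, here is a hedged sketch that shells out to the klein CLI for a list of missions. The shipped example may use the kleinkram Python API directly instead, and the mission UUID below is a placeholder.

```python
# Hedged sketch: drive the klein CLI from Python via subprocess.
# The mission UUID below is a placeholder; `klein login` must have been run first.
import subprocess
from pathlib import Path

MISSION_IDS = [
    "00000000-0000-0000-0000-000000000000",  # placeholder mission UUID
]

def download_mission(mission_id: str, dest: Path) -> None:
    """Download one mission by running `klein download` from the target directory."""
    dest.mkdir(parents=True, exist_ok=True)
    # Mirrors the README pattern of cd-ing into the data folder before downloading.
    subprocess.run(["klein", "download", "--mission", mission_id], check=True, cwd=dest)

if __name__ == "__main__":
    for mission_id in MISSION_IDS:
        download_mission(mission_id, Path("data") / mission_id)
```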
Directly convert rosbags to PNG images (requires ROS1 installation):
python3 examples_kleinkram/kleinkram_extract_images.py
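The shipped kleinkram_extract_images.py is the reference implementation; as a rough sketch of the underlying idea (and assuming placeholder bag and topic names), extracting PNGs from a bag with the ROS1 Python API looks roughly like this:

```python
# Hedged sketch: extract compressed images from a ROS1 bag as PNG files.
# This is NOT the shipped script; bag path and topic name are placeholders.
# Requires a ROS1 (Noetic) environment: rosbag, cv_bridge, OpenCV.
import os
import cv2
import rosbag
from cv_bridge import CvBridge

BAG_PATH = "example_camera.bag"                  # placeholder bag file
TOPIC = "/gt_box/camera/image_raw/compressed"    # placeholder topic name
OUT_DIR = "png_out"

os.makedirs(OUT_DIR, exist_ok=True)
bridge = CvBridge()

with rosbag.Bag(BAG_PATH, "r") as bag:
    for topic, msg, t in bag.read_messages(topics=[TOPIC]):
        # Decode sensor_msgs/CompressedImage into an OpenCV BGR image.
        img = bridge.compressed_imgmsg_to_cv2(msg, desired_encoding="bgr8")
        cv2.imwrite(os.path.join(OUT_DIR, f"{t.to_nsec()}.png"), img)
```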
Option 2 – Web Interface:
- Use the GrandTour Dataset Web Interface to browse and download data directly.
mkdir -p ~/grand_tour_ws/src
mkdir -p ~/git
⚠️ Note: The grand_tour_box repository is currently private. We are actively working on making it public.
# Cloning the repository
cd ~/git
git clone [email protected]:leggedrobotics/grand_tour_dataset.git
cd grand_tour_dataset; git submodule update --init
# Check out only the required packages from the grand_tour_box repository for simplicity
cd ~/git/grand_tour_dataset/examples_ros1/submodules/grand_tour_box
git sparse-checkout init --cone
git sparse-checkout set box_model box_calibration box_drivers/anymal_msgs box_drivers/gnss_msgs
# Link the repository to the workspace
ln -s ~/git/grand_tour_dataset/examples_ros1 ~/grand_tour_ws/src/
cd ~/grand_tour_ws
catkin init
catkin config --extend /opt/ros/noetic
catkin config --cmake-args -DCMAKE_BUILD_TYPE=RelWithDebInfo
catkin build grand_tour_ros1
source devel/setup.bash
mkdir -p ~/grand_tour_ws/src/examples_ros1/data
cd ~/grand_tour_ws/src/examples_ros1/data
pip3 install kleinkram
klein login
klein download --mission 3c97a27e-4180-4e40-b8af-59714de54a87
roslaunch grand_tour_ros1 lidars.launch
# URDFs are automatically loaded by:
# Boxi: box_model box_model.launch
# ANYmal: anymal_d_simple_description load.launch
cd ~/grand_tour_ws/src/examples_ros1/data
# We provide an easy interface to replay the bags
rosrun grand_tour_ros1 rosbag_play.sh --help
rosrun grand_tour_ros1 rosbag_play.sh --lidars --tf_model
# We provide two tf bags:
# tf_model contains the frames required for the URDF models of ANYmal and Boxi.
# tf_minimal contains only the core sensor frames.
You can also try the same for cameras.launch.
Example Output:
LiDAR Visualization | Camera Visualization |
---|---|
Visualization of LiDAR data using lidars.launch. | Visualization of images using cameras.launch. |
We provide a launch file to uncompress images and publish rectified images. Install the required dependencies:
sudo apt-get install ros-noetic-image-transport
sudo apt-get install ros-noetic-compressed-image-transport
roslaunch grand_tour_ros1 cameras_helpers.launch
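To check the rectified stream programmatically, here is a small hedged sketch that grabs a single rectified frame and saves it to disk. The topic name is a placeholder; list the actual topics with rostopic list while the launch file is running.

```python
# Hedged sketch: save one rectified frame published by cameras_helpers.launch.
# The topic name below is a placeholder.
import cv2
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

TOPIC = "/gt_box/camera/image_rect"  # placeholder rectified-image topic

rospy.init_node("save_one_rectified_frame")
msg = rospy.wait_for_message(TOPIC, Image, timeout=10.0)
img = CvBridge().imgmsg_to_cv2(msg, desired_encoding="bgr8")
cv2.imwrite("rectified_frame.png", img)
print("Saved rectified_frame.png")
```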
We use rqt_multiplot to visualize the IMU measurements.
Install rqt_multiplot:
sudo apt-get install ros-noetic-rqt-multiplot -y
Start rqt_multiplot and replay the bags:
roslaunch grand_tour_ros1 imus.launch
cd ~/grand_tour_ws/src/examples_ros1/data
rosrun grand_tour_ros1 rosbag_play.sh --imus --ap20
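If you prefer an offline plot to the live rqt_multiplot view, here is a hedged sketch that reads IMU messages straight from a bag and plots one axis with matplotlib. The bag path and topic name are placeholders.

```python
# Hedged sketch: plot angular velocity from an IMU bag offline.
# Bag path and topic name are placeholders; rqt_multiplot (above) is the
# recommended live visualization.
import matplotlib.pyplot as plt
import rosbag

BAG_PATH = "example_imu.bag"   # placeholder bag file
TOPIC = "/gt_box/imu/data"     # placeholder IMU topic (sensor_msgs/Imu)

stamps, gyro_z = [], []
with rosbag.Bag(BAG_PATH, "r") as bag:
    for _, msg, _ in bag.read_messages(topics=[TOPIC]):
        stamps.append(msg.header.stamp.to_sec())
        gyro_z.append(msg.angular_velocity.z)

t0 = stamps[0]
plt.plot([s - t0 for s in stamps], gyro_z)
plt.xlabel("time [s]")
plt.ylabel("angular velocity z [rad/s]")
plt.title("IMU angular velocity (z)")
plt.show()
```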
We warmly welcome contributions to help us improve and expand this project. Whether you're interested in adding new examples, enhancing existing ones, or simply offering suggestions — we'd love to hear from you! Feel free to open an issue or reach out directly.
We are particularly looking for contributions in the following areas:
- New and interesting benchmarks
- ROS2 integration and conversion
- Visualization tools (e.g., Viser, etc.)
- Hosting and deployment support in Asia
We're organizing a workshop at ICRA 2026 in Vienna and are currently looking for co-organizers and collaborators. We are also planning to write a community paper about this project. Everyone who contributes meaningfully will be included as a co-author.
Let’s build this together — your input matters!
@INPROCEEDINGS{Tuna-Frey-Fu-RSS-25,
AUTHOR = {Jonas Frey AND Turcan Tuna AND Lanke Frank Tarimo Fu AND Cedric Weibel AND Katharine Patterson AND Benjamin Krummenacher AND Matthias Müller AND Julian Nubert AND Maurice Fallon AND Cesar Cadena AND Marco Hutter},
TITLE = {{Boxi: Design Decisions in the Context of Algorithmic Performance for Robotics}},
BOOKTITLE = {Proceedings of Robotics: Science and Systems},
YEAR = {2025},
ADDRESS = {Los Angeles, United States},
MONTH = {June}
}
*shared first authorship: Frey, Tuna, Fu.