
Speakers

Caroline Pascal
LeRobot Maintainer, Hugging Face
Talk: LeRobot Dataset Roadmap!

Virgile Batto
Embodied Robotics Engineer, Hugging Face
PhD from LAAS-GEPETTO (CNRS) focused on bipedal walking robot design. Now at Hugging Face working on legged robots and the LeRobot ecosystem.
Talk: HuggingFace Legged Robot!
A fully open-source 3D-printed humanoid robot built on the LeRobot ecosystem, covering design, training, and community-driven development.

Wenzhe Cai
Researcher, Shanghai AI Laboratory
Embodied AI researcher working on open-source embodied navigation foundation models.
Talk: Building Visual Navigation Foundation Models with Scalable Simulation Data
Building a general navigation system using only an external camera, leveraging scalable simulated data pipelines to train capable navigation policies.

Jinwei Gu
Principal Research Scientist & Tech Lead, NVIDIA
Tech lead on Cosmos world foundation models at NVIDIA. PhD from Columbia University. Adjunct Associate Professor at CUHK. IEEE Senior Member.
Talk: Cosmos: Unleashing Video Foundation Models for Physical AI and Generalist Robots
How large-scale pretrained video generation models can be transformed into infrastructure for robot control, data generation, and simulation, featuring Cosmos Policy, DreamDojo, and scalable policy evaluation.

Jiaming Liu
PhD Student, Peking University
Research on robotic manipulation and embodied foundation models. 20 first-author papers at CVPR, ICLR, and RSS. 3,000+ citations. 2025 ByteDance Scholarship recipient.
Talk: Enabling Robots to Both "Think" and "Act"
Unified Vision-Language-Action (VLA) foundation models integrating perception, reasoning, and control, featuring HybridVLA, the RoboMIND dataset, and TwinRL for efficient real-world RL.

Charbel Dandjinou
CTO & Founder, Reflexion Robotics
Founder of Reflexion Robotics (Paris), building a safety runtime for embodied AI. Previously AI Research Engineer at EXXACT Robotics. Founded IROKO and the Deep Takka research lab.
Talk: Skill Foundry: An AI That Embodies a Robot, Watches Itself, and Learns
An AI embodies a simulated robot, observes itself through robot POV and scene cameras, then iterates on reward, strategy, and training parameters. Running on a Unitree G1 in NVIDIA IsaacLab.

Sébastien Crozet
Founder, Dimforge
Creator of nalgebra and Rapier for the Rust ecosystem. Focused on cross-platform scientific computing and AI at Dimforge.
Talk: Project Nexus: Cross-Platform GPU-Based Rust Multiphysics Simulator for Robotics and Training Datasets Generation
A Rust alternative to NVIDIA Warp and Genesis, compiled with rust-gpu and running on all GPU platforms, including the web. Featuring Khal, Vortx, glamx, and a live demo.

Jeremy Laville
R&D Engineer, Pollen Robotics
Creator of the Amazing Hand, an open-source 3D-printed robotic hand with 8 DOF. Previously designed hands and forearms for the Pepper robot at Aldebaran Robotics.
Talk: Amazing Hand Tech Report

Vector Wang
PhD Student, Rice University
Creator of XLeRobot, a $660 open-source dual-arm mobile robot with 4.8k+ GitHub stars and 6,000+ builders worldwide. Built on the Hugging Face LeRobot ecosystem.
Talk: XLeRobot: Practical Dual-Arm Mobile Home Robot
How a PhD side project became a globally adopted platform: design philosophy, community growth with zero promotion, and the evolving landscape from DORA middleware to NVIDIA's open Physical AI stack integrating into LeRobot.

Call for Proposals

The call for proposals is now closed. Thank you to everyone who submitted!

View original call for proposals

The workshop spans two days, with approximately 20 talks (10-20 minutes each, plus 5-10 minutes of Q&A) and dedicated demo sessions. We invite submissions on topics including, but not limited to, the following:

Open Source Robotics Hardware

  • Open source actuators, sensors, and end-effectors
  • Open source robotic arms, hands, and grippers
  • Open source legged and wheeled platforms
  • Open source humanoid robot designs
  • Modular and reconfigurable hardware architectures

Open Source Learning & Control

  • Open source imitation learning and learning from demonstrations
  • Open source reinforcement learning for manipulation and locomotion
  • Open source navigation and SLAM
  • Open source world models and foundation models for robotics
  • Open source motion planning and control frameworks
  • Generalization to unseen objects, environments, and tasks

Open Source Infrastructure

  • Open source simulation environments and physics engines
  • Open source visualization and debugging tools
  • Open source robotics middleware and communication frameworks
  • Open source deployment and orchestration tools

Open Source Data & Benchmarks

  • Open source data collection pipelines and teleoperation systems
  • Open source robotics datasets and data formats
  • Open source benchmarks for manipulation, navigation, and locomotion
  • Internet-scale data for training robotics foundation models

Challenges & Limitations

  • Reproducibility and standardization in open source robotics
  • Sim-to-real transfer gaps and failure modes
  • Safety, robustness, and reliability of open source systems
  • Bootstrapping learning from scarce or noisy data

Important Dates

  • Submission Deadline: March 2026
  • Acceptance Notification: April 2026
  • Workshop: May 5-6, 2026 (STATION F, Central Room)

Venue

STATION F — Central Room, Paris

Hosted at STATION F, the world's largest startup campus, in the Central Room — an intimate 70-seat space perfect for talks, demos, and hands-on discussions.