Institut de Robòtica i Informàtica Industrial | TERRINet


C/ Llorens i Artigas 4-6,
08028 Barcelona, Spain


Scientific Responsible
Alberto Sanfeliu Cortés

The structure

The Institut de Robòtica i Informàtica Industrial (IRI) is a joint university research institute of the Spanish Council for Scientific Research (CSIC) and the Technical University of Catalonia (UPC) that conducts basic and applied research in human-centered robotics and automatic control.

The Barcelona Robot Lab

The Barcelona Robot Lab (BRL) is a scientific-technological area (http://www.iri.upc.edu/research/webprojects/barcelonarobotlab/) for outdoor and indoor experiments in mobile robotics and human-robot interaction and collaboration. It includes an outdoor pedestrian area of 10,000 m², equipped with high-resolution fixed cameras, Wi-Fi, 3G/4G and partial GPS coverage, and featuring buildings, open and covered areas, ramps and some vegetation. It also includes a 30 m² indoor area with an OptiTrack installation where aerial robotics experiments can also be carried out. Several public-space scenarios, such as markets, bars or shops, can be set up in this area to deploy robots in a real yet controlled urban environment for navigation and human-robot interaction and collaboration experiments. Example applications include vision-based surveillance, people tracking, person guiding, human-robot side-by-side walking, people searching, goods distribution by robots, and communication between deployed robots and devices.

In addition, the Barcelona Robot Lab offers a simulation software package with a 2D/3D scenario view, so that any experiment can be validated before it is run in the real environment, as well as technical support prior to the physical tests and real-life experiments.

Available platforms

Tibi and Dabo robots

Tibi and Dabo are two mobile urban service robots designed to perform navigation and human-robot interaction tasks.

Navigation is based on the differential-drive Segway RMP200 platform, which can operate in balancing mode, useful for overcoming low-slope ramps. Two horizontal 2D laser range sensors provide obstacle detection and localization.
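On a differential-drive base such as the RMP200, a body velocity command is realized by assigning different speeds to the two wheel sides. A minimal sketch of this mapping (the track width is an illustrative placeholder, not the RMP200's actual specification):

```python
def wheel_speeds(v, omega, track_width=0.54):
    """Map a body velocity command to left/right wheel speeds for a
    differential-drive base.

    v: forward velocity (m/s); omega: yaw rate (rad/s);
    track_width: wheel separation (m) -- illustrative value only.
    """
    v_left = v - omega * track_width / 2.0
    v_right = v + omega * track_width / 2.0
    return v_left, v_right
```

Driving straight (omega = 0) yields equal wheel speeds, while a pure rotation (v = 0) yields opposite ones, which is how such a platform turns in place.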

Human-robot interaction is achieved with two 2-degree-of-freedom (DOF) arms, a 3-DOF head capable of some facial expressions, a stereo camera, text-to-speech software and a touch screen.

They can be used to provide information, guiding and steward services to people in urban spaces, either individually or in collaboration.

Key features

  • Differential-drive Segway RMP200 mobile platform, maximum speed ~1 m/s
  • Battery with up to 3 h operation time and 8 h charge time
  • Two Hokuyo UTM-30LX laser scanners
  • Bumblebee2 stereo camera (mounted on the head)

Possible applications

  • 2D navigation in urban environments
  • Human-robot interaction
  • Multi-robot systems
  • Teleoperation

Additional information: http://wiki.iri.upc.edu/index.php/Tibi-Dabo


Teo robot

Teo is a robot aimed at 2D Simultaneous Localization and Mapping (SLAM) and 3D mapping. It is designed to work both indoors and in rugged outdoor areas.
To perform these tasks, the robot is built on a skid-steer Segway RMP400 platform and carries two horizontal 2D laser range sensors, an inertial measurement unit (IMU), a GNSS receiver and a custom-built 3D sensor based on a rotating vertical 2D laser range sensor.
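The principle behind such a rotating-laser 3D sensor is to tag each 2D range reading with the current rotation angle of the scan plane and project it into 3D. A minimal sketch, assuming the 2D laser scans a vertical plane that rotates about the vertical axis (the actual sensor geometry may differ):

```python
import math

def beam_to_point(r, alpha, theta):
    """Convert one range reading of a rotating vertical 2D laser
    to a 3D point in the sensor frame.

    r: measured range (m)
    alpha: beam angle within the vertical scan plane (rad)
    theta: rotation of the scan plane about the vertical axis (rad)

    The geometry assumed here is illustrative, not the sensor's
    actual calibration.
    """
    d = r * math.cos(alpha)  # horizontal distance within the scan plane
    z = r * math.sin(alpha)  # height above the sensor
    return (d * math.cos(theta), d * math.sin(theta), z)
```

Accumulating these points over a full rotation of the scan plane yields a 3D point cloud, which is the raw input for the 3D mapping task described above.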

Key features

  • Dimensions: 80 (W) x 140 (L) x 130 (H) cm
  • Skid-steer Segway RMP400 platform, maximum speed ~1 m/s
  • Battery with up to 3 h operation time and 8 h charge time
  • Two Hokuyo UXM-30LX laser scanners
  • One IMU sensor
  • One GNSS receiver
  • One custom-built 3D laser based on a Hokuyo UTM-30LX

Possible applications

  • Navigation
  • SLAM
  • 3D mapping
  • Multi-robot systems
  • Teleoperation

Additional information: http://wiki.iri.upc.edu/index.php/TEO


IRIcar robot

The autonomous car is based on a standard golf cart that has been robotized. It can carry up to two adult passengers on slopes of up to 30 degrees.
Using the information provided by a 32-beam 360° laser range sensor, a 360° camera and a frontal 2D range sensor, the autonomous car can navigate in 2D environments while respecting the Ackermann steering constraints.
The robot has two safety laser scanners that allow it to navigate safely around people. The vehicle is not road-certified and cannot travel in regular traffic.
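Navigating under Ackermann constraints means the vehicle cannot turn in place: a commanded yaw rate must be converted into a bounded front-wheel steering angle via the wheelbase. A minimal bicycle-model sketch (the wheelbase and steering limit are illustrative values, not the IRIcar's actual parameters):

```python
import math

def steering_angle(v, omega, wheelbase=1.65, max_steer=math.radians(35)):
    """Steering angle for an Ackermann vehicle following a commanded
    body velocity (v in m/s, omega in rad/s), using the bicycle model
    delta = atan(L * omega / v).

    wheelbase and max_steer are illustrative placeholders.
    """
    if abs(v) < 1e-6:
        return 0.0  # an Ackermann vehicle cannot rotate without moving
    delta = math.atan(wheelbase * omega / v)
    # The mechanical steering limit bounds the achievable curvature.
    return max(-max_steer, min(max_steer, delta))
```

This constraint is why a path planner for such a vehicle must reject turns tighter than the minimum turning radius implied by the wheelbase and steering limit.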

Key features

  • One frontal and one rear Leuze RS4 safety laser with a range of 4 m
  • One frontal Hokuyo UXM-30LX laser with a range of 30 m
  • One Velodyne HDL-32 on top of the car with a range of 80 m
  • One Ladybug 360° camera on top of the car (5 individual cameras)

Possible applications

  • 2D navigation with Ackermann constraints
  • Autonomous people transportation system
  • Human-robot interaction
  • Teleoperation

Ana and Helena Pioneer robots

Ana and Helena are mobile urban service robots designed to perform navigation, human-robot interaction and package delivery tasks.
Navigation is based on the skid-steer Pioneer 3AT platform, with a 3D lidar and a stereo camera for obstacle detection.
Human-robot interaction is based on a pan-and-tilt camera, status feedback lights, text-to-speech software, a microphone and a touch screen.
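A common first step in lidar-based obstacle detection, as used for navigation on platforms like these, is to keep only the points that lie in a height band above the ground and within planning range. A minimal sketch with illustrative thresholds (not the robots' actual configuration):

```python
def obstacle_points(points, z_min=0.1, z_max=1.8, max_range=10.0):
    """Filter a lidar point cloud (robot-frame (x, y, z) tuples in
    meters) down to candidate obstacle points: above the ground plane,
    below overhanging structure, and inside the planning range.

    All thresholds are illustrative placeholders.
    """
    out = []
    for x, y, z in points:
        if z_min <= z <= z_max and (x * x + y * y) ** 0.5 <= max_range:
            out.append((x, y, z))
    return out
```

The surviving points would then be projected onto a 2D costmap or clustered into obstacles for the navigation stack.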

Key features

  • Skid-steer Pioneer 3AT mobile platform
  • One Velodyne Puck VLP-16 lidar
  • One ZED stereo camera
  • One IMU sensor
  • One GNSS receiver

Possible applications

  • 2D/3D navigation in urban environments
  • Human-robot interaction
  • Multi-robot systems
  • Teleoperation