Institut de Robòtica i Informàtica Industrial | TERRINet


Address:
C/ Llorens i Artigas 4-6,
08028 Barcelona, Spain

Website: IRI

Scientific Lead: Alberto Sanfeliu

The structure

The Institut de Robòtica i Informàtica Industrial (IRI) is a joint university research institute run by the Spanish National Research Council (CSIC) and the Technical University of Catalonia (UPC) that conducts basic and applied research in human-centered robotics and automatic control.

IRI offers access to the scientific and technological facilities of the Barcelona Robot Lab (BRL) (http://www.iri.upc.edu/research/webprojects/barcelonarobotlab/) for outdoor and indoor experiments in mobile robotics and in human-robot interaction and collaboration. The BRL includes an outdoor pedestrian area of 10,000 m² on the UPC North Campus and an indoor area of 80 m² equipped with an Optitrack installation. The BRL makes it possible to experiment in public spaces, deploying robots in a real yet controlled urban scenario to perform navigation and human-robot interaction and collaboration experiments with end users and participants. The Barcelona Robot Lab also offers a simulation software package with a 2D/3D view of the scenario, so that any experiment can be validated before real-life trials, as well as technical support throughout the experimentation process.

Available platforms
https://www.terrinet.eu/wp-content/uploads/2019/02/barcelona-robot-lab-1.jpg
https://www.terrinet.eu/wp-content/uploads/2019/02/barcelona-robot-lab-2.jpg

Barcelona Robot Lab

Outdoor pedestrian area of 10,000 m² on the UPC North Campus, provided with fixed cameras, Wi-Fi, 3G/4G and partial GPS coverage, and featuring buildings, open and covered areas, ramps and some vegetation. Several public-space scenarios, such as markets, bars or shops, can be set up in this area, deploying robots in a real yet controlled urban scenario to perform navigation and human-robot interaction and collaboration experiments for multiple applications.

Key features

  • 21 fixed cameras
  • Wi-Fi coverage
  • 3G/4G coverage
  • Partial GNSS coverage
  • ROS integration (see the sketch below)
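
As a rough illustration of what the ROS integration looks like in practice, the sketch below subscribes to the image stream of one of the fixed cameras and logs each incoming frame. The topic name is an assumption made for illustration only; the actual names depend on how the cameras are exposed in the BRL deployment.

```python
#!/usr/bin/env python
# Minimal monitoring node for a BRL experiment: counts frames arriving
# from one of the fixed cameras. The topic name below is a hypothetical
# placeholder, not a confirmed name in the BRL deployment.
import rospy
from sensor_msgs.msg import Image

CAMERA_TOPIC = "/brl/camera_01/image_raw"  # hypothetical topic name

def on_image(msg):
    # Log resolution and timestamp of every received frame.
    rospy.loginfo("frame %dx%d received at t=%.3f",
                  msg.width, msg.height, msg.header.stamp.to_sec())

if __name__ == "__main__":
    rospy.init_node("brl_camera_monitor")
    rospy.Subscriber(CAMERA_TOPIC, Image, on_image, queue_size=1)
    rospy.spin()
```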

Possible applications

  • Visual surveillance
  • Tracking of objects and people
  • Human-robot interaction and collaboration
  • People search
  • Goods distribution
  • Communication between deployed robots and devices
https://www.terrinet.eu/wp-content/uploads/2019/02/optitrack.jpg

Optitrack indoor testbed

Indoor testbed built around a positioning system of 20 Optitrack Flex13 infrared cameras. The system computes the position and orientation of moving objects, previously tagged with special markers, within the testbed volume (11 x 7 x 2.5 m) in real time, with an update rate of up to 120 Hz.
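
A minimal sketch of how the 120 Hz pose stream could be recorded as ground truth for a dataset, assuming the Optitrack data is bridged into ROS as geometry_msgs/PoseStamped messages (for instance through a mocap_optitrack or VRPN client); the topic name is hypothetical.

```python
#!/usr/bin/env python
# Records the pose stream of one tracked rigid body to a CSV file as
# ground truth. Assumes the Optitrack data is bridged into ROS as
# geometry_msgs/PoseStamped (e.g. via mocap_optitrack or a VRPN client);
# the topic name is a hypothetical placeholder.
import csv
import rospy
from geometry_msgs.msg import PoseStamped

POSE_TOPIC = "/optitrack/robot_body/pose"  # hypothetical topic name

def main():
    rospy.init_node("optitrack_ground_truth_logger")
    with open("ground_truth.csv", "w") as f:
        writer = csv.writer(f)
        writer.writerow(["stamp", "x", "y", "z", "qx", "qy", "qz", "qw"])

        def on_pose(msg):
            p, q = msg.pose.position, msg.pose.orientation
            writer.writerow([msg.header.stamp.to_sec(),
                             p.x, p.y, p.z, q.x, q.y, q.z, q.w])

        rospy.Subscriber(POSE_TOPIC, PoseStamped, on_pose, queue_size=100)
        rospy.spin()  # keep the file open while messages arrive

if __name__ == "__main__":
    main()
```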

Key features

  • Cameras: 20 Optitrack Flex13
  • Size: 11 x 7 x 2.5 m
  • Update rate: up to 120 Hz
  • ROS integration

Possible applications

  • Human-robot collaboration experiments
  • Indoor flying robot experiments
  • Ground truth position recording for datasets
https://www.terrinet.eu/wp-content/uploads/2018/12/tibianddabo.png

Tibi and Dabo robots

Tibi and Dabo are two mobile urban service robots designed to perform navigation and human-robot interaction tasks.

Navigation is based on the differential-drive Segway RMP200 platform, which can operate in balancing mode, useful for climbing low-slope ramps. Two horizontal 2D laser range sensors provide obstacle detection and localization.

Human-robot interaction is achieved with two 2-degree-of-freedom (DOF) arms, a 3-DOF head with facial expressions, a stereo camera, text-to-speech software and a touch screen.

They can provide information, guiding and steward services to people in urban spaces, either individually or in collaboration with each other.
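
As an illustration of how the laser data and motion platform are typically accessed through ROS, the sketch below stops the platform whenever the front laser reports a nearby obstacle. The /scan and /cmd_vel topic names are conventional ROS defaults assumed here for illustration, not confirmed interfaces of Tibi and Dabo.

```python
#!/usr/bin/env python
# Simple guard behaviour: stop the platform when the horizontal laser
# reports an obstacle closer than a threshold, otherwise creep forward.
# The /scan and /cmd_vel topic names are conventional ROS defaults and
# are assumptions here, not the confirmed interfaces of Tibi and Dabo.
import math
import rospy
from sensor_msgs.msg import LaserScan
from geometry_msgs.msg import Twist

STOP_DISTANCE = 0.8  # metres

class ObstacleGuard(object):
    def __init__(self):
        self.cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
        rospy.Subscriber("/scan", LaserScan, self.on_scan, queue_size=1)

    def on_scan(self, scan):
        # Discard invalid returns (inf/NaN), then take the closest range.
        valid = [r for r in scan.ranges
                 if not math.isinf(r) and not math.isnan(r)]
        closest = min(valid) if valid else float("inf")
        cmd = Twist()
        cmd.linear.x = 0.0 if closest < STOP_DISTANCE else 0.3
        self.cmd_pub.publish(cmd)

if __name__ == "__main__":
    rospy.init_node("obstacle_guard")
    ObstacleGuard()
    rospy.spin()
```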

Technical Specifications

  • Weight: about 100 kg
  • Dimensions: 60 (W) x 60 (L) x 160 (H) cm
  • Battery with up to 3 h operation time and 8 h charge time
  • Differential-drive mobile platform Segway RMP200, maximum speed ~1 m/s
  • Two Hokuyo UTM-30LX laser scanners
  • Two arms with 2 DOF each, and one head with 3 DOF
  • LED facial expressions (mouth, eyebrows and cheeks)
  • Bumblebee2 stereo camera (placed on the head)
  • Loquendo text-to-speech software with English, Spanish and Catalan languages
  • Touch screen
  • Onboard router for an internal network with Wi-Fi and 3G connectivity
  • Remote and onboard emergency stop buttons
  • Two industrial onboard computers and an external laptop for monitoring
  • ROS-enabled robot

Key features

  • Differential-drive mobile platform Segway RMP200, maximum speed ~1 m/s
  • Battery with up to 3 h operation time and 8 h charge time
  • Two Hokuyo UTM-30LX laser scanners
  • Bumblebee2 stereo camera (placed on the head)

Possible applications

  • 2D navigation in urban environments
  • Human-robot interaction
  • Multi-robot systems
  • Teleoperation

Additional information: http://wiki.iri.upc.edu/index.php/Tibi-Dabo

https://www.terrinet.eu/wp-content/uploads/2018/05/teo.png

Teo robot

Teo is a robot designed to perform 2D Simultaneous Localization and Mapping (SLAM) and 3D mapping, both indoors and in rugged outdoor areas.
To perform these tasks, the robot is built on a skid-steer Segway RMP400 platform and carries two horizontal 2D laser range sensors, an inertial measurement unit (IMU), a GNSS receiver and a custom-built 3D sensor based on a rotating vertical 2D laser range sensor.
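
As a sketch of how an experiment could read back the robot pose estimated by a running 2D SLAM node through ROS tf, assuming the conventional map and base_link frame names (assumptions for illustration, not confirmed frame names on Teo):

```python
#!/usr/bin/env python
# Periodically looks up the robot pose estimated by a running 2D SLAM node
# via tf. The "map" and "base_link" frame names are the usual ROS
# conventions and are assumptions here, not confirmed frame names on Teo.
import rospy
import tf

if __name__ == "__main__":
    rospy.init_node("slam_pose_reader")
    listener = tf.TransformListener()
    rate = rospy.Rate(1.0)  # query once per second
    while not rospy.is_shutdown():
        try:
            trans, rot = listener.lookupTransform("map", "base_link",
                                                  rospy.Time(0))
            rospy.loginfo("estimated pose: x=%.2f m, y=%.2f m",
                          trans[0], trans[1])
        except (tf.LookupException, tf.ConnectivityException,
                tf.ExtrapolationException):
            rospy.logwarn("map -> base_link transform not available yet")
        rate.sleep()
```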

Technical Specifications

  • Weight: about 110 kg
  • Dimensions: 80 (W) x 140 (L) x 130 (H) cm
  • Battery with up to 3 h operation time and 8 h charge time
  • Skid-steer Segway RMP400 platform, maximum speed ~1 m/s
  • Two Hokuyo UXM-30LX laser scanners
  • One IMU sensor
  • One GNSS receiver
  • One custom-built 3D laser based on a Hokuyo UTM-30LX
  • Onboard router for an internal network with Wi-Fi and 3G connectivity
  • Remote and onboard emergency stop buttons
  • Two industrial onboard computers and an external laptop for monitoring
  • ROS-enabled robot

Key features

  • Dimensions: 80 (W) x 140 (L) x 130 (H) cm
  • Skid-steer Segway RMP400 platform, maximum speed ~1 m/s
  • Battery with up to 3 h operation time and 8 h charge time
  • Two Hokuyo UXM-30LX laser scanners
  • One IMU sensor
  • One GNSS receiver
  • One custom-built 3D laser based on a Hokuyo UTM-30LX

Possible applications

  • Navigation
  • SLAM
  • 3D mapping
  • Multi-robot systems
  • Teleoperation

Additional information: http://wiki.iri.upc.edu/index.php/TEO

https://www.terrinet.eu/wp-content/uploads/2018/05/tibidabo.png

IRIcar robot

IRIcar is an autonomous car based on a standard golf cart that has been robotized. The robot can carry up to two adult passengers on slopes of up to 30 degrees.
Using the information provided by a 32-beam 360° laser range sensor, a 360° camera and a frontal 2D range sensor, the autonomous car can navigate in 2D environments while respecting its Ackermann steering constraints.
The robot also has two safety laser scanners that allow it to navigate safely around people. The vehicle is not road-homologated and cannot travel in regular traffic.
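
To make the Ackermann constraint concrete: in a bicycle-model approximation with wheelbase L, following a path of curvature κ requires a steering angle δ = atan(L·κ), and the steering limit bounds the minimum turning radius. The sketch below computes both; the wheelbase and steering-limit values are illustrative assumptions, not measured IRIcar parameters.

```python
#!/usr/bin/env python
# Bicycle-model view of the Ackermann steering constraint: the steering
# angle needed to follow a given path curvature, and the minimum turning
# radius implied by a steering limit. Numeric values are illustrative
# assumptions, not measured IRIcar parameters.
import math

WHEELBASE = 1.65                    # metres (assumed)
MAX_STEERING = math.radians(35.0)   # steering limit (assumed)

def steering_for_curvature(curvature):
    """Steering angle (rad) needed to follow a path of curvature (1/m)."""
    return math.atan(WHEELBASE * curvature)

def min_turning_radius():
    """Smallest turning radius (m) reachable with the steering limit."""
    return WHEELBASE / math.tan(MAX_STEERING)

if __name__ == "__main__":
    for radius in (3.0, 5.0, 10.0):
        delta = steering_for_curvature(1.0 / radius)
        print("radius %.1f m -> steering angle %.1f deg"
              % (radius, math.degrees(delta)))
    print("minimum turning radius: %.2f m" % min_turning_radius())
```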

Technical Specifications

  • Weight: about 250 kg
  • Maximum payload: 180 kg
  • Dimensions: 120 (W) x 250 (L) x 175 (H) cm
  • High-capacity battery for an extended operation time of up to 8 h
  • One frontal and one rear Leuze RS4 safety laser with a range of 4 m
  • One frontal Hokuyo UXM-30LX laser with a range of 30 m
  • One Velodyne HDL-32 on top of the car with a range of 80 m
  • One Ladybug 360° camera on top of the car (5 individual cameras)
  • Loquendo text-to-speech software with English, Spanish and Catalan languages
  • Touch screen
  • Onboard router for an internal network with Wi-Fi and 3G connectivity
  • Remote and onboard emergency stop buttons
  • Three industrial onboard computers and an external laptop for monitoring
  • ROS-enabled robot

Key features

  • One frontal and one rear Leuze RS4 safety laser with a range of 4 m
  • One frontal Hokuyo UXM-30LX laser with a range of 30 m
  • One Velodyne HDL-32 on top of the car with a range of 80 m
  • One Ladybug 360° camera on top of the car (5 individual cameras)

Possible applications

  • 2D navigation with Ackermann constraints
  • Autonomous passenger transportation
  • Human-robot interaction
  • Teleoperation
https://www.terrinet.eu/wp-content/uploads/2018/12/ana_and_helena_robot.jpg

Ana and Helena Pioneer robots

Ana and Helena are two mobile urban service robots designed to perform navigation, human-robot interaction and package delivery tasks.
Navigation is based on the skid-steer Pioneer 3AT platform, with a 3D lidar and a stereo camera for obstacle detection.
Human-robot interaction is based on a pan-tilt camera, status feedback lights, text-to-speech software, a microphone and a touch screen.
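
A minimal sketch of how the 3D lidar data could be used for a crude obstacle check through ROS, counting the points that fall inside a box in front of the robot. The /velodyne_points topic name is the Velodyne driver's conventional default and is assumed here, not a confirmed name on Ana and Helena; coordinates are taken in the sensor frame.

```python
#!/usr/bin/env python
# Counts VLP-16 points falling inside a box in front of the robot as a
# crude obstacle check. The topic name is the Velodyne driver's usual
# default and is an assumption here; coordinates are in the sensor frame.
import rospy
from sensor_msgs.msg import PointCloud2
import sensor_msgs.point_cloud2 as pc2

CLOUD_TOPIC = "/velodyne_points"  # conventional driver default (assumed)

def on_cloud(cloud):
    hits = 0
    for x, y, z in pc2.read_points(cloud, field_names=("x", "y", "z"),
                                   skip_nans=True):
        # Box roughly 2 m ahead, 1 m to each side, up to 1.5 m high.
        if 0.2 < x < 2.0 and abs(y) < 1.0 and 0.0 < z < 1.5:
            hits += 1
    if hits > 50:
        rospy.logwarn("possible obstacle ahead: %d points in the box", hits)

if __name__ == "__main__":
    rospy.init_node("front_box_monitor")
    rospy.Subscriber(CLOUD_TOPIC, PointCloud2, on_cloud, queue_size=1)
    rospy.spin()
```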

Technical Specifications

  • Weight: about 30 kg
  • Dimensions: 50 (W) x 65 (L) x 100 (H) cm
  • Battery with up to 5 h operation time and 10 h charge time
  • Skid-steer mobile platform Pioneer 3AT
  • One Velodyne Puck VLP-16 lidar
  • One ZED stereo camera
  • One IMU sensor
  • One GNSS receiver
  • Loquendo text-to-speech software with English, Spanish and Catalan languages
  • Touch screen
  • Onboard router for an internal network with Wi-Fi and 3G connectivity
  • One onboard computer and an external laptop for monitoring
  • ROS-enabled robot

Key features

  • Skid-steer mobile platform Pioneer 3AT
  • One Velodyne Puck VLP-16 lidar
  • One ZED stereo camera
  • One IMU sensor
  • One GNSS receiver

Possible applications

  • 2D/3D navigation in urban environments
  • Human-robot interaction
  • Multi-robot systems
  • Teleoperation

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 730994
