DIVERSITY IN ALL PARTS

Discover all the Robotic Platforms

Browse all the Robotic Platforms of our Infrastructures here!
Featured platforms
https://www.terrinet.eu/wp-content/uploads/2018/06/8x8-AGV.jpg

Universidad de Sevilla (USE) Robotics, Vision and Control Group 8×8 AGV

A self-designed 8×8 AGV. Each wheel has independent traction and steering, which makes the vehicle capable of moving in almost any environment. It is controlled by a Pixhawk (PX4) autopilot. Its design is intended to hold a variety of payloads, for example a lidar or a TeraBee laser sensor; up to 6 kg can be carried.
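The independent per-wheel traction and steering described above lend themselves to swerve-style kinematics, where each wheel's steering angle and speed follow from a commanded body twist. A minimal sketch, assuming a purely illustrative wheel layout (the real AGV's geometry and PX4 control stack are not shown here):

```python
import math

# Hypothetical layout for an 8-wheel vehicle: 4 axles, 2 wheels per axle.
# Positions (x, y) in metres relative to the vehicle centre -- illustrative
# values only, not the real AGV's dimensions.
WHEELS = [(x, y) for x in (1.5, 0.5, -0.5, -1.5) for y in (0.6, -0.6)]

def wheel_commands(vx, vy, omega):
    """For a body twist (vx, vy [m/s], omega [rad/s]), return per-wheel
    (steering angle [rad], speed [m/s]) for independently steered wheels."""
    cmds = []
    for (x, y) in WHEELS:
        # Velocity of the wheel contact point: v_body + omega x r
        wx = vx - omega * y
        wy = vy + omega * x
        cmds.append((math.atan2(wy, wx), math.hypot(wx, wy)))
    return cmds

# Pure forward motion: every wheel points straight ahead at the same speed.
cmds = wheel_commands(1.0, 0.0, 0.0)
```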

https://www.terrinet.eu/wp-content/uploads/2018/08/armar6_squared.png

Karlsruhe Institute of Technology (KIT) High Performance Humanoid Technologies Lab (H²T) ARMAR 6

ARMAR-6 is a collaborative humanoid robot assistant for industrial environments. Designed to recognize the need for help and to allow easy and safe human-robot interaction, the robot’s comprehensive sensor setup includes various camera systems, torque sensors and systems for speech recognition. The dual-arm system combines human-like kinematics with a payload of 10 kg, which allows for dexterous and high-performance dual-arm manipulation. In combination with its telescopic torso joint and a pair of underactuated five-finger hands, ARMAR-6 is able to grasp objects on the floor as well as to work at heights of up to 240 cm. The mobile platform includes holonomic wheels, battery packs and four high-end PCs for autonomous on-board data processing. The software architecture is implemented in ArmarX (https://armarx.humanoids.kit.edu). High-level functionalities, such as object localization, navigation, grasping and planning, are already implemented and available.

https://www.terrinet.eu/wp-content/uploads/2018/08/2.jpeg

University of the West of England (UWE Bristol) Bristol Robotics Laboratory BRL – Bert

BERT-1 (Bristol Elumotion Robotic Torso) is a self-contained, fully automated robotic torso, designed and built by Elumotion to provide a robotic platform based on human anatomy at adult human scale. BERT-1 is a highly articulated manipulation platform and includes novel dexterous hands that allow emulation of human gesturing. This anthropomorphic robot is designed to resemble the human upper body and arms. Its degrees of freedom are approximately equal to those of the human upper limbs: two arms with 7 DOF each, plus a 2 DOF shoulder and a 2 DOF head.

https://www.terrinet.eu/wp-content/uploads/2018/06/sensefly-ebee.jpg

École Polytechnique Fédérale de Lausanne (EPFL) Laboratory of Intelligent Systems eBee drone

senseFly’s eBee is a fully autonomous and easy-to-use mapping drone. Use it to capture high-resolution aerial photos that you can transform into accurate orthomosaics (maps) and 3D models. The eBee package contains everything you need to start mapping: an RGB camera, batteries, a radio modem and the eMotion software.

https://www.terrinet.eu/wp-content/uploads/2018/05/image8.jpeg

Commissariat à l’Energie Atomique (CEA) Interactive Robotics Lab Haption ABLE

The Able is an exoskeleton haptic interface for the arm and the hand. It provides from 4 to 7 active degrees of freedom. Able is the first industrialized exoskeleton haptic device: it can be ordered off-the-shelf. The Able is available in right-handed and left-handed configurations.

https://www.terrinet.eu/wp-content/uploads/2018/05/image21.jpg

School of Advanced Studies Sant'Anna (SSSA) The BioRobotics Institute I-Support soft arm

The I-Support platform consists of a manipulator made of two identical interconnected modules. Each module, 60 mm in diameter and 200 mm in length, comprises: three pairs of McKibben-based bellows-like flexible fluidic actuators (FFAs) arranged at 60° with respect to three cables; an internal channel designed to supply water/soap, which can also be used to pass tools up to the tip; and a layered structure along the module to direct the actuation energy in the right direction. Each module is capable of elongation, contraction, omnidirectional bending, and variable stiffness. Furthermore, because the manipulator is modular and uses Bowden-cable technology, it has multiple advantages: (i) fast replacement, (ii) the possibility of using one or two modules, (iii) local control of a single segment, and (iv) easy integration of sensors. The manipulator was designed to assist elderly people during showering, but it can also be used as a multipurpose platform for implementing and testing control algorithms for soft manipulators.
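Soft continuum modules like these are often modelled under a piecewise-constant-curvature assumption, where a module's tip position follows from its curvature and bending-plane angle. A minimal sketch of that standard model for a single 200 mm module (an illustrative textbook approximation, not the I-Support control software):

```python
import math

MODULE_LENGTH = 0.2  # 200 mm, from the module description above

def tip_position(kappa, phi, length=MODULE_LENGTH):
    """Tip position (x, y, z) of one module under the constant-curvature
    assumption. kappa: curvature [1/m]; phi: bending-plane angle [rad]."""
    if abs(kappa) < 1e-9:
        return (0.0, 0.0, length)          # straight configuration
    r = (1.0 - math.cos(kappa * length)) / kappa
    return (r * math.cos(phi), r * math.sin(phi),
            math.sin(kappa * length) / kappa)
```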

https://www.terrinet.eu/wp-content/uploads/2018/11/hrp_squared.png

Centre national de la recherche scientifique (CNRS) The Department of Robotics of LAAS Kawada Robotics HRP-2

Two human-size humanoid robots in a fully equipped experimental room. LAAS has long-standing experience in humanoid robot motion planning and control. After having demonstrated whole-body motion generation capabilities on HRP-2, LAAS is now developing new algorithms to enable physical interaction of humanoid robots with their environment and with humans. The new robot Pyrène, built by PAL Robotics based on the experience of LAAS, is powerful and designed to be torque controlled.

https://www.terrinet.eu/wp-content/uploads/2018/05/Microsurgical-Robot_TUM.jpg

Technical University Munich (TUM) Robotics and Embedded Systems Microsurgical Robot

The microsurgical platform is a setup originally designed and developed for ophthalmic applications, but it can also be used for other medical and biological procedures, as well as a training platform for micro-robotic applications. Multidisciplinary teams of researchers, including engineers, biologists and clinicians, can perform micromanipulation tasks using the platform. The integration of imaging devices and microscopes into the setup enables interdisciplinary projects such as image-guided micromanipulation.

https://www.terrinet.eu/wp-content/uploads/2018/07/RavenII-hamlyn.jpeg

Imperial College London (IMPERIAL) The Hamlyn Centre RAVEN II Surgical Robot

The RAVEN is a proven, third generation surgical robotics testbed that provides the nucleus of an open innovation community. This community is united in the application and support of a common platform, and each member has the power to pursue and develop their own intellectual property. Improvements made by the community to the core robotic manipulator will be made available to the entire group. Individual members then have the opportunity to pursue their own proprietary advances in procedures, attached instruments, supervisory software, and human/machine interfaces.

https://www.terrinet.eu/wp-content/uploads/2018/06/coming-out-of-the-box.jpg

École Polytechnique Fédérale de Lausanne (EPFL) BioRobotics Lab Roombots

Modular robotics for adaptive and self-organizing furniture that moves, self-assembles, and self-reconfigures. Our dream is to provide multi-functional modules that are merged with the furniture and that lay users and engineers can combine for multiple applications.

https://www.terrinet.eu/wp-content/uploads/2019/02/Space53-small-320x177.jpg

University of Twente (UT) Department of Robotics Space53

Space53 builds the tools and facilities to evolve unmanned systems technology into concepts that create societal and economic impact, based on technology readiness, economic readiness (economic or societal business case) and societal readiness (ethics, legal, society). Space53 offers testing and training facilities and the proper legal and procedural conditions for the safe and legal testing of unmanned aerial systems in a fully structured environment (lab), a fully unstructured environment (real life) and all gradual steps in between. Space53 has access to large enclosed structures, a 3 km runway with airspace, and several experimental zones in industrial and urban environments. Space53 collaborates closely with S/park, the open innovation centre located at Nouryon's former production site in Deventer, the Netherlands. At the S/park site there are many interesting assets (tanks, installations, factories, etc.) that can be used for validation and demonstration of new technologies such as robots/drones for inspection, safety solutions, virtual reality, etc.

https://www.terrinet.eu/wp-content/uploads/2018/11/icub_squared.png

Instituto Italiano di Tecnologia (IIT) iCub Facility The iCub robot

The iCub is a humanoid robot designed to support research in embodied AI. At 104 cm tall, the iCub has the size of a five-year-old child. It can crawl on all fours, walk and sit up to manipulate objects. Its hands have been designed to support sophisticated manipulation skills. The iCub is distributed as open source under the GPL license. The entire design is available for download from the project’s repositories (http://www.iCub.org). Four robots are available in the iCub Facility at the Istituto Italiano di Tecnologia. The iCub is one of the few platforms in the world with a sensitive full-body skin to handle physical interaction with the environment, including with people.

https://www.terrinet.eu/wp-content/uploads/2018/12/tibianddabo.png

Universitat Politècnica de Catalunya (UPC) IRI Tibi and Dabo robots

Tibi and Dabo are two mobile urban service robots aimed at performing navigation and human-robot interaction tasks. Navigation is based on the differential-drive Segway RMP200 platform, able to work in balancing mode, which is useful for overcoming low-slope ramps. Two 2D horizontal laser range sensors allow obstacle detection and localization. Human-robot interaction is achieved with two 2-degree-of-freedom (DOF) arms, a 3 DOF head with some facial expressions, a stereo camera, text-to-speech software and a touch screen. They can be used to provide information, guiding and steward services to people in urban spaces, either alone or in collaboration.
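The differential-drive base mentioned above turns by driving its two wheels at different speeds. A minimal sketch of the standard differential-drive kinematics (the track width is an assumed placeholder, not the RMP200's real dimension):

```python
def diff_drive_twist(v_left, v_right, track_width=0.5):
    """Body twist (forward speed [m/s], yaw rate [rad/s]) of a
    differential-drive base from its two wheel speeds [m/s]."""
    v = (v_left + v_right) / 2.0              # average: forward motion
    omega = (v_right - v_left) / track_width  # difference: rotation
    return v, omega
```

Equal wheel speeds give pure translation; opposite speeds give a spin in place, which is how such a base turns on the spot.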

Platform search

Welcome to the TERRINet Platform search tool. You can use this search tool to search for specific platforms, infrastructures or keywords. The search input field also accepts regular expressions (e.g. categoryA|categoryB). For a full list of all available keywords please visit:

Note that the search requires JavaScript. If you cannot see the search input field please make sure that you have JavaScript enabled.
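The alternation syntax mentioned above (`categoryA|categoryB`) matches entries containing either term. A small illustration using Python's `re` module (the actual search tool runs in JavaScript; the platform list below is just a stand-in for the searched index):

```python
import re

# A few platform names from this page, standing in for the searched index.
platforms = ["8x8 AGV", "ARMAR 6", "eBee drone",
             "RAVEN II Surgical Robot", "Roombots"]

pattern = re.compile(r"ARMAR|eBee")  # alternation, as in categoryA|categoryB
matches = [name for name in platforms if pattern.search(name)]
# matches -> ["ARMAR 6", "eBee drone"]
```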

Quicksearch:

8×8 AGV

Universidad de Sevilla (USE) Robotics, Vision and Control Group 8×8 AGV

A self-designed 8×8 AGV. Each wheel has independent traction and steering, which makes the vehicle capable of moving in almost any environment. It is controlled by a Pixhawk (PX4) autopilot. Its design is intended to hold a variety of payloads, for example a lidar or a TeraBee laser sensor; up to 6 kg can be carried.
ABB IRB 120

University of the West of England (UWE Bristol) Bristol Robotics Laboratory ABB IRB 120

Flexible 6-axis industrial robot, with a payload of 3 kg, designed specifically for manufacturing industries that use robot-based automation. 3 robots available, with the compact IRC5 controller, RobotWare and RobotStudio. A 3D camera is available for bin picking and part location. Flexible and fast; a cage or additional safety systems are required. High repeatability and speed, medium-low load capacity. 16/16 I/Os with a 24 V, 1 A power supply, and 5 MPa pneumatic air supply.
ABB IRB 14000 YuMi

University of the West of England (UWE Bristol) Robotics Innovation Facility ABB IRB 14000 YuMi

Collaborative, dual-arm robot. The robot includes integrated collision detection, lead-through mode, force-sensing parallel grippers, integrated camera-based part location and synchronized arm motion control. Its redundant arms allow easy repositioning, alternative configurations and object avoidance. Flexible and safe to work with, without the need for a cage or additional safety systems. High repeatability and speed, limited load capacity. The tool flange provides a 24 V, 1 A power supply, with an Ethernet communication protocol. Alternative solutions (serial or custom) are available for different end-effectors.
Aerial Robots in a flight arena

Centre national de la recherche scientifique (CNRS) The Department of Robotics of LAAS Aerial Robots in a flight arena

Several models of flying robots, such as quadrotor and hexarotor aerial robots, in a delimited flight arena of 6 m × 4 m × 5 m (L × W × H) enclosed by a protective net. The ground is covered by protective mattresses. The arena is equipped with a motion capture system.

AMUSE

Universidad de Sevilla (USE) Robotics, Vision and Control Group AMUSE

Self-designed octocopter controlled by a Pixhawk autopilot, with an Intel NUC (i5) for extra computational capabilities. It also has a Jetson TX1 GPU and a Velodyne 3D laser as extra payload. It is designed to accomplish different tasks using different types of sensors, such as stereo cameras, laser sensors, GPS, altimeters, etc.
Ana and Helena Pioneer robots

Universitat Politècnica de Catalunya (UPC) IRI Ana and Helena Pioneer robots

Mobile urban service robot aimed to perform navigation, human robot interaction and package delivery tasks. Navigation is based on the skid steer Pioneer 3AT platform, with a 3D lidar and stereo camera for obstacle detection. Human robot interaction is based on a pan and tilt camera, status feedback lights, text-to-speech software, a microphone and a touch screen.
AR drone

École Polytechnique Fédérale de Lausanne (EPFL) Laboratory of Intelligent Systems AR drone

The Parrot AR.Drone 2.0 Elite Edition allows you to see the world from above and to share your photos and videos on social networks instantly. It manoeuvres intuitively with a smartphone or tablet and offers exceptional sensations right from take-off. A soft protective frame allows use indoors and in crowded environments.
ARMAR 6

Karlsruhe Institute of Technology (KIT) High Performance Humanoid Technologies Lab (H²T) ARMAR 6

ARMAR-6 is a collaborative humanoid robot assistant for industrial environments. Designed to recognize the need for help and to allow easy and safe human-robot interaction, the robot’s comprehensive sensor setup includes various camera systems, torque sensors and systems for speech recognition. The dual-arm system combines human-like kinematics with a payload of 10 kg, which allows for dexterous and high-performance dual-arm manipulation. In combination with its telescopic torso joint and a pair of underactuated five-finger hands, ARMAR-6 is able to grasp objects on the floor as well as to work at heights of up to 240 cm. The mobile platform includes holonomic wheels, battery packs and four high-end PCs for autonomous on-board data processing. The software architecture is implemented in ArmarX (https://armarx.humanoids.kit.edu). High-level functionalities, such as object localization, navigation, grasping and planning, are already implemented and available.
ARMAR-4

Karlsruhe Institute of Technology (KIT) High Performance Humanoid Technologies Lab (H²T) ARMAR-4

ARMAR-4 is a full-body humanoid robot with torque control capabilities in its arm, leg and torso joints. It has 63 active degrees of freedom with 63 actuators overall, including feet, neck, hands and eyes. It features more than 200 individual sensors for position, temperature and torque measurement, 76 microcontrollers for low-level data processing and 3 on-board PCs for perception, high-level control and real-time functionalities. The robot stands 170 cm tall and weighs 70 kg. Each leg has six degrees of freedom, mimicking the flexibility and range of motion of the human leg. For maximum range of motion and dexterity, each arm has eight degrees of freedom. The kinematic similarity to the human body facilitates the mapping and execution of human motions on the robot. The four end-effectors (hands and feet) are equipped with sensitive 6D Force/Torque sensors to accurately capture physical interaction forces and moments between the robot and its environment. The robot’s two eyes are each equipped with two cameras for wide and narrow angle vision. The three control PCs (two in the torso, one in the head) run Ubuntu 14.04 and control the robot via the ArmarX software framework (https://armarx.humanoids.kit.edu), wherein high-level functionalities like object localization, grasping and planning are already implemented and available.
ARMAR-III in a robot kitchen

Karlsruhe Institute of Technology (KIT) High Performance Humanoid Technologies Lab (H²T) ARMAR-III in a robot kitchen

The humanoid robot ARMAR-III has been designed to help in the kitchen, e.g. bring objects from the fridge and fill the dishwasher. It has a total of 43 DoF. A mobile platform equipped with three laser scanners allows the robot to navigate the kitchen environment. The arms have 7 DoF each, with 8 DoF five-fingered hands, and a force-torque sensor is available in each wrist. ARMAR-III uses the Karlsruhe Humanoid Head with 7 DoF. For vision, four digital cameras are integrated into the head: each eye has a wide-angle and a narrow-angle camera for peripheral and foveal vision, respectively. There are four PCs inside the mobile base, which run Linux as their operating system. The robot software was originally written in MCA but can also be controlled via the newer ArmarX framework (https://armarx.humanoids.kit.edu). High-level functionalities, such as object localization, navigation, grasping and planning, are already implemented and available.
Assisted Living Studio

University of the West of England (UWE Bristol) Ambient Assisted Living Laboratory Assisted Living Studio

Anchor Robotics Personalised Assisted Living Studio is an in-house facility to develop, test and implement assistive robots and heterogeneous sensor systems in a realistic environment, bringing together our expertise in robotics, human-robot interaction, intelligent learning systems and person-centred design. This helps to ensure real-world applicability of our research and can help in reducing the time to get these innovative technologies to market.
ATLAS

Universidad de Sevilla (USE) Robotics, Vision and Control Group ATLAS

ATLAS (Air Traffic Laboratory for Advanced Unmanned Systems) is a test flight centre located in Villacarrillo (Jaén) which offers the international aerospace community an aerodrome equipped with excellent technological-scientific facilities and airspace ideally suited to the development of experimental flights with unmanned aerial vehicles (UAS/RPAS). The ATLAS Centre holds the first facilities in Spain exclusively dedicated to testing light and tactical Unmanned Aircraft Systems (UAS) or Remotely Piloted Aircraft Systems (RPAS).
Barcelona Robot Lab

Universitat Politècnica de Catalunya (UPC) IRI Barcelona Robot Lab

Outdoor pedestrian area of 10,000 sqm on the UPC Nord campus, provided with fixed cameras, WiFi, 3G/4G and partial GPS coverage, with the presence of buildings, open and covered areas, ramps and some vegetation. Several public-space scenarios could be set up in this area, such as markets, bars or shops, aiming to deploy robots in a real and controlled urban scenario to perform navigation, human-robot interaction and collaboration experiments for multiple applications.
Bobcat

Universidad de Sevilla (USE) Robotics, Vision and Control Group Bobcat

A commercial 4×4 gasoline Bobcat 2200 vehicle, modified to remotely control speed and steering. It has a 2D lidar installed and, thanks to the large size of the system, no weight restriction on the payload. It can operate in any kind of environment.
BRL – Bert

University of the West of England (UWE Bristol) Bristol Robotics Laboratory BRL – Bert

BERT-1 (Bristol Elumotion Robotic Torso) is a self-contained, fully automated robotic torso, designed and built by Elumotion to provide a robotic platform based on human anatomy at adult human scale. BERT-1 is a highly articulated manipulation platform and includes novel dexterous hands that allow emulation of human gesturing. This anthropomorphic robot is designed to resemble the human upper body and arms. Its degrees of freedom are approximately equal to those of the human upper limbs: two arms with 7 DOF each, plus a 2 DOF shoulder and a 2 DOF head.
BRL – Flying Arena

University of the West of England (UWE Bristol) Bristol Robotics Laboratory BRL – Flying Arena

BRL facilities include two Flying Arenas. The Large Flying Arena covers an area of 182 sqm and the Small one an area of 106 sqm. The Large Flying Arena is equipped with an infrared motion-tracking system and an overhead camera.
Cheetah-Cub-AL

École Polytechnique Fédérale de Lausanne (EPFL) BioRobotics Lab Cheetah-Cub-AL

Cheetah-Cub (https://biorob.epfl.ch/cheetah) was not fundamentally altered from its early development days, but some major changes are introduced with Cheetah-Cub-AL. The leg was redesigned and now features a diagonal spring that is symmetric with respect to the sagittal plane of the leg, cancelling the unwanted bending behaviour present in previous Cheetah-Cub versions. Additionally, by using classical CNC manufacturing techniques with aluminium, in combination with ball bearings in every joint, friction was reduced and the alignment of the axes and the repeatability of experiments were improved. The changes to the trunk are small but now provide easy access to the control board for development purposes. Another major change is the switch to a new operating system, Jokto, which improves stability and ease of use. Tuleu implemented inverse kinematics of the legs for control purposes, which allows gaits to be tuned much faster and more intuitively. The robot was featured recently in Prof. Ijspeert’s talk at TED Global Geneva.
Cobomanip

Commissariat à l’Energie Atomique (CEA) Interactive Robotics Lab Cobomanip

The Cobomanip assists the operator during handling operations. It is designed as a force generator rather than a robot driven in position-control mode. Throughout the workspace it will assist or oppose the movements of the operator according to the task definition and the requested assistance. Two distinct operating modes may therefore be considered:

  • Movements in free space: the Cobomanip provides perfect balancing with 3 or 4 DOFs. It behaves as if the load were handled in a zero-gravity area.

  • Constrained movements: motors (one per DOF) apply counteracting torques to restrict the manipulator’s movements to specific directions.

  • All friction is compensated by motors located in the frame of the cobot.

The COBOMANIP provides assistance to keep the operator’s attention focused on the main task, using virtual guides to constrain the movement within a specific part of the workspace. Such constraints, applied to teleoperation tasks, come in two distinct types:

  • The operator cannot enter a specific area of the workspace. The movement is possible until a boundary is reached.

  • Assistance to guide the operator: movements of the operator are limited to those allowed by a mechanism attached to the robot.
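The guided-assistance mode above can be pictured as projecting the operator's motion onto the direction a virtual guide allows, so that only the along-guide component passes through. An illustrative simplification of the idea (not CEA's actual controller):

```python
import math

def apply_guide(motion, guide_dir):
    """Project an operator motion vector onto the direction allowed by a
    virtual guide; the perpendicular component is suppressed."""
    norm = math.sqrt(sum(g * g for g in guide_dir))
    unit = tuple(g / norm for g in guide_dir)          # unit guide direction
    along = sum(m * u for m, u in zip(motion, unit))   # along-guide component
    return tuple(along * u for u in unit)
```

For example, with a guide along the x-axis, a diagonal push is reduced to its x component only.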

COMAU Dual arm robot

School of Advanced Studies Sant'Anna (SSSA) The BioRobotics Institute COMAU Dual arm robot

The Comau Smart dual-arm robot is a robotic system especially designed to perform assembly tasks. It is based on an innovative human-like approach, with a sensor-based platform and cognitive functions to support the assembly process. Its ease of use and its cooperation with humans in the different phases of each task also make it ideal for SME applications. Currently, many new devices and solutions are under development:

  • Devices for a fenceless approach (safety eyes by PILZ, stereo cameras)
  • A vision system for parts recognition
  • A force/torque sensor
  • Devices for easy programming (gesture and voice recognition)
  • A Kinect v2
  • Festo and Schunk grippers
  • Integration of the C5G Open control and ORL libraries

C5G Open is suitable for industrial applications that need additional sensors integrated in the system. For this purpose, the hardware parts used have to be compliant with industrial regulations.
Cooperative Robotic Manufacturing Station

Technical University Munich (TUM) Robotics and Embedded Systems Cooperative Robotic Manufacturing Station

The setup consists of several robotic arms (Staubli TX0 and TX90, 4× ABB IRB 120, KUKA LBR iiwa), end effectors (including the human-robot-interaction-safe R800 gripper) and a mock-up of a collaborative manufacturing cell equipped with a tactile SAPARO floor. The environment can easily be configured to represent different variations of manufacturing and robot manipulation scenarios involving both industrial robots and human operators. This installation is particularly useful for research on human-robot cooperation, multi-robot object manipulation, human tracking and detection for ensuring safety, etc. It provides unique opportunities to perform research in manipulative and collaborative robotics with bleeding-edge robotic sensors (e.g. the SAPARO floor) and several different manipulators, including human-safe ones (e.g. the KUKA iiwa). As such, it is attractive not only to the research community but also to industry (especially the SME segment), which requires innovative robotic solutions that are both flexible and safe for humans.
Da Vinci Research Kit (DVRK)

School of Advanced Studies Sant'Anna (SSSA) The BioRobotics Institute Da Vinci Research Kit (DVRK)

The da Vinci Research Kit (dVRK) is a research platform based on the da Vinci Surgical System developed and distributed by Intuitive Surgical Inc. The kit is a collection of first-generation da Vinci components that can be used to assemble a telerobotics platform which provides complete access to all levels of control via open-source electronics and software. The platform consists of a surgeon’s console to tele-operate the surgery and a patient-side system where the surgery takes place. The surgeon’s console consists of two Master Tool Manipulators, each having 8 DOF for dexterous and natural hand manipulation, and a foot-pedal tray. On the other side, at the patient’s end, there are two Patient Side Manipulators, which are controlled by the two Master Tool Manipulators. The interface between the two components is based on custom hardware consisting of motor controllers, coupled with FPGAs and connected to a PC running the control loops. The dVRK can be exploited for interfacing with different master manipulators, for testing force-feedback strategies, or for the integration of novel tools for surgery.
Da Vinci Research Kit (DVRK)

Imperial College London (IMPERIAL) The Hamlyn Centre Da Vinci Research Kit (DVRK)

The da Vinci Research Kit (dVRK) is a research platform based on the da Vinci Surgical System developed and distributed by Intuitive Surgical Inc. The kit is a collection of first-generation da Vinci components that can be used to assemble a telerobotics platform which provides complete access to all levels of control via open-source electronics and software. The platform consists of a surgeon’s console to tele-operate the surgery and a patient-side system where the surgery takes place. The surgeon’s console consists of two Master Tool Manipulators, each having 8 DOF for dexterous and natural hand manipulation, and a foot-pedal tray. On the other side, at the patient’s end, there are two Patient Side Manipulators, which are controlled by the two Master Tool Manipulators. The interface between the two components is based on custom hardware consisting of motor controllers, coupled with FPGAs and connected to a PC running the control loops. The dVRK can be exploited for interfacing with different master manipulators, for testing force-feedback strategies, or for the integration of novel tools for surgery.
Darius

Universidad de Sevilla (USE) Robotics, Vision and Control Group Darius

Self-designed hexacopter, built to be able to accomplish a variety of tasks. It can be controlled with two different types of autopilots: Pixhawk (PX4) and Naza V3. It can carry up to 8 kg of payload, including robotic arms. It is also foldable.
DJI F550

Universidad de Sevilla (USE) Robotics, Vision and Control Group DJI F550

Commercial DJI hexacopter. Much smaller than the others, it can only carry GPS and small cameras. The autopilot is a Pixhawk (PX4) and it is programmed with ROS/Ubuntu.
DJI Matrice 600

Universidad de Sevilla (USE) Robotics, Vision and Control Group DJI Matrice 600

The Matrice 600 (M600) is a flying platform designed for professional aerial photography and industrial applications. It is built to closely integrate with a host of powerful DJI technologies, including the A3 flight controller, the Lightbridge 2 transmission system, Intelligent Batteries and the Battery Management system, for maximum performance and quick setup. These excellent features make it capable of using self-designed robotic arms as payload.
Domotic House

Centre national de la recherche scientifique (CNRS) The Department of Robotics of LAAS Domotic House

Large experimental space reproducing the scenery of an apartment with an open roof. The environment is supplied with furniture amidst which various robots can navigate and execute daily tasks. The apartment is equipped with various sensors including a motion capture system for studying and experimenting human-robot interaction scenarios.
eBee drone

École Polytechnique Fédérale de Lausanne (EPFL) Laboratory of Intelligent Systems eBee drone

senseFly’s eBee is a fully autonomous and easy-to-use mapping drone. Use it to capture high-resolution aerial photos that you can transform into accurate orthomosaics (maps) and 3D models. The eBee package contains everything you need to start mapping: an RGB camera, batteries, a radio modem and the eMotion software.
Electric Car Testbed

Technical University Munich (TUM) Robotics and Embedded Systems Electric Car Testbed

The electric car testbed consists of four controlled drives directly connected to the axes of a Roading Roadster electric car. Both the car drive and other components, as well as the external torque applied to the axes, can be controlled, while the complete state of the car's sensors is registered. The station can be used to simulate desired driving conditions and evaluate the performance of wheel-drive control algorithms in a fully controlled, reproducible environment. Moreover, the station contains an augmented reality system, which can be used to control the car in a life-like way and allows safe testing of autonomous driving solutions.
Engineered Arts – RoboThespian

University of the West of England (UWE Bristol) Bristol Robotics Laboratory Engineered Arts – RoboThespian

Life-size, interactive, fully programmable humanoid robot exhibit suitable for public display, demonstration and academic research. A different head design containing a pico projector projects the entire face of the robot. Using InYaFace software, the face can take on any form and is animated in real time to show any expression. Hybrid design with pneumatic and DC servo motor actuators for the upper body, upper limbs and head, featuring gripping hands. Suitable for indoor use only. Standard domestic mains power in the range 100–250 V AC.
Engineered Arts – SociBot mini

University of the West of England (UWE Bristol) Ambient Assisted Living Laboratory Engineered Arts – SociBot mini

SociBot Mini integrates the core technologies of RoboThespian in a desktop-sized robot. Extra ports and multiple Ethernet connections make it easy to interface with your choice of hardware. The projective head and fully articulated neck make SociBot even more expressive than its larger sibling. Ideal for individual researchers, small-footprint telepresence, or just for playing around with at home, SociBot offers a simple and affordable introduction to advanced robotics. SociBot can track the position of more than 12 people at a time, even in a crowd. It can also detect gestures like hand waves and body poses. The internal head projector can map any face onto any mould – talk to us about customizing your facial features and projection options. See our Technology section for more details.
FUTURA platform for US-guided HIFU treatment

School of Advanced Studies Sant'Anna (SSSA) The BioRobotics Institute FUTURA platform for US-guided HIFU treatment

The FUTURA system is a robot-assisted platform designed for Ultrasound-guided High Intensity Focused Ultrasound (HIFU) treatment. The control of two independent anthropomorphic manipulators provides the FUTURA platform with high flexibility in terms of operating workspace and maneuverability. The platform is composed of: i) a robotic module, ii) a therapeutic module, and iii) a monitoring module. The robotic module is composed of two anthropomorphic industrial manipulators (ABB IRB 120) equipped with two force/torque sensors (ATI Mini 45). The monitoring module is composed of two different US probes: i) a 2D imaging US probe (Analogic Ultrasound PA7-4/12) confocal to the HIFU transducer, and ii) a motorized 3D imaging US probe (Analogic Ultrasound 4DC7-3/40) mounted on the second manipulator, both connected to the Analogic Ultrasound SonixTablet machine. The therapeutic module consists of a custom-made Focused Ultrasound System (FUS) with three main components: i) a multi-channel high-power signal generator (Image Guided Therapy), ii) a 16-channel phased annular array transducer (Imasonic), and iii) a coupling system (a small pillow filled with water) that provides a good acoustic path between the transducer and the patient. Remote control of the FUS generator allows the shooting parameters (e.g. focal depth) to be adjusted at a frequency of 20 Hz. The different modules of the FUTURA platform are mutually controlled through dedicated software developed in the Robot Operating System (ROS) framework. The FUS treatment is managed by the users through a dedicated Human Machine Interface with real-time visualization of the working scenario. The high modularity of the platform allows for the testing of different modalities: specific experiments dedicated to image guidance, force-controlled contact with human tissues, and obstacle-avoidance strategies between manipulators and patients/operators can also be set up within the overall framework.
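An annular-array HIFU transducer steers its focal depth electronically by delaying each ring's excitation so that all wavefronts arrive at the focus simultaneously. A minimal sketch of that idea, with purely illustrative geometry — the ring radii, focal depth and sound speed below are assumptions, not the specification of the Imasonic transducer:

```python
import math

def focusing_delays(ring_radii_m, focal_depth_m, c=1500.0):
    """Firing delays (s) for an annular-array transducer so that every
    ring's wavefront arrives at an on-axis focal point at the same time.
    c is an assumed speed of sound in tissue (~1500 m/s)."""
    # Path length from each ring to the on-axis focal point.
    paths = [math.hypot(r, focal_depth_m) for r in ring_radii_m]
    longest = max(paths)
    # Rings with shorter paths fire later; the outermost ring fires first.
    return [(longest - p) / c for p in paths]

# Hypothetical 16-ring geometry: radii from 5 to 35 mm, focus at 60 mm depth.
radii = [0.005 + i * 0.002 for i in range(16)]
delays = focusing_delays(radii, 0.060)
```

Re-running this at 20 Hz with a new `focal_depth_m` is, in essence, what electronic focal-depth adjustment amounts to.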
Haption ABLE

Commissariat à l’Energie Atomique (CEA) Interactive Robotics Lab Haption ABLE

The ABLE is an exoskeleton haptic interface for the arm and the hand. It provides 4 to 7 active degrees of freedom. ABLE is the first industrialized exoskeleton haptic device: it can be ordered off the shelf. The ABLE is available in right-handed and left-handed configurations.
Haption Virtuose 6D

Commissariat à l’Energie Atomique (CEA) Interactive Robotics Lab Haption Virtuose 6D

The Virtuose 6D is the only product on the market combining high force feedback in all 6 degrees of freedom with a large workspace. The Virtuose 6D is especially suited for scale-one manipulation, both virtual and remote (robot control). It is also used as a co-manipulation medical robot and in rehabilitation applications. Its handle has 3 buttons (2 programmable) and is replaceable through our tool changer; you can, for instance, use it with 3D-printed objects. This product is available in a High Force version for applications in need of a very high level of force feedback.
HMI Human media interaction lab facilities

University of Twente (UT) Department of Robotics HMI Human media interaction lab facilities

The lab offers extensive utilities for Human-Robot Interaction experimentation and testing, in laboratory or in-the-wild conditions. There are two outfitted lab environments: a smaller (50 m2) lab outfitted with virtual reality and tracking facilities, and the DesignLab. The DesignLab (2000 m2) is a platform at the University of Twente for multidisciplinary collaboration, innovation and creativity. It connects students, educational staff, researchers, businesses, societal organisations and governments through its Science2Design4Society method. To provide an optimal infrastructure for team-based collaboration and multidisciplinary research and education, DesignLab offers dynamic working spaces in a state-of-the-art facility. The ambition of DesignLab is to integrate TeamScience and Design Thinking into education and research, and to use scientific insights to build a better tomorrow today. It includes a maker space, an electronics lab, laser cutting and 3D printing facilities, as well as a fully outfitted user lab with a control room.
Human Motion Analysis with Vicon

Karlsruhe Institute of Technology (KIT) High Performance Humanoid Technologies Lab (H²T) Human Motion Analysis with Vicon

The human motion capture studio provides a unique facility for capturing and analyzing human motion as well as for mapping it to humanoid robots. The studio is equipped with 14 Vicon MX cameras (1 megapixel resolution and 250 fps), a microphone array and several Kinect cameras. Several tools exist for post-processing recorded motion data: normalization, synchronization of different sensor modalities, and visualization. In addition, a reference model of the human body (the Master Motor Map, MMM) and a standardized marker set allow unified representations of captured human motion, and the transfer of subject-specific motions to robots with different embodiments. The motion data in the database covers human as well as object motions. The raw motion data entries are enriched with additional descriptions and labels. Besides the captured motion in its raw format (e.g. marker motions), information about the subject's anthropometric measurements and the setup of the scene, including environmental elements and objects, is provided. The motions are annotated with motion description tags that allow efficient search for certain motion types through structured queries.
HVSLIM

Commissariat à l’Energie Atomique (CEA) Interactive Robotics Lab HVSLIM

Prototype of a lower-limb exoskeleton.
I-Support soft arm

School of Advanced Studies Sant'Anna (SSSA) The BioRobotics Institute I-Support soft arm

The I-Support platform consists of a manipulator made of two identical interconnected modules. Each module, 60 mm in diameter and 200 mm in length, comprises: three pairs of McKibben-based bellows-like flexible fluidic actuators (FFAs) arranged at 60° with respect to three cables; an internal channel designed for supplying water/soap, which can also be used to pass tools up to the tip; and a layered structure along the module that directs the actuation energy in the intended direction. Each module has elongation, contraction, omnidirectional bending and variable stiffness capabilities. Because the manipulator is modular and uses Bowden-cable technology, it has multiple advantages: (i) fast replacement, (ii) the possibility of using one or two modules, (iii) local control of each single segment, and (iv) an easy design for sensor integration. The manipulator has been designed to assist elderly people during showering, but it can be used as a multipurpose platform for implementing and testing control algorithms for soft manipulators.
IH2 Azzurra Hand

School of Advanced Studies Sant'Anna (SSSA) The BioRobotics Institute IH2 Azzurra Hand

Intrinsic robotic hand with all functional components (5 motors, tactile sensors and control electronics) integrated in the palm and in the underactuated, self-adaptive fingers. Able to perform multiple grasps and sense objects. Simple communication interface (RS-232 over USB or Bluetooth). Standard prosthetic wrist attachments available (compatible with Ottobock QWD). The compact size of these hands allows their use in research, evaluation and clinical experience with humans in real daily-living environments, on human-machine interfaces (either invasive or non-invasive) and control (EMG, ENG, EEG, sensory feedback systems, etc.). Moreover, thanks to their light weight and anthropomorphism, they are suitable as robotic end-effectors on payload-limited robotic arms.
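The simple serial interface mentioned above typically means the hand is driven by short command frames over the port. A minimal sketch of how such a link might be wrapped, with an injectable transport so it can be exercised without hardware — the opcodes, framing and checksum below are entirely hypothetical and are not the actual IH2 Azzurra protocol (which is documented by the manufacturer):

```python
class IH2Link:
    """Hypothetical command wrapper for a serial-controlled hand.
    The opcodes and frame layout are illustrative only."""

    GRASP = 0x01  # hypothetical opcode
    OPEN = 0x02   # hypothetical opcode

    def __init__(self, transport):
        # transport: any object with a write(bytes) method,
        # e.g. a pyserial Serial instance or a test stub.
        self.transport = transport

    def _send(self, opcode, arg):
        # Hypothetical frame: header, opcode, argument, simple checksum.
        frame = bytes([0xAA, opcode, arg, (opcode + arg) & 0xFF])
        self.transport.write(frame)

    def grasp(self, closure_pct):
        self._send(self.GRASP, max(0, min(100, closure_pct)))

    def open_hand(self):
        self._send(self.OPEN, 0)


class FakePort:
    """Test stub that records written frames instead of touching hardware."""
    def __init__(self):
        self.frames = []
    def write(self, data):
        self.frames.append(data)


port = FakePort()
hand = IH2Link(port)
hand.grasp(60)
hand.open_hand()
```

Swapping `FakePort` for a real serial port object is all that would change when moving from bench tests to the device.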

Indoor robots

Centre national de la recherche scientifique (CNRS) The Department of Robotics of LAAS Indoor robots

Several models of indoor robots for navigation or manipulation, equipped with specific sensors and motor capabilities.
InMotion wrist

School of Advanced Studies Sant'Anna (SSSA) The BioRobotics Institute InMotion wrist

The InMotion WRIST™ exoskeletal robot is capable of lifting even a severely impaired neurologic patient’s hand against gravity, overcoming most forms of hypertonicity. The InMotion WRIST™ exoskeletal robot accommodates the range of motion of a normal wrist in everyday tasks.
IRIcar robot

Universitat Politècnica de Catalunya (UPC) IRI IRIcar robot

The autonomous car is based on a standard golf cart that has been robotized. The robot is capable of carrying up to two adult people on slopes of up to 30 degrees. With the information provided by a 32-beam 360° laser range sensor, a 360° camera and a frontal 2D range sensor, the autonomous car can navigate in 2D environments taking the Ackermann constraints into account. The robot has two safety laser scanners that allow it to navigate safely around people. The robot is not homologated and cannot travel in regular traffic.
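Navigating under Ackermann constraints means the two front wheels steer at different angles so that both roll about a common turn centre. A minimal sketch of the geometry, with hypothetical golf-cart dimensions (wheelbase and track are illustrative, not the IRIcar's actual measurements):

```python
import math

def ackermann_angles(wheelbase_m, track_m, turn_radius_m):
    """Inner and outer front-wheel steering angles (rad) for a turn of
    the given radius, measured to the centre of the rear axle.
    The inner wheel follows a tighter arc, so it must steer more."""
    inner = math.atan(wheelbase_m / (turn_radius_m - track_m / 2))
    outer = math.atan(wheelbase_m / (turn_radius_m + track_m / 2))
    return inner, outer

# Hypothetical geometry: 1.65 m wheelbase, 0.98 m track, 5 m turn radius.
inner, outer = ackermann_angles(1.65, 0.98, 5.0)
```

A planner that respects these constraints simply never commands a turn radius below the vehicle's minimum.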
Kawada Robotics HRP-2

Centre national de la recherche scientifique (CNRS) The Department of Robotics of LAAS Kawada Robotics HRP-2

Two human-size humanoid robots in a fully equipped experimental room. LAAS has long-standing experience in humanoid robot motion planning and control. After having demonstrated whole-body motion generation capabilities on HRP-2, LAAS is now developing new algorithms to enable physical interaction of humanoid robots with their environment and with humans. The new robot Pyrène, built by PAL Robotics based on the experience of LAAS, is powerful and designed to be torque controlled.
KINOVA JACO

Commissariat à l’Energie Atomique (CEA) Interactive Robotics Lab KINOVA JACO

The JACO robot arm allows individuals to do the many "daily living" activities that promote self-reliance, independence and comfort, all things that contribute to their well-being. Safe for humans, JACO empowers people living with upper-mobility impairments diagnosed with one of the following conditions: Muscular Dystrophy (MD), Spinal Muscular Atrophy (SMA), Tetraplegia, Amyotrophic Lateral Sclerosis (ALS) or Cerebral Palsy (CP).
KIT Prosthetic Hand

Karlsruhe Institute of Technology (KIT) High Performance Humanoid Technologies Lab (H²T) KIT Prosthetic Hand

A five-finger, 3D-printed hand prosthesis with an underactuated mechanism, sensors and an embedded control system. It has an integrated RGB camera in the base of the palm and a colour display on the back of the hand. All functional components are integrated into the hand, which is dimensioned according to a 50th-percentile male human hand. It is accessible via a simple communication interface (serial, directly or via Bluetooth) or controllable via buttons. The camera and display allow for studies on vision-based semi-autonomous grasping and user feedback in prosthetics. As a stand-alone device the hand allows easy usage in different environments and settings.
KIT-EXO-1

Karlsruhe Institute of Technology (KIT) High Performance Humanoid Technologies Lab (H²T) KIT-EXO-1

The exoskeleton KIT-EXO-1 was developed with the aim of augmenting human capabilities or for use in rehabilitation applications. It has two active DOFs at the knee and ankle joints to support flexion/extension movement. The linear actuators consist of brushless DC motors coupled to planetary roller screws and an optional serial spring. They are equipped with absolute and relative position encoders as well as a force sensor, and can be controlled via the ArmarX software framework (https://armarx.humanoids.kit.edu). Eight additional force sensors distributed over the exoskeleton measure interaction forces between the user and the exoskeleton at the thigh, shank and foot, and can be used for research on intuitive exoskeleton control or to assess the kinematic compatibility of new joint mechanisms.
KOMPAÏ Robots – KOMPAÏ 1

University of the West of England (UWE Bristol) Ambient Assisted Living Laboratory KOMPAÏ Robots – KOMPAÏ 1

Service robot on mobile base, designed to work in indoor environments to support caregivers and healthcare assistants.
KUKA iiwa7 r800/iiwa14 r820 Platform

Imperial College London (IMPERIAL) The Hamlyn Centre KUKA iiwa7 r800/iiwa14 r820 Platform

The LBR iiwa has integrated, sensitive torque sensors in all seven axes. These endow the lightweight robot with contact-detection capabilities and programmable compliance. It masters force-controlled joining operations and continuous-path processes for which the position of the objects must be determined sensitively. It can also handle fragile and sensitive objects without damaging them. In many cases, the integrated sensitivity of the LBR iiwa allows the use of simpler and less expensive tools. The KUKA iiwas are robotic arms with the strength and speed of an industrial robot. It is up to the programmer to set up the appropriate safety settings on the controller and to clear the working environment before testing any code. T1 mode should be used almost exclusively during testing phases to limit the speed of the robot in a safe and reliable way.
KUKA KR60-3

University of the West of England (UWE Bristol) Robotics Innovation Facility KUKA KR60-3

Six-axis industrial-grade robot arm. Flexible and versatile, with high repeatability, medium load capacity, high speed and high duty cycles. EtherCAT communication (Industrial Ethernet), PROFIsafe, Beckhoff 32/32 digital I/Os at 24 VDC and 4-channel outputs at 24 VDC, 2 A.
Laboratory space (equipment)

Istituto Italiano di Tecnologia (IIT) iCub Facility Laboratory space (equipment)

The laboratory space of the iCub Facility at the Istituto Italiano di Tecnologia is a fully equipped lab supporting research and development in the field of humanoid robotics. It includes a complete electronics assembly, testing and re-working facility with the ability to design microcontroller and FPGA cards, develop firmware and system software, and test motors and sensors. The iCub Facility has workstations and engineers with more than 10 years of experience in humanoid robotics for mechanical design. The Facility includes four iCubs and two R1 robots (see related technical sheets). The computational infrastructure consists of servers and small clusters (including GPUs) to train and run machine learning algorithms. In addition, measuring equipment includes a 10-camera Vicon system for motion capture and a virtual reality system (with goggles, e.g. Oculus) to study teleoperation of humanoids in complex application scenarios.
LBR IIWA robot

Commissariat à l’Energie Atomique (CEA) Interactive Robotics Lab LBR IIWA robot

The LBR iiwa enables humans and robots to work together on highly sensitive tasks in close cooperation. Thanks to its joint torque sensors in all seven axes, the LBR iiwa can detect contact and is able to work directly with its human operator. You can choose from three operating modes and program the LBR iiwa via simulation: indicate the desired position and it will remember the coordinates of the path point. The LBR iiwa's controller, the KUKA Sunrise Cabinet, simplifies the quick start-up of even complex applications.
LOPES

University of Twente (UT) Department of Robotics LOPES

The LOPES can be used to assist patients (e.g. stroke, SCI) during walking or to assess gait impairments. It has eight powered degrees of freedom (hip flexion/extension, hip abduction/adduction, knee flexion/extension, pelvis forward/backward and pelvis mediolateral). Other degrees of freedom are left free. The robot is attached with a minimal number of clamps, which results in a short donning and doffing time. It is admittance controlled and allows for control over the complete spectrum from low to high impedance. Kinematics and interaction forces are measured by the device, and it can easily be combined with EMG measurements. It allows testing of new controllers (e.g. for exoskeletons) or assessment algorithms in a safe environment.
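In admittance control, the measured interaction force drives a virtual mass-damper whose motion becomes the position setpoint for a stiff inner position loop. A minimal 1-DOF sketch of that idea — the virtual mass and damping values are illustrative assumptions, not the LOPES controller's parameters, and the inner position loop is idealized away:

```python
def simulate_admittance(forces, mass=5.0, damping=20.0, dt=0.001):
    """1-DOF admittance control: interaction force f drives virtual
    dynamics m*a + d*v = f; the resulting position is the setpoint
    (assumed perfectly tracked by an ideal inner position loop).
    Returns the trajectory of the virtual position."""
    x, v = 0.0, 0.0
    traj = []
    for f in forces:
        a = (f - damping * v) / mass  # virtual mass-damper dynamics
        v += a * dt
        x += v * dt
        traj.append(x)
    return traj

# A constant 10 N push for one second moves the setpoint steadily forward.
traj = simulate_admittance([10.0] * 1000)
```

Lowering the virtual mass and damping makes the device feel transparent; raising them makes it stiff — the "low to high impedance" spectrum the entry describes.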
Magnetic Micro Manipulation Platform

School of Advanced Studies Sant'Anna (SSSA) The BioRobotics Institute Magnetic Micro Manipulation Platform

The Magnetic Micro Manipulation Platform consists of a magnetic field generator and a control system. The magnetic field generator is composed of two orthogonal pairs of Helmholtz coils, which generate a uniform field, and two orthogonal pairs of Maxwell coils, which generate a uniform field gradient. The 2D workspace is 2 mm x 3.5 mm. The maximum magnetic field and gradient are 12 mT and 1.2 T/m, respectively, generated by currents in the range 0-2 A. The control system comprises a laptop, a joypad (SAITEK P580 Blue Rumble Pad), a camera (BASLER scA1390-17gc), a data acquisition board (NI DAQ USB-6259) and custom electronics based on a closed-loop op-amp configuration with a Darlington stage. The driving signals for the electronics are generated through the DAQ board by means of a software program developed in LabVIEW (National Instruments, Inc., USA). The magnetic fields and gradients are controlled by the user through the joypad interface. The GUI (graphical user interface) can be adapted to fulfil specific testing needs. Likewise, the camera for 2D visualization of the workspace can be adapted to the type of microrobot under test.
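The field produced by a Helmholtz pair is linear in the coil current, which is what makes joypad-to-field mapping straightforward. A minimal sketch of the standard centre-field formula — the turn count and coil radius below are illustrative assumptions chosen so the 2 A limit lands near the quoted 12 mT, not the actual SSSA coil parameters:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability (T*m/A)

def helmholtz_field(current_a, turns, radius_m):
    """Uniform flux density (T) at the centre of a Helmholtz coil pair:
    B = (4/5)^(3/2) * mu0 * n * I / R."""
    return (4.0 / 5.0) ** 1.5 * MU0 * turns * current_a / radius_m

# Illustrative parameters: 667 turns per coil, 10 cm radius.
# At the 2 A current limit this gives roughly 12 mT.
b_max = helmholtz_field(2.0, 667, 0.10)
```

Because B scales linearly with I, the LabVIEW control layer only needs a per-axis gain to convert a desired field into a commanded current.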
MBZIRC Hexarotor

Universidad de Sevilla (USE) Robotics, Vision and Control Group MBZIRC Hexarotor

Self-designed hexarotor controlled with a Pixhawk (PX4) autopilot. An Intel NUC (i5) is also embedded for additional computational capability. It works with ROS and can be simulated with Gazebo. The payload includes, among others: laser sensors, GPS, a stereo camera and an electromagnet. Self-designed robotic arms are also used on this platform, making it a multi-task aerial robot.
Micro-Robot Fabrication and Characterisation

Imperial College London (IMPERIAL) The Hamlyn Centre Micro-Robot Fabrication and Characterisation

The Hamlyn Centre Micro-Robot Fabrication and Characterisation platform is built around our clean room for microfabrication, equipped with a NanoScribe system and expertise in both untethered and tethered microrobots. In addition, a multimaterial fibre platform allows for integration of those tethered robots into complex systems. Precision assembly tools complete the platform. Characterisation tools adapted to these devices allow for full characterisation and development of the prototypes. Finally, various micro- and nano-manipulation devices, both mechanical and optical, as well as micro force sensors, can be used for robot actuation and operation.
Microsurgical Robot

Technical University Munich (TUM) Robotics and Embedded Systems Microsurgical Robot

The microsurgical platform is a setup originally designed and developed for ophthalmic applications, but it can also be used for other medical and biological procedures, as well as a training platform for micro-robotic applications. Multidisciplinary teams of researchers, including engineers, biologists and clinicians, can perform their micromanipulation tasks using the platform. Integration of imaging devices and microscopes into the setup enables interdisciplinary projects such as image-guided micromanipulation.
Motion capture and virtual reality platform

Imperial College London (IMPERIAL) The Hamlyn Centre Motion capture and virtual reality platform

This laboratory is equipped with a state-of-the-art multi-camera motion capture system, various devices for virtual reality immersion, and sporting equipment, making it ideal for testing wearable robots, body sensor networks and life-assistance robots.

User studies in a virtual or real environment can be coupled with an EEG measurement system and motion capture in an open space environment.

Motion capture arena

École Polytechnique Fédérale de Lausanne (EPFL) Laboratory of Intelligent Systems Motion capture arena

This facility is a large room (~10x10x5.5 m volume) equipped with an array of OptiTrack motion capture cameras and accompanying software. The system can track up to 50 rigid objects with a position error of less than 1 mm and a frame rate of up to 200 fps. The cameras illuminate the scene with IR light, so special passive markers must be placed on the tracked objects. Most of the wall surface is covered with a protective net. Initially designed for experiments with flying robots, this arena can be used in many other robotic applications. An AR drone equipped with markers and an open-source flight controller is available on site for experiments.
Motion Capture Facilities

Centre national de la recherche scientifique (CNRS) The Department of Robotics of LAAS Motion Capture Facilities

Large experimental room equipped with an optoelectronic Motion Capture System to compute the position of reflective markers, force plates embedded in the floor to measure ground reaction forces, 6-axis force sensors to measure additional force contacts, wireless EMG to measure the activity of muscles. The system is provided with a processing software to reconstruct the whole-body dynamics and identify key elements of the musculoskeletal activity.
NAO Robots

Karlsruhe Institute of Technology (KIT) High Performance Humanoid Technologies Lab (H²T) NAO Robots

NAO is an autonomous, programmable humanoid robot developed by Aldebaran Robotics in 2006 (now SoftBank Robotics [1]). The robot is commonly used for research and education purposes. It is best known for its participation in the RoboCup Standard Platform Soccer League [2], in which the robots play fully autonomously and make decisions independently of each other, but can also cooperate. NAO can be programmed graphically via the program Choregraphe, which is commonly used by young students with no programming experience. The robot can also be programmed in Python and C++, which allows a wide range of complex tasks to be fulfilled. The robot can also be used to demonstrate mathematical modelling on humanoid robots for solving complex tasks.
NeuiCub

School of Advanced Studies Sant'Anna (SSSA) The BioRobotics Institute NeuiCub

The NeuiCub platform is composed of an iCub robot controlled through a SpiNNaker neuromorphic board (SpiNN-5). It is meant for neurorobotic experiments that involve detailed brain models implemented with spiking neural networks. The SpiNNaker neuromorphic platform comprises arrays of low-power, parallel custom chips (each containing 18 ARM9 cores) running a digital software simulation of neurons and synapses. The SpiNN-5 board includes 48 processors. The system's philosophy focuses on large brain simulation and spike communication in real time while scaling up to biological scale. A standard SpiNNaker neural model is primarily configured through the "sPyNNaker" implementation of the Python-based PyNN modelling framework. The version of the iCub humanoid robot available at the BioRobotics Institute has a head with 6 degrees of freedom (DOFs: 3 for the eyes and 3 for neck control), two arms (7 DOFs each) and one hand (3 DOFs for the thumb, 2 for the index finger, 2 for the middle finger, 1 for the coupled ring and little fingers, and 1 for adduction/abduction). The iCub robot available at the BioRobotics Institute is also equipped with an inertial sensor, two Dragonfly cameras and tactile sensors in the hand.
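SpiNNaker simulates spiking point-neuron models of the kind PyNN describes, the leaky integrate-and-fire (LIF) neuron being the canonical example. A minimal pure-Python sketch of that model (all parameters are illustrative, and this is a plain Euler simulation, not the sPyNNaker runtime):

```python
def lif_spikes(input_current, n_steps=1000, dt=1e-4,
               tau=0.02, r_m=1e7, v_thresh=0.015, v_reset=0.0):
    """Leaky integrate-and-fire neuron: tau * dV/dt = -V + R*I.
    Integrates with forward Euler and returns the time steps at
    which the neuron spiked."""
    v = 0.0
    spikes = []
    for step in range(n_steps):
        v += dt / tau * (-v + r_m * input_current)
        if v >= v_thresh:       # threshold crossing -> emit a spike
            spikes.append(step)
            v = v_reset         # reset the membrane potential
        # (no refractory period in this minimal sketch)
    return spikes

# A 2 nA input drives repetitive firing; zero input stays silent.
spikes = lif_spikes(2e-9)
```

On the real platform one would instead declare populations of such neurons in PyNN and let the SpiNN-5 board simulate them in real time.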
Neuromorphic Artificial Touch Sensors

School of Advanced Studies Sant'Anna (SSSA) The BioRobotics Institute Neuromorphic Artificial Touch Sensors

Neuromorphic Artificial Touch Sensors are a platform composed of an artificial tactile finger equipped with an array of 4 tactile MEMS sensors (MicroTAF), each having 4 output channels, and a National Instruments single-board RIO. It encodes not only the normal force but also tangential forces. Data generated from the fingertip can be raw, i.e. the conversion of the MEMS outputs, or neuromorphic, i.e. sequences of neural-like spikes. The artificial tactile finger is available in two versions: one with a thick covering layer emulating the firing behaviour of type II human mechanoreceptors (Ruffini and Pacini), and another with a thin covering layer emulating the firing behaviour of type I mechanoreceptors (Merkel and Meissner). The platform is capable of discriminating between surfaces with different roughness, even between different everyday surfaces (such as glass, wood, paper and more). The artificial tactile finger can be mounted on the IH2 Azzurra Hand with a proper adapter, or on any other hand/arm with a simple mechanical interface.
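One common way to turn a sampled force signal into neural-like spikes is send-on-delta encoding: emit a signed event whenever the signal moves by more than a fixed threshold from the last encoded level. This is a generic sketch of that technique, not the actual MicroTAF encoding pipeline, and the threshold and ramp values are illustrative:

```python
def delta_encode(samples, threshold=0.1):
    """Send-on-delta spike encoding: emit an UP (+1) or DOWN (-1)
    event whenever the signal moves by more than `threshold` from
    the last encoded level. Returns (sample_index, polarity) pairs."""
    events = []
    level = samples[0]
    for i, s in enumerate(samples[1:], start=1):
        while s - level >= threshold:   # signal rose past the next level
            level += threshold
            events.append((i, +1))
        while level - s >= threshold:   # signal fell past the next level
            level -= threshold
            events.append((i, -1))
    return events

# A rising-then-falling force ramp produces UP events, then DOWN events.
ramp = [0.05 * i for i in range(11)] + [0.5 - 0.05 * i for i in range(11)]
events = delta_encode(ramp)
```

Spike-based outputs like this carry information in event timing and polarity rather than in absolute sample values, which is what makes them amenable to neuromorphic processing.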
Oncilla

École Polytechnique Fédérale de Lausanne (EPFL) BioRobotics Lab Oncilla

Oncilla is a compliant quadruped robot developed during the FP7 European project AMARSi (Adaptive Modular Architectures for Rich Motor Skills; project start March 2010, duration 48 months; 4 Oncilla copies built and distributed, 2 remaining at BIOROB). The goal of the AMARSi project was to improve the richness of robotic motor skills. Oncilla is a highly sensorized robot with pantographic legs (ASLP legs) as well as an abduction/adduction (AA) mechanism. The sensorization features encoders on each joint and motor, an IMU, and new ground contact sensors in the feet (3D force sensors). The research done by the BIOROB team focuses on closed-loop rough-terrain locomotion and richer motor behaviours through a combination of CPGs and reflexes.
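A central pattern generator (CPG) is often modelled as coupled phase oscillators whose coupling pulls the legs into a desired phase relationship. A minimal two-oscillator sketch of that idea — frequency, coupling gain and the anti-phase target are illustrative assumptions, not the BIOROB controller's parameters:

```python
import math

def couple_cpg(steps=20000, dt=1e-3, freq_hz=1.0,
               coupling=5.0, target_offset=math.pi):
    """Two phase oscillators with diffusive coupling that drives their
    phase difference toward `target_offset` (anti-phase here, as in a
    left/right leg alternation). Returns the final phase difference."""
    th1, th2 = 0.0, 1.0           # arbitrary initial phases
    w = 2 * math.pi * freq_hz     # shared intrinsic frequency
    for _ in range(steps):
        d = math.sin(th2 - th1 - target_offset)
        th1 += dt * (w + coupling * d)
        th2 += dt * (w - coupling * d)
    return (th2 - th1) % (2 * math.pi)

offset = couple_cpg()
```

Joint setpoints are then generated as functions of each oscillator's phase, and reflexes perturb the phases in response to sensor events such as foot contact.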
Operation room environment for medical robot prototype

Imperial College London (IMPERIAL) The Hamlyn Centre Operation room environment for medical robot prototype

Our operating theatre platform is equipped with the various equipment and environment found in an operating room specialised in neurosurgery. The platform includes a large number of phantoms for simulation of blood vessels, ultrasound and various other body parts, as well as a mannequin. Several screens can be used to simulate the feedback that surgeons get during an operation. Various eye-tracking systems are available for advanced control and augmented reality applications. Robot control systems and haptic devices are available for manipulation studies.
Optitrack indoor testbed

Universitat Politècnica de Catalunya (UPC) IRI Optitrack indoor testbed

Indoor testbed based on an indoor positioning system that uses 20 Optitrack Flex13 infrared cameras. This system can calculate the position and orientation of moving objects, previously tagged with special markers, within the volume of the testbed (11x7x2.5 m) in real time (with an update rate of up to 120 Hz).
Outdoor robots

Centre national de la recherche scientifique (CNRS) The Department of Robotics of LAAS Outdoor robots

Three rover robots designed and equipped for outdoor navigation: two RMP 400 and 440 robots, and one Sterela robot.
PAL Robotics Pyrène

Centre national de la recherche scientifique (CNRS) The Department of Robotics of LAAS PAL Robotics Pyrène

Two human-size humanoid robots in a fully equipped experimental room. LAAS has long-standing experience in humanoid robot motion planning and control. After having demonstrated whole-body motion generation capabilities on HRP-2, LAAS is now developing new algorithms to enable physical interaction of humanoid robots with their environment and with humans. The new robot Pyrène, built by PAL Robotics based on the experience of LAAS, is powerful and designed to be torque controlled.
PAL Robotics Tiago

University of the West of England (UWE Bristol) Bristol Robotics Laboratory PAL Robotics Tiago

Service robot on a mobile base, designed to work in indoor environments. Laser range-finder with mapping and localization in unstructured indoor environments. People-aware multi-sensor navigation and obstacle avoidance. Front RGB-D camera for object recognition and pose estimation, face detection & recognition, and people detection. Multi-language text-to-speech & speech recognition. Remote control with tablet; telepresence and teleoperation. Pick & place with grasping via a dexterous 7-DoF manipulator, with lead-through and force sensing. Interchangeable end-effector with force-torque sensing. Large workspace: from ground level to 1.5 m.
Pioneer 3-AT

Universidad de Sevilla (USE) Robotics, Vision and Control Group Pioneer 3-AT

The PIONEER 3-AT is a highly versatile four-wheel-drive robotic platform. Powerful, reliable, flexible and easy to use, the P3-AT is a popular team performer for outdoor or rough-terrain projects. It offers an embedded computer option, opening the way for onboard vision processing, Ethernet-based communications, laser, DGPS and other autonomous functions. It is controlled with ROS/Ubuntu using a laptop.
PK2

Commissariat à l’Energie Atomique (CEA) Interactive Robotics Lab PK2

Thanks to its unique mechanical qualities, the SYBOT brings competitiveness and attractiveness to manual operations that cannot be automated. Through agile automation, SYBOT simplifies the response to flexibility and variability requirements for all types of processes, such as grinding. Without special training, the operator integrates and optimizes the use of the COBOT, enhancing their know-how in a work environment that does not require any modification. SYBOT interactive COBOTs improve productivity while reducing operator fatigue.
Ranger

Universidad de Sevilla (USE) Robotics, Vision and Control Group Ranger

A commercial fixed-wing drone specially designed for FPV flights. It offers plenty of space inside the body for FPV equipment and cameras, and is compatible with most camera mount models. A perfect system for FPV flights and video recording, it can also carry a larger payload than other models of the same weight.
RAVEN II Surgical Robot

Imperial College London (IMPERIAL) The Hamlyn Centre RAVEN II Surgical Robot

The RAVEN is a proven, third generation surgical robotics testbed that provides the nucleus of an open innovation community. This community is united in the application and support of a common platform, and each member has the power to pursue and develop their own intellectual property. Improvements made by the community to the core robotic manipulator will be made available to the entire group. Individual members then have the opportunity to pursue their own proprietary advances in procedures, attached instruments, supervisory software, and human/machine interfaces.
Rethink Robotics Baxter

University of the West of England (UWE Bristol) Robotics Innovation Facility Rethink Robotics Baxter

Collaborative, compliant dual-arm robot. Power- and force-limited by design, with series elastic actuators and torque sensors. The robot includes integrated collision detection, a lead-through mode, parallel grippers and integrated vision. It features over-actuated arms (an additional external link) for easy repositioning, alternative configurations and object avoidance. Flexible and safe to work with, without the need for a cage or additional safety systems.
RoboGen

École Polytechnique Fédérale de Lausanne (EPFL) Laboratory of Intelligent Systems RoboGen

RoboGen™ is an open source platform for the co-evolution of robot bodies and brains. It has been designed with a primary focus on evolving robots that can be easily manufactured via 3D-printing and the use of a small set of low-cost, off-the-shelf electronic components. It features an evolution engine, and a physics simulation engine. Additionally it includes utilities for generating design files of body components to be used with a 3D-printer, and for compiling neural-network controllers to run on an Arduino microcontroller board.
Robotic arms: Franka Emika

University of Twente (UT) Department of Robotics Robotic arms: Franka Emika

At RaM we have multiple general-purpose robotic arms: the KUKA LBR4+ arm and Franka Emika Panda arms. Each is a 7-axis, fully actuated robot with torque sensing and control. Each joint is equipped with a position sensor on the input side and position and torque sensors on the output side. The robots can thus be operated with position, velocity and torque control. Unlike typical factory robots, which are so dangerous they are often put inside cages, these arms can operate among people. They are designed to perform tasks that require direct physical contact in a carefully controlled manner. These include drilling, screwing and buffing, as well as a variety of inspection and assembly tasks that electronics manufacturers in particular have long wanted to automate. The KUKA robot is programmable in C++. The Franka Emika robots are set up with a Robot Operating System (ROS) interface, and hence may be programmed in a variety of languages such as C++ and Python.
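Torque control is what makes joint impedance behaviour possible on these arms: a commanded torque of the form τ = K(q_d − q) − D·q̇ makes the joint behave like a programmable spring-damper. A minimal 1-DOF simulation of that law on a pure inertia — the gains and inertia are illustrative, and this is not the arms' actual controller:

```python
def impedance_step(q_target, inertia=0.5, k=50.0, d=10.0,
                   dt=1e-3, steps=5000):
    """Simulate a torque-controlled joint under the impedance law
    tau = k*(q_target - q) - d*dq, acting on a pure inertia.
    Returns the final joint position (rad)."""
    q, dq = 0.0, 0.0
    for _ in range(steps):
        tau = k * (q_target - q) - d * dq  # impedance control law
        ddq = tau / inertia                # joint dynamics: I*ddq = tau
        dq += ddq * dt
        q += dq * dt
    return q

# Step the virtual spring's rest position to 0.8 rad; the joint settles there.
q_final = impedance_step(0.8)
```

Lowering K makes the arm yield softly to contact forces — the property that lets these robots work safely among people.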
Robotic arms: KUKA LBR4+

University of Twente (UT) Department of Robotics Robotic arms: KUKA LBR4+

At RaM we have multiple general-purpose robotic arms: a KUKA LBR4+ arm and Franka Emika Panda arms. Each is a 7-axis fully actuated robot with torque sensing and control. Each joint is equipped with a position sensor on the input side and position and torque sensors on the output side. The robots can thus be operated with position, velocity and torque control. Unlike typical factory robots, which are so dangerous they are often put inside cages, these arms can operate among people. They are designed to perform tasks that require direct physical contact in a carefully controlled manner. These include drilling, screwing and buffing, as well as a variety of inspection and assembly tasks that electronics manufacturers in particular have long wanted to automate. The KUKA robot is programmable in C++. The Franka Emika robots are set up with a Robot Operating System (ROS) interface, so they can be programmed in a variety of languages such as C++ and Python.
Robotic Fast Prototyping Platform

Imperial College London (IMPERIAL) The Hamlyn Centre Robotic Fast Prototyping Platform

The robotic fast prototyping platform is a collection of industrial-size 3D printing devices and a precision workshop dedicated to the design, fabrication and assembly of robots and medical robots. The fabrication centre comprises several 3D printers with large build volumes, including several PolyJet printers, an FDM printer and an SLS metal printer. An electronics lab equipped with a pick-and-place PCB machine allows large-scale fabrication of electronic boards. A state-of-the-art workshop with precision manufacturing tools is also available, including a micro-EDM and a 5-axis milling machine. A clean-room assembly space allows assembly compliant with medical-grade device requirements. The characterisation platform assures the mechanical and electrical conformity of the final prototypes at semi-production scale.
Robotic support/performance measurement

Instituto Italiano di Tecnologia (IIT) iCub Facility Robotic support/performance measurement

Equipment at the iCub Facility allows evaluating the performance of humanoid robots by accurately measuring their movements (10-camera Vicon system) and forces (via a force platform). In addition, for legged humanoids, we have a motorized gantry system that can follow the robot's motion in the testing room while keeping the robot tethered for safety (in case of failure). We also have a “virtualizer” system that, together with a sensorized suit and a VR system, allows experiments in the teleoperation of humanoid robots. For walking teleoperation (or “in place” robot testing) we also have a standard treadmill, which offers the possibility of testing walking humanoids on slopes of different degrees.
Robotic systems for high-fidelity neonatal simulation: training for medical doctors

School of Advanced Studies Sant'Anna (SSSA) The BioRobotics Institute Robotic systems for high-fidelity neonatal simulation: training for medical doctors

Simulation-based training is increasingly emerging in Neonatal Intensive Care Units (NICUs), since high fidelity simulation has been confirmed as an effective instructional strategy to develop clinicians’ technical and non-technical skills needed for patient care. By exploiting a strong collaboration with neonatologists, dedicated devices have been designed and realized both for training in neonatal intubation and mechanical ventilation.
  1. An active, robust and reliable neonatal skill trainer that provides clinicians with real-time information about the execution of the intubation procedure in terms of force peak value, force distribution and timing. The system is based on the integration of different sensing elements into a commercial Laerdal® Neonatal Intubation Trainer.
  2. An innovative neonatal respiratory simulation system was designed for obtaining a high-fidelity representation of physiological pulmonary features and breathing patterns in infants.
The prototype has 5 compartments arranged to reproduce the anatomical distribution. Each compartment is characterized by its own adjustable compliance, and the right and left respiratory branches are subjected to independent, adjustable resistance levels. The simulator is designed to be compatible with the mechanical ventilators commonly used in NICUs, showing active behavior. With the final aim of providing medical doctors with mechatronic and robotic systems able to answer specific and different training needs, custom training kits can be designed and realized as described above, following the specific clinical requirements. In this case, extra time should be devoted to the customization of the training devices.
Roombots

École Polytechnique Fédérale de Lausanne (EPFL) BioRobotics Lab Roombots

Modular robotics for adaptive and self-organizing furniture that moves, self-assembles, and self-reconfigures. Our dream is to provide multi-functional modules that are merged with the furniture and that lay users and engineers can combine for multiple applications.
RX-90

Universidad de Sevilla (USE) Robotics, Vision and Control Group RX-90

Two commercial RX-90 robotic arms. They can cooperate with each other to perform different tasks, and are controlled under Ubuntu/ROS using self-designed software.
SENLY

School of Advanced Studies Sant'Anna (SSSA) The BioRobotics Institute SENLY

SENLY is a mechatronic platform aimed at perturbing steady conditions, such as walking or keeping an upright stance, with slipping-like perturbations. It mainly consists of a double split-belt treadmill whose belts can be independently controlled along the fore-aft and medio-lateral directions, thus emulating multi-directional perturbations. SENLY is also provided with force cells allowing users to evaluate the load distribution between the legs. Perturbations can be delivered either synchronously, that is, simultaneously with the Start command run by the user, or asynchronously, that is, when a particular load distribution on the two platforms is detected.
Serval

École Polytechnique Fédérale de Lausanne (EPFL) BioRobotics Lab Serval

Serval, the latest in a line of robot iterations, is meant to serve as a quadruped for agile movement. We use previously researched mechanisms, control structures and knowledge gained in electronics development to build a combined and hopefully higher-performing robot. Serval consists of an active 3-DOF spine (combining advantages from Lynx and Cheetah-Cub-S), leg units with an adduction/abduction mechanism and a scaled ASLP version of Cheetah-Cub-AL. All motors (Dynamixel MX64R and MX28R) are combined with in-series elastics to protect the rather sensitive gearboxes from harm in different load scenarios. The robot is equipped with only a minimal sensor set, consisting of a low-cost, medium-grade IMU. Collaborations, started near the end of the project, will provide contact and GRF sensing with capacitive sensors, as well as a sensitive skin for physical guidance. Control is realized through inverse kinematics for the legs, (for now) offsets in the spine and an underlying CPG network for pattern generation. Reflexes, as in Oncilla, have not yet been implemented; this is ongoing and future work.
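As a sketch of the CPG-based pattern generation mentioned above, here is a minimal network of coupled phase oscillators in plain Python. The coupling scheme and parameters are illustrative, not Serval's actual controller:

```python
import math

def cpg_step(phases, freq, coupling, dt):
    """One Euler step of a minimal coupled phase-oscillator CPG.
    Each oscillator advances at 2*pi*freq and is pulled toward a
    half-cycle (anti-phase) offset from the others. Parameters are
    illustrative, not taken from the Serval controller."""
    n = len(phases)
    new_phases = []
    for i in range(n):
        dphi = 2 * math.pi * freq
        for j in range(n):
            if j != i:
                dphi += coupling * math.sin(phases[j] - phases[i] - math.pi)
        new_phases.append(phases[i] + dphi * dt)
    return new_phases

phases = [0.0, math.pi]           # two legs starting in anti-phase
for _ in range(100):              # simulate 1 s at 100 Hz
    phases = cpg_step(phases, freq=1.0, coupling=2.0, dt=0.01)
diff = (phases[1] - phases[0]) % (2 * math.pi)
print(diff)  # stays near pi: the anti-phase gait pattern is locked
```

In a real gait controller the oscillator phases would drive the leg trajectories through the inverse kinematics layer described above.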
SILVER (Seabed-Interaction Legged Vehicle for Exploration and Research)

School of Advanced Studies Sant'Anna (SSSA) The BioRobotics Institute SILVER (Seabed-Interaction Legged Vehicle for Exploration and Research)

SILVER is a four-legged underwater vehicle. Each leg is based on a crank-slider mechanism with a serial spring and two rotational joints at the hip (3 DoF per leg). Twelve Dynamixel AX-12a smart servomotors are connected in a daisy chain and powered by a LiPo battery, for an average autonomy of 2 hours. The PVC structure can hold sensors such as cameras, laser scanners, and salinity, pressure and temperature sensors, or it can mount robotic arms or grippers. Matlab controllers, running on an external (out-of-water) laptop, provide SILVER with hopping, walking and crawling gaits. The self-stabilizing gaits, the reduced disturbance induced on sand during locomotion and the peculiar underwater capability grant the robot unique features for the investigation of the seabed or for the study of legged locomotion in reduced-gravity environments.
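The crank-slider kinematics underlying each leg follow the textbook displacement formula, sketched below; the dimensions used are illustrative, not SILVER's actual geometry:

```python
import math

def slider_position(theta, r, l):
    """Slider displacement of a crank-slider mechanism:
    x = r*cos(theta) + sqrt(l^2 - (r*sin(theta))^2),
    with crank angle theta, crank radius r and connecting-rod
    length l (standard textbook kinematics; r and l below are
    example values, not SILVER's geometry)."""
    return r * math.cos(theta) + math.sqrt(l**2 - (r * math.sin(theta))**2)

r, l = 0.03, 0.09   # 3 cm crank, 9 cm rod (illustrative)
print(slider_position(0.0, r, l))       # fully extended: r + l = 0.12 m
print(slider_position(math.pi, r, l))   # fully retracted: l - r = 0.06 m
```

Driving theta with the servomotor sweeps the slider between these two extremes, which is what produces the leg's extension stroke.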
Skysurfer

Universidad de Sevilla (USE) Robotics, Vision and Control Group Skysurfer

First-person-view system designed for training UAV pilots. It offers great manoeuvrability. This is a four-channel model featuring rudder, elevator, ailerons and ESC motor control. The model can be used as a first trainer thanks to its forgiving nature and stability; it is also super durable due to the EPOFLEXY material it is made from.
Smart Experience Laboratory-SmartXP

University of Twente (UT) Department of Robotics Smart Experience Laboratory-SmartXP

The name of the lab refers to Smart Experience Laboratory. It is part of the Creative Technology study track. Throughout the many hands-on projects typical of our programmes at Twente, researchers and students develop technical knowledge and skills. The lab helps researchers and students investigate how technology influences humans, design and creative processes. In our SmartXP lab, you will have room to discover how to turn a question or an opportunity into an appealing prototype.
SoftBank Robotics NAO

University of the West of England (UWE Bristol) Bristol Robotics Laboratory SoftBank Robotics NAO

An autonomous, programmable humanoid robot, featuring an inertial measurement unit with accelerometer and gyrometer, and four ultrasonic sensors. Force-sensing resistors on the legs enable adaptive walking. Microphones, Ethernet and Wi-Fi connectivity, and 2 cameras with face detection. Linux-based operating system, compatible with Microsoft Robotics Studio, Cyberbotics Webots and the Gostai URBI Studio.
Space53

University of Twente (UT) Department of Robotics Space53

Space53 builds the tools and facilities to evolve unmanned-systems technology into concepts that create societal and economic impact, based on Technology Readiness, Economic Readiness (an economic or societal business case) and Societal Readiness (ethics, legal, society). Space53 offers testing and training facilities and the proper legal and procedural conditions for the safe and legal testing of unmanned aerial systems in a fully structured environment (lab), a fully unstructured environment (real life) and all gradual steps in between. Space53 has access to large enclosed structures, a 3 km runway with airspace and several experimental zones in industrial and urban environments. Space53 collaborates closely with S/park, the open innovation centre located at Nouryon's former production site in Deventer, the Netherlands. The S/park site has many interesting assets (tanks, installations, factories, etc.) that can be used for validation and demonstration of new technologies such as robots/drones for inspection, safety solutions, virtual reality, etc.
STIFF-FLOP soft manipulator

School of Advanced Studies Sant'Anna (SSSA) The BioRobotics Institute STIFF-FLOP soft manipulator

Modular soft manipulator with high dexterity and intrinsic safety. The manipulator is based on a modular approach; each module is 60 mm long and 14 mm in diameter. It is able to implement omnidirectional bending and elongation when pressurized (pressure range: 0–1.5 bar). The basic version of the manipulator is composed of two modules, made entirely of elastomers (silicones) with some ABS parts for connection. On the tip, the system can lodge a micro camera, a gripper or an ablator (all tested).
Swimming Pool and Flow tank

École Polytechnique Fédérale de Lausanne (EPFL) BioRobotics Lab Swimming Pool and Flow tank

The facility is a medium-sized pool for testing the performance of small swimming robots. The room is equipped with a tracking system based on cameras mounted above the swimming pool, which provides position information for all bright points in the swimming pool area. Swimming tests against water flow can also be done by mounting the removable components of the flow tank system, shown in the left photo.
SYBOT PK0

Commissariat à l’Energie Atomique (CEA) Interactive Robotics Lab SYBOT PK0

Thanks to its unique mechanical qualities, the SYBOT brings competitiveness and attractiveness to manual operations that cannot be automated. Through agile automation, SYBOT simplifies the response to flexibility and variability requirements for all types of processes, such as grinding. Without special training, the operator integrates and optimizes the use of the cobot, enhancing their know-how in a work environment that does not require any modification. SYBOT interactive cobots improve productivity while reducing operator fatigue.
Talon

Universidad de Sevilla (USE) Robotics, Vision and Control Group Talon

Features a strong foam body internally reinforced with carbon fibre, parachute recovery and belly landing, a 24-megapixel camera with many other sensor options, an automatic lens cover door, easy hand launch, a 5-minute setup time, 2 hours of endurance, easy transport and a link range of over 30 km. It is controlled with a Pixhawk autopilot with an embedded Odroid, and it also has a Raspberry Pi on board. The camera is a downward-facing Xiaomi Yi.
TechMed Simulation and Training Centre

University of Twente (UT) Department of Robotics TechMed Simulation and Training Centre

The TechMed Centre’s simulation and training centre offers state-of-the-art simulation technology for research, development and the education of students and professionals in health care. It is used as a large, high-tech and safe learning space in which the authentic professional environment is simulated. It fits our high demands for the training of Technical Medicine students and other professionals, offering them numerous courses and postgraduate courses, such as Laparoscopic or Endovascular Interventions, Advanced Life Support and Fundamentals of Ventilation. The TechMed Simulation and Training Centre is one of the Centres of Expertise of the University of Twente. It is used as a ‘beta test site’ by the market leaders in simulation technology and, of course, several scientists of the TechMed Centre carry out their research with the help of advanced technology.
Teo robot

Universitat Politècnica de Catalunya (UPC) IRI Teo robot

Teo is a robot designed to perform 2D Simultaneous Localization And Mapping (SLAM) and 3D mapping, in both indoor and rugged outdoor areas. To perform these tasks, the robot is built on a skid-steer Segway RMP400 platform and carries two 2D horizontal laser range sensors, an inertial sensor (IMU), a GNSS receiver and a homemade 3D sensor based on a rotating 2D vertical laser range sensor.
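As a sketch of the dead-reckoning that typically feeds a 2D SLAM front end on a skid-steer base like the RMP400, here is a minimal unicycle-model odometry update in plain Python (the track width and wheel speeds are illustrative, not Teo's parameters):

```python
import math

def odometry_step(x, y, th, v_left, v_right, track, dt):
    """Dead-reckoning update for a skid-steer/differential base
    under the unicycle approximation: average wheel speed gives
    forward velocity, speed difference over the track width gives
    yaw rate. Track width and speeds here are illustrative."""
    v = (v_right + v_left) / 2.0       # forward speed (m/s)
    w = (v_right - v_left) / track     # yaw rate (rad/s)
    return (x + v * math.cos(th) * dt,
            y + v * math.sin(th) * dt,
            th + w * dt)

pose = (0.0, 0.0, 0.0)
for _ in range(100):                   # drive straight for 1 s at 100 Hz
    pose = odometry_step(*pose, v_left=0.5, v_right=0.5, track=0.6, dt=0.01)
print(pose)  # approximately (0.5, 0.0, 0.0)
```

In a SLAM pipeline this odometry prediction would be corrected by matching the laser scans against the map being built.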
The iCub robot

Instituto Italiano di Tecnologia (IIT) iCub Facility The iCub robot

The iCub is a humanoid robot designed to support research in embodied AI. At 104 cm tall, the iCub has the size of a five-year-old child. It can crawl on all fours, walk and sit up to manipulate objects. Its hands have been designed to support sophisticated manipulation skills. The iCub is distributed as open source under the GPL licenses; the entire design is available for download from the project's repositories (http://www.iCub.org). Four robots are available in the iCub Facility at the Istituto Italiano di Tecnologia. The iCub is one of the few platforms in the world with a sensitive full-body skin for handling physical interaction with the environment, including with people.

The Karlsruhe Humanoid Head

Karlsruhe Institute of Technology (KIT) High Performance Humanoid Technologies Lab (H²T) The Karlsruhe Humanoid Head

The Karlsruhe Humanoid Head is used in both ARMAR-IIIa and ARMAR-IIIb, and as a stand-alone robot head for studying various visual perception tasks in the context of object recognition and human-robot interaction. The active stereo head has a total of 7 DoF (4 in the neck and 3 in the eyes), six microphones and a 6D inertial sensor. Each eye is equipped with two digital color cameras, one with a wide-angle lens for peripheral vision and one with a narrow-angle lens for foveal vision, allowing simple visuo-motor behaviors. The software was originally written in MCA, but the head can also be controlled via the robot development environment ArmarX (https://armarx.humanoids.kit.edu).
The R1 robot

Instituto Italiano di Tecnologia (IIT) iCub Facility The R1 robot

R1 is a service robot designed at IIT building on the experience and know-how of the iCub (with which it shares the software API). R1 is 1.3 m tall; the torso is equipped with a mechanism that allows varying its height from a minimum of 1.15 m to a maximum of 1.45 m. R1 has an especially small footprint so it can move in cluttered office/home/mall environments. R1 is safe for interaction: the arm joints contain a simple torque overload protection, which behaves as a clutch mechanism providing intrinsic safety. In addition, R1 can be torque controlled (active torque control) via readouts from two 6-axis force-torque sensors and tactile pressure sensors in the hands and forearms. R1 has two eight-degree-of-freedom (DoF) arms. The target payload is 1.5 kg in the fully stretched configuration, reaching at 0.7 m distance from the robot's body. The robot has two four-DoF hands with two degrees of actuation (DoA). The hands are equipped with distributed pressure sensors, joint angle encoders and series elastic actuators to allow monitoring grip forces. The robot has a two-DoF head equipped with sensors and devices for HRI. It mounts an Xtion Pro Live RGBD sensor for depth sensing; higher-performance depth sensors are possible. The head also mounts a Leopard Imaging OV580 twin camera module, which allows for multiple configurable video resolutions and sampling rates, and integrates eight microphones, a loudspeaker and a special, custom-designed, programmable RGB LED matrix.
Tibi and Dabo robots

Universitat Politècnica de Catalunya (UPC) IRI Tibi and Dabo robots

Tibi and Dabo are two mobile urban service robots designed to perform navigation and human-robot interaction tasks. Navigation is based on the differential Segway RMP200 platform, able to work in balancing mode, which is useful for overcoming low-slope ramps. Two 2D horizontal laser range sensors allow obstacle detection and localization. Human-robot interaction is achieved with two 2-degree-of-freedom (DoF) arms, a 3-DoF head with some facial expressions, a stereo camera, text-to-speech software and a touch screen. They can be used to provide information, guiding and steward services to people in urban spaces, either alone or in collaboration.
TX90 6-axis robot

Commissariat à l’Energie Atomique (CEA) Interactive Robotics Lab TX90 6-axis robot

The TX90 6-axis robot is an articulated arm with 6 axes for increased flexibility. The spherical work envelope allows maximum utilization of cell workspace. It can also be mounted on the floor, wall or ceiling. The fully enclosed structure (IP65) makes the robotic arm ideal for applications in harsh environments. The TX90 6-axis robot has a maximum payload of 20 kg and a 1000 mm reach.
Universal Robots UR5

University of the West of England (UWE Bristol) Robotics Innovation Facility Universal Robots UR5

Lightweight, flexible collaborative robot. The robot includes a freedrive mode, force sensing for collision detection and support for several widespread end-of-arm tools. Flexible and safe to work with, without the need for a cage or additional safety systems. High repeatability and speed, quick to program. The tool flange provides a 12 V/24 V, 600 mA power supply, with an Ethernet communication protocol; alternative solutions (serial or custom) are available for different end-effectors. 2/2 digital I/Os and 2 analog inputs.
UR10 Universal Robot

Commissariat à l’Energie Atomique (CEA) Interactive Robotics Lab UR10 Universal Robot

Collaborative UR10 robots are designed to mimic the range of motion of a human arm, and incidentally all it takes to program and reprogram the robotic arms is a human arm. It doesn't get any easier, and perhaps most importantly it eliminates the need for expensive third-party programmers every time you want to assign the robot arm to a different task. The intuitive software allows even the most inexperienced user to quickly grasp the basics of programming and set waypoints by simply moving the robot into position. Programs for recurring tasks can be stored in the UR robot arm and re-used. Fast set-up, easy programming, collaborative and safe, flexible deployment.
Vicon Indoor Testbed

Universidad de Sevilla (USE) Robotics, Vision and Control Group Vicon Indoor Testbed

Indoor testbed for the assessment and validation of air traffic automation techniques and multi-vehicle systems (both coordination and cooperation). The testbed is based on an indoor positioning system that uses 20 VICON cameras, which can calculate the position and attitude of any moving object within the testbed volume (15x15x5 m) in real time, with an update rate of up to 500 Hz. CATEC has 10 light unmanned quadrotors that can be used to emulate the trajectory of any type of aircraft; these rotorcraft can carry up to 500 g of payload. In addition, CATEC has 4 coaxial quadrotors with a significantly higher payload capacity of up to 2 kg, which are used to test aerial manipulation techniques. Finally, the testbed is integrated with a software development environment which allows the simulation of the algorithms before they are tested within the testbed.
YuMi ABB

Commissariat à l’Energie Atomique (CEA) Interactive Robotics Lab YuMi ABB

At only 38 kg and approximately the size of a small human, YuMi® is quickly and easily installed on the production line to work hand-in-hand with a human colleague. Lead-through programming means YuMi® can be taught a process by being physically guided through it, eliminating the need for complex, time-consuming code-based instruction.
• Operates equally effectively side-by-side or face-to-face with human coworkers.
• Servo grippers (the “hands”) include options for built-in cameras.
• Real-time algorithms set a collision-free path for each arm, customized for the required task.
• Padding protects coworkers in high-risk areas by absorbing force if contact is made.
• If the robot encounters an unexpected object, even a slight contact with a coworker, it can pause its motion within milliseconds, and the motion can be restarted as easily as pressing play on a remote control.
• Pinch points have been eliminated or minimized to an acceptable level between moving parts, and between moving and stationary parts.
“Birdly” flight simulator with haptic feedback

École Polytechnique Fédérale de Lausanne (EPFL) Laboratory of Intelligent Systems “Birdly” flight simulator with haptic feedback

Visually immersed through a head-mounted display, you are embedded in a high-resolution virtual landscape. You command your flight with your arms and hands, which map directly to the wings (flapping) and the primary feathers of the bird (navigation). This input is reflected in the flight model of the bird and returned as physical feedback by the simulator through pitch, roll and heave movements. To evoke an intense and immersive flying adventure, SOMNIACS relies on precise sensory-motor coupling and strong visual impact. Additionally, Birdly® includes sonic and wind feedback: according to your speed, the simulator regulates the headwind from a fan mounted in front of you.