Laboratory of Intelligent Systems | TERRINet


Route Cantonale,
1015 Lausanne, Switzerland

Laboratory of Intelligent Systems

Scientific Responsible
Auke Ijspeert

The structure

The Laboratory of Intelligent Systems takes inspiration from nature to design artificial intelligence and robots that are soft, fly, or evolve their own behaviors. The main research areas are the following:

Aerial Robotics

We design flying robots, or drones, with rich sensory and behavioural abilities that can change morphology to smoothly and safely operate in different environments. These drones are conceived to work cooperatively and with humans to power civil applications in transportation, aerial mapping, agriculture, search-and-rescue, and augmented virtual reality.

Evolutionary Robotics

Evolutionary robotics takes inspiration from natural evolution to automatically design robot bodies and brains (neural networks) and to understand evolution of living systems. Topics of interest include open-ended evolution, evolution of social cooperation and competition, evolution of communication, evolution of multi-cellular robots, evolvable hardware.
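The evolutionary loop described above — evaluate a population, select the fittest, mutate, repeat — can be sketched in a few lines of Python. The fitness function here is a toy stand-in for a physics-based robot evaluation; all names and numbers are illustrative assumptions:

```python
import random

def evaluate(genome):
    # Toy fitness: how close the genome is to a fixed target vector.
    # In evolutionary robotics this would be replaced by simulating
    # the robot's body and brain and scoring its behavior.
    target = [0.5, -0.2, 0.9]
    return -sum((g - t) ** 2 for g, t in zip(genome, target))

def evolve(pop_size=20, genome_len=3, generations=50, mutation_sigma=0.1):
    rng = random.Random(42)
    population = [[rng.uniform(-1, 1) for _ in range(genome_len)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=evaluate, reverse=True)
        parents = population[: pop_size // 2]   # truncation selection
        # Each parent produces one mutated child (Gaussian mutation);
        # parents survive unchanged (elitism).
        children = [[g + rng.gauss(0, mutation_sigma) for g in p]
                    for p in parents]
        population = parents + children
    return max(population, key=evaluate)

best = evolve()
```

Real systems such as RoboGen replace the toy fitness with a physics simulation and evolve both morphology and neural-network controllers.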

Soft Robotics

Soft robots can continuously change their shape, withstand strong mechanical forces, and passively adapt to their environment. The “softness” makes these robots safer and potentially more robust and versatile than their counterparts made of bolts and metal. Examples include soft grippers that manipulate complex shapes without complex software and insect-inspired compound eyes that conform to curved surfaces to provide large fields of views. Soft robotics technologies will find applications in mobile robotics, in wearable robotics, and in many other applications such as manufacturing, rehabilitation, and portable intelligent devices.

Wearable Robotics

We investigate and develop novel soft wearable robots, or exosuits, for natural interaction between humans and robots and for novel forms of augmented reality.

Traditional human-robot interaction often requires funnelling rich sensory-motor information through simplified computer interfaces, such as visual displays and joysticks, that demand cognitive effort. Instead, we need novel embodied interactions where humans and machines feel each other throughout the extension of their bodies and their rich sensory channels. We want to enable human experience of non-anthropomorphic morphologies and behaviors, such as flying, and novel forms of augmented reality through wearable robotics technologies.

Available platforms

AR drone

The Parrot AR.Drone 2.0 Elite Edition lets you see the world from above and instantly share your photos and videos on social networks. It manoeuvres intuitively with a smartphone or tablet and offers exceptional sensations right from take-off. A soft protective frame allows use indoors and in crowded environments.

Technical Specifications

Battery life: 12 min
Battery capacity: 1000 mAh
Range: 50 m
Camera: HD 720p, 30 fps
Wide-angle lens: 92° diagonal
Weight with internal frame: 380 g
Weight with external frame: 420 g

Key features

  • Flight control with a smartphone or tablet
  • HD video recording
  • 9-axis IMU: gyroscope, accelerometer, magnetometer
  • Pressure sensor
  • Vertical camera: QVGA, 60 fps, to measure ground speed
  • Removable protective frame
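The downward-facing camera measures ground speed via optical flow. A minimal sketch of the underlying pinhole-camera geometry (the numbers are illustrative, not Parrot's firmware values):

```python
def ground_speed(flow_px_per_s: float, altitude_m: float,
                 focal_length_px: float) -> float:
    """Convert image-plane optical flow to metric ground speed.

    Pinhole-camera relation: a point on flat ground passing under the
    drone at v m/s, seen from altitude h with focal length f (in
    pixels), produces optical flow of f * v / h pixels per second.
    Inverting gives v = flow * h / f.
    """
    return flow_px_per_s * altitude_m / focal_length_px

# Example: 200 px/s of flow at 2 m altitude with f = 400 px
speed = ground_speed(200.0, 2.0, 400.0)  # 1.0 m/s
```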

Possible applications

  • Collaborative navigation of aerial and land robots
  • Search and rescue missions
  • Education

“Birdly” flight simulator with haptic feedback

Visually immersed through a head-mounted display, you are embedded in a high-resolution virtual landscape.
You command your flight with your arms and hands, which correlate directly to the wings (flapping) and the primary feathers of the bird (navigation). This input feeds the bird's flight model and is returned as physical feedback by the simulator through pitch, roll and heave movements.
To evoke an intense and immersive flying experience, SOMNIACS relies on precise sensory-motor coupling and strong visual impact. Birdly® also includes sonic and wind feedback: the simulator regulates the headwind from a fan mounted in front of you according to the flight speed.

Technical Specifications

Degrees of freedom

Pitch: −30° to +30°
Roll: −30° to +30°
Heave: −15 cm to +15 cm

Max user weight

150 kg

Machine weight

132 kg
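The simulator maps arm and hand input onto the platform's limited motion ranges (±30° in pitch and roll, ±15 cm in heave). A minimal sketch of such a mapping — the normalization and gains are illustrative assumptions, not SOMNIACS' actual flight model:

```python
def clamp(x: float, lo: float, hi: float) -> float:
    return max(lo, min(hi, x))

def platform_pose(pitch_cmd: float, roll_cmd: float, heave_cmd: float):
    """Map normalized commands in [-1, 1] (derived from the user's arm
    posture) onto the platform's actuation ranges:
    pitch and roll in [-30, +30] degrees, heave in [-15, +15] cm.
    Out-of-range commands are saturated at the mechanical limits."""
    return (clamp(pitch_cmd, -1.0, 1.0) * 30.0,
            clamp(roll_cmd, -1.0, 1.0) * 30.0,
            clamp(heave_cmd, -1.0, 1.0) * 15.0)

pose = platform_pose(0.5, -1.2, 0.0)  # (15.0, -30.0, 0.0)
```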

Key features

  • Integrates all technical components (simulation actuators and sensors, high performance rendering computer etc.) in one robust and well-designed metal body
  • Haptic feedback
  • Top of the line head mounted display (HMD) – HTC Vive
  • High quality headphones
  • Wired remote control with mountable stand
  • Mobile remote control app (Android)

Possible applications

  • Study of the human-robot interaction
  • Virtual reality experiments
  • Development of new immersive drone control strategies
  • Study of the alteration of human perception with realistic flying experience

eBee drone

senseFly’s eBee is a fully autonomous and easy-to-use mapping drone. Use it to capture high-resolution aerial photos that you can transform into accurate orthomosaics (maps) and 3D models. The eBee package contains all you need to start mapping: an RGB camera, batteries, a radio modem and the eMotion software.
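The resolution of a mapping survey is governed by the ground sampling distance (GSD), the ground footprint of a single pixel. This is the standard photogrammetric formula; the sensor and flight figures below are illustrative assumptions, not senseFly's published values:

```python
def gsd_cm_per_px(sensor_width_mm: float, focal_length_mm: float,
                  altitude_m: float, image_width_px: int) -> float:
    """Ground sampling distance in cm/pixel.

    By similar triangles in the pinhole model, one pixel of width
    (sensor_width / image_width) projects onto a ground patch of
    that width scaled by altitude / focal_length.
    """
    return (sensor_width_mm * altitude_m * 100.0) / (focal_length_mm * image_width_px)

# Illustrative 1-inch-class sensor: 13.2 mm wide, 10.6 mm lens,
# 5472 px image width, flown at 120 m altitude.
gsd = gsd_cm_per_px(13.2, 10.6, 120.0, 5472)  # roughly 2.7 cm/px
```

Flying lower or using a longer lens shrinks the GSD and sharpens the resulting orthomosaic, at the cost of covering less area per flight.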

Technical Specifications

Weight (incl. supplied camera & battery): approx. 0.69 kg
Radio link range: 3 km nominal (up to 8 km)
Camera (supplied): senseFly S.O.D.A. (1″ 20 Mpix sensor)
Cruise speed: 40–90 km/h (11–25 m/s)
Wind resistance: up to 45 km/h (12 m/s)
Max. flight range: 33 km
Landing: automatic, linear, with ~5 m accuracy
Max. flight time: 50 minutes

Key features

  • Ultra-portable fixed-wing with a carry case
  • Hand launch
  • Automatic piloting and landing

Possible applications

  • Collaborative navigation of aerial and land robots
  • Search and rescue missions
  • Precise mapping and 3D model creation
  • Agriculture

Motion capture arena

This facility is a large room (~10 × 10 × 5.5 m) equipped with an array of OptiTrack motion-capture cameras and accompanying software. The system tracks up to 50 rigid objects with a position error below 1 mm at frame rates of up to 200 fps. The cameras illuminate the scene with IR light, so tracked objects must carry special passive markers. Most of the wall surface is covered with a protective net. Initially designed for experiments with flying robots, the arena can be used in many other robotic applications. An AR drone equipped with markers and an open-source flight controller is available on site for experiments.
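Tracked poses can be streamed over the network into a real-time control loop. A minimal sketch of consuming such a stream — the packet layout below is a hypothetical example, not OptiTrack's actual NatNet wire format, which the vendor SDK parses for you:

```python
import struct

# Hypothetical pose packet: rigid-body id (uint32), position x,y,z
# (3 x float32), orientation quaternion x,y,z,w (4 x float32);
# 32 bytes total, little-endian.
POSE_FMT = "<I3f4f"

def parse_pose(packet: bytes) -> dict:
    """Unpack one rigid-body pose from a raw UDP payload."""
    body_id, x, y, z, qx, qy, qz, qw = struct.unpack(POSE_FMT, packet)
    return {"id": body_id, "pos": (x, y, z), "quat": (qx, qy, qz, qw)}

# Round-trip example standing in for a received datagram:
pkt = struct.pack(POSE_FMT, 7, 1.0, 2.0, 0.5, 0.0, 0.0, 0.0, 1.0)
pose = parse_pose(pkt)
```

In practice the payload would arrive on a UDP socket at the arena's frame rate (up to 200 packets per second per body), and the sub-5 ms latency makes the stream usable directly as position feedback for drone controllers.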

Technical Specifications

Usable volume: 10 × 10 × 5.5 m
Number of cameras: 26, on 2 levels
Frame rate: 200 fps
Position error: < 1 mm
Number of tracked objects: up to 50
Data streaming protocol


Key features

  • High position accuracy (error <1mm)
  • Fast response (latency < 5 ms)
  • Advanced software
  • Recording and/or streaming data over the network
  • Suitable for real-time applications

Possible applications

  • Flying robots (drones) real-time position and orientation estimation
  • Virtual and augmented reality experiments
  • Human or robotic movements recording and analysis


RoboGen

RoboGen™ is an open-source platform for the co-evolution of robot bodies and brains. It has been designed with a primary focus on evolving robots that can be easily manufactured via 3D printing and a small set of low-cost, off-the-shelf electronic components. It features an evolution engine and a physics simulation engine. It also includes utilities for generating design files of body components for use with a 3D printer, and for compiling neural-network controllers to run on an Arduino microcontroller board.
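The compiled controllers are small neural networks mapping sensor readings to actuator commands. A minimal sketch of evaluating such a feedforward network — the layer size, weights, and sensor assignment are illustrative assumptions, not RoboGen's controller format:

```python
import math

def forward(weights, biases, inputs):
    """One fully connected layer with tanh activation:
    output_j = tanh(sum_i weights[j][i] * inputs[i] + biases[j]).
    tanh keeps outputs in [-1, 1], convenient for servo commands."""
    return [math.tanh(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

# Illustrative controller: 2 sensor inputs (light, distance)
# driving 2 servo outputs.
weights = [[0.8, -0.5],
           [0.3, 0.9]]
biases = [0.0, -0.1]
servo_cmds = forward(weights, biases, [1.0, 0.2])
```

In the RoboGen workflow, an evolved network of this kind is translated into code for the Arduino-compatible main board, where the same forward pass runs against the robot's physical sensors.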

Technical Specifications

Main board: NanoWii (ATmega32U4)
Battery: 2-cell LiPo
Sensors: light, distance

Key features

  • Low-cost electronics and mechanical parts
  • Easy to assemble and repair
  • Simulation software – try your robot without hardware!

Possible applications

  • Evolutionary robotics
  • Fast prototyping of robots
  • Education

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 730994


Copyright by TERRINet. All rights reserved. – Designed by RGR