Institute of Anthropomatics and Robotics – H2T | TERRINet

https://www.terrinet.eu/wp-content/uploads/2018/04/image19.png
https://www.terrinet.eu/wp-content/uploads/2018/05/H2T_color_print.jpg

Address:
Adenauerring 2
76131 Karlsruhe, Germany

Website
Institute of Anthropomatics and Robotics – H2T

Scientific Responsible
Tamim Asfour

The structure

The infrastructure offered by the Chair of High Performance Humanoid Technologies consists of several state-of-the-art humanoid robot systems and humanoid components used in research projects and education. The first major part of the infrastructure is the KIT robot kitchen environment, in which the robots ARMAR-IIIa and ARMAR-IIIb operate. This installation provides unique opportunities to conduct research on grasping and dexterous manipulation in human-centered environments, visuo-haptic object exploration, and learning from human observation and from experience. The second major part of the infrastructure is the KIT Motion Capture Studio, which provides a unique environment for capturing and analyzing human motion and thus supports research on learning from human observation and robot programming by demonstration.

Users of the infrastructure have access to the expertise of the scientific staff not only of the Chair of High Performance Humanoid Technologies but also of the wider Institute of Anthropomatics and Robotics and of the KIT Center Information – Systems – Technologies (KCIST), which bundles interdisciplinary competences across KIT, in particular from informatics, economics, electrical and mechanical engineering, information technology, and social science. The competences at KCIST range from machine intelligence, robotics, human-machine interfaces, algorithmics, software engineering, cyber security, cloud computing and scientific computing, and secure communication systems to big data technologies.

Available platforms
https://www.terrinet.eu/wp-content/uploads/2018/05/armar3.jpeg

Humanoid Robots: ARMAR-III

The humanoid robot ARMAR-III has been designed to help in the kitchen, e.g. by bringing objects from the fridge and filling the dishwasher. It has a total of 43 DoF. A mobile platform equipped with three laser scanners allows the robot to navigate the kitchen environment. Each arm has 7 DoF and carries an 8-DoF five-fingered hand. A force-torque sensor is available in each wrist. ARMAR-III uses the Karlsruhe Humanoid Head with 7 DoF. For vision, four digital cameras are integrated into the head; each eye has a wide-angle and a narrow-angle camera for peripheral and foveal vision, respectively. Four PCs inside the mobile base run Linux as their operating system. The robot software was originally written in MCA but the robot can also be controlled via the newer ArmarX framework (https://armarx.humanoids.kit.edu). High-level functionalities such as object localization, navigation, grasping and planning are already implemented and available.

Key features

  • Integrated humanoid robot system
  • Advanced grasping and manipulation
  • Force-control for contact detection
  • 5-finger pneumatically actuated hands
  • Active head with foveated vision
  • Control architecture with memory and attention system

Possible applications

  • Manufacturing tasks involving carrying heavy loads

Technical specifications in brief

Total Weight: 140 kg
Motors: DC, Harmonic Drives
Color cameras: Point Grey Research Dragonfly (RGB, 640×480 @ 30 FPS, stereo calibrated)
Depth cameras: Asus Xtion Pro (RGB-D, 640×480 @ 30 FPS)
Laser range-finders: Three Hokuyo scanners of type URG-X003S in the mobile base
Force-torque sensors: Two 6D force-torque sensors from ATI Industrial Automation at the wrists
Computers: 4 PCs inside (1 PC for external control)
OS: Ubuntu Linux 14.04
Robotic Framework: ArmarX (previous framework: MCA)
Network: Gigabit Ethernet (WLAN for plugless operation)
Bus system: EtherCAT (100 Mbit/s)
Software: ArmarX (https://www.armarx.humanoids.kit.edu)


https://www.terrinet.eu/wp-content/uploads/2018/05/armar4.jpeg

Humanoid Robot ARMAR-4

ARMAR-4 is a full-body humanoid robot with torque control capabilities in its arm, leg and torso joints. It has 63 active degrees of freedom with 63 actuators overall, including feet, neck, hands and eyes. It features more than 200 individual sensors for position, temperature and torque measurement, 76 microcontrollers for low-level data processing and 3 on-board PCs for perception, high-level control and real-time functionalities. The robot stands 170 cm tall and weighs 70 kg.
Each leg has six degrees of freedom, mimicking the flexibility and range of motion of the human leg. For maximum range of motion and dexterity, each arm has eight degrees of freedom. The kinematic similarity to the human body facilitates the mapping and execution of human motions on the robot.
The four end-effectors (hands and feet) are equipped with sensitive 6D Force/Torque sensors to accurately capture physical interaction forces and moments between the robot and its environment. The robot’s two eyes are each equipped with two cameras for wide and narrow angle vision.
The three control PCs (two in the torso, one in the head) run Ubuntu 14.04 and control the robot via the ArmarX software framework (https://armarx.humanoids.kit.edu), wherein high-level functionalities like object localization, grasping and planning are already implemented and available.

Key features

  • Bipedal humanoid robot system
  • Position, velocity, current and torque control on joint level
  • 6DoF legs and 8DoF arms
  • 9 DoF active head with foveated vision
  • 6D Force/Torque sensors in the wrist and ankle joints

Possible applications

  • Whole-body balancing
  • Dynamic whole-body state estimation
  • Transfer of human whole-body motion to humanoid motion
  • Multi-contact whole-body motion planning and execution
  • Whole-body torque control

Technical specifications in brief

Power supply: 48 V DC
Total Weight: 70 kg
Motors: Brushless DC
Color Cameras: Point Grey Research Dragonfly (RGB, 640×480 @ 30 FPS)
Joint control modes: Position, velocity, current and torque
Force-torque sensors: Four 6D force-torque sensors (two in the wrists, two in the ankle joints)
OS: Ubuntu Linux 14.04
Robotic Framework: ArmarX (previous framework: MCA)
Network: Gigabit Ethernet
Bus system: CAN-Bus (CANopen)
Software: ArmarX (https://www.armarx.humanoids.kit.edu)


https://www.terrinet.eu/wp-content/uploads/2018/05/armar6.png

Humanoid Robots: ARMAR-6

ARMAR-6 is a collaborative humanoid robot assistant for industrial environments. Designed to recognize the need for help and to allow for easy and safe human-robot interaction, the robot’s comprehensive sensor setup includes various camera systems, torque sensors and systems for speech recognition. The dual-arm system combines human-like kinematics with a payload of 10 kg per arm, which allows for dexterous and high-performance dual-arm manipulation. In combination with its telescopic torso joint and a pair of underactuated five-finger hands, ARMAR-6 is able to grasp objects on the floor as well as to work at heights of up to 240 cm. The mobile platform houses holonomic wheels, battery packs and four high-end PCs for autonomous on-board data processing.
The software architecture is implemented in ArmarX (https://armarx.humanoids.kit.edu). High-level functionalities such as object localization, navigation, grasping and planning are already implemented and available.

Key features

  • Dexterous arm system with 2×8 DoF for dual arm manipulation
  • Underactuated five-finger hands
  • Limitless rotation in shoulder, upper arm and forearm
  • Comprehensive sensor setup, including:
    – Highly precise absolute position sensors, torque sensors, temperature sensors and IMU in each arm joint
    – 6D-force-torque sensors in the wrist
    – Laser scanners for navigation
    – Sensor head with two stereo vision systems (Roboception rc_visard 160 & 2 Flea 3.0) and a depth camera (Microsoft PrimeSense RGB-D)
  • Various control modes enable the execution of precise and torque/force-controlled motions
  • Holonomic movement of the mobile platform
  • Control architecture with memory and attention system

Possible applications

  • Dual arm manipulation
  • Force and torque based control and interaction
  • Gravity compensated torque control
  • Task space impedance control
  • Physical human-robot interaction
  • Vision-based grasping and deep learning for grasping
  • Imitation Learning, Programming by demonstration
  • Semantic scene understanding and affordance extraction
  • Human-robot interaction
  • Natural speech dialog
  • Cognitive robotics: learning multimodal representations, affordances
  • AI: symbolic planning and execution monitoring
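Task-space impedance control, listed among the applications above, renders the end-effector as a virtual spring-damper around a desired pose. A minimal sketch of the translational part of such a law is shown below; the gain values are purely illustrative and not ARMAR-6's actual controller parameters:

```python
import numpy as np

def impedance_force(x, xd, v, vd, K, D):
    """Virtual spring-damper: F = K (x_d - x) + D (v_d - v)."""
    return K @ (xd - x) + D @ (vd - v)

# Illustrative 3-D translational case (gains are made-up example values)
K = np.diag([500.0, 500.0, 500.0])   # stiffness [N/m]
D = np.diag([40.0, 40.0, 40.0])      # damping [N s/m]

x  = np.array([0.0, 0.0, 0.95])      # current end-effector position [m]
xd = np.array([0.0, 0.0, 1.00])      # desired position [m]
v  = np.zeros(3)                     # current velocity [m/s]
vd = np.zeros(3)                     # desired velocity [m/s]

F = impedance_force(x, xd, v, vd, K, D)   # commanded translational wrench [N]
```

The resulting force would then be mapped to joint torques via the arm Jacobian; with zero pose error and zero velocity error, the commanded wrench vanishes, which is what makes the behaviour compliant under physical human-robot interaction.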

Technical specifications in brief

DoF: 27
Total Height: 192 cm
Arm span width: 310 cm
Arm range: 130 cm
Working height: 0 – 240 cm
Payload (single arm): 10 kg (long range), 14 kg (mid range)
Total Weight: 160 kg (without battery packs)
Platform speed: 1 m/s
Computers: 4 high-end PCs, 1 GPU
Robotic Framework: ArmarX (previous framework: MCA)
Bus system: CAN-Bus (CANopen)
Software: ArmarX (https://www.armarx.humanoids.kit.edu)


https://www.terrinet.eu/wp-content/uploads/2018/05/head.png

The Karlsruhe Humanoid Head

The Karlsruhe Humanoid Head is used on ARMAR-IIIa and ARMAR-IIIb and is also available as a stand-alone robot head for studying various visual perception tasks in the context of object recognition and human-robot interaction.
The active stereo head has a total of 7 DoF (4 in the neck and 3 in the eyes), six microphones and a 6D inertial sensor. Each eye is equipped with two digital color cameras, one with a wide-angle lens for peripheral vision and one with a narrow-angle lens for foveal vision, enabling simple visuo-motor behaviors. The software was originally written in MCA, but the head can also be controlled via the robot development environment ArmarX (https://armarx.humanoids.kit.edu).
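A simple visuo-motor behavior of the kind mentioned above is to centre a detected target in the image by driving the eye pan/tilt joints with a proportional law. The sketch below illustrates the idea only; the focal length, gain and sign convention are assumptions, not the head's real calibration:

```python
import numpy as np

def gaze_velocity(target_px, image_size=(640, 480), focal_px=500.0, gain=2.0):
    """Map a target's pixel offset from the image centre to pan/tilt
    joint velocities [rad/s] via a proportional control law."""
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    err_x = target_px[0] - cx          # positive: target right of centre
    err_y = target_px[1] - cy          # positive: target below centre
    # Small-angle approximation: angular error ≈ pixel error / focal length
    pan  = -gain * err_x / focal_px    # rotate toward the target (sign is illustrative)
    tilt = -gain * err_y / focal_px
    return np.array([pan, tilt])

v = gaze_velocity((420, 240))          # target 100 px right of the image centre
```

In practice the wide-angle camera would provide the target detection and the narrow-angle camera would verify that the target ends up in the fovea once the error has been driven to zero.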

Key features

  • Seven degrees of freedom (DoF)
  • Foveated stereo camera system
  • Inertial system
  • Six channel microphone system

Possible applications

  • Active vision
  • Active visual search
  • Gaze stabilization
  • Multimodal attention (audio, vision)
  • Human-robot interaction and communication

Technical specifications in brief

DoF: 7
Actuators: DC motors and Harmonic Drives
Cameras: Four Point Grey Dragonfly2 color cameras (640×480 @ 60 Hz)
Inertial system: Xsens MTIx gyroscope-based orientation sensor
Auditory system: Six microphones (SONY ECMC115.CE7)
Software: MCA or ArmarX (https://www.armarx.humanoids.kit.edu)


https://www.terrinet.eu/wp-content/uploads/2018/05/hma-2.png

Human Motion Analysis with Vicon

The human motion capture studio provides a unique facility for capturing and analyzing human motion as well as for mapping it to humanoid robots. The studio is equipped with 14 Vicon MX cameras (1 megapixel resolution, 250 fps), a microphone array and several Kinect cameras. Several tools exist for post-processing recorded data: normalization, synchronization of different sensor modalities, and visualization. In addition, a reference model of the human body (the Master Motor Map, MMM) and a standardized marker set allow unified representations of captured human motion and the transfer of subject-specific motions to robots with different embodiments. The motion data in the database covers human as well as object motions. The raw motion data entries are enriched with additional descriptions and labels: besides the captured motion in its raw format (e.g., marker motions), information about the subject's anthropometric measurements and the setup of the scene, including environmental elements and objects, is provided. The motions are annotated with motion description tags that allow efficient search for certain motion types through structured queries.
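As an illustration of the normalization step mentioned above, time-normalizing recorded marker trajectories to a common phase axis makes recordings of different length and frame rate directly comparable. This is a generic sketch using linear interpolation, not the MMM tool chain itself:

```python
import numpy as np

def time_normalize(trajectory, n_samples=100):
    """Resample a (T, D) marker trajectory to n_samples frames on a
    uniform [0, 1] phase axis via per-dimension linear interpolation."""
    T, D = trajectory.shape
    src = np.linspace(0.0, 1.0, T)          # original phase axis
    dst = np.linspace(0.0, 1.0, n_samples)  # target phase axis
    return np.column_stack(
        [np.interp(dst, src, trajectory[:, d]) for d in range(D)]
    )

# Two recordings of different length become directly comparable
a = time_normalize(np.random.rand(250, 3))  # e.g. 1 s captured at 250 fps
b = time_normalize(np.random.rand(330, 3))  # e.g. 1 s captured at 330 fps
```

After this step, frame i of one recording corresponds to the same movement phase as frame i of any other, which is the usual precondition for averaging motions or learning motion primitives from multiple demonstrations.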

Key features

  • 14 VICON cameras
  • Multi-Modal recordings: different sensor combinations possible (force, IMU, audio,…)
  • Large-scale human motion database https://motion-database.humanoids.kit.edu/
  • Master Motor Map: reference representation of the human body

Possible applications

  • Human Motion Analysis
  • Learning whole-body motion primitives
  • Whole-body motion segmentation
  • Linking motion to natural language
  • Data-driven methods for motion and action learning
  • Large-scale whole-body human motion database

Technical specifications in brief

Number of cameras: 14 Vicon cameras (10 MTX T10, 4 Vero)
Capture volume: 6 × 4 m
Update rate: 330 FPS
Software: Nexus 2.7 and MMM (https://mmm.humanoids.kit.edu)


The whole-body human motion database is publicly available.

https://www.terrinet.eu/wp-content/uploads/2018/05/exo.png

KIT-EXO-1

The exoskeleton KIT-EXO-1 was developed with the aim of augmenting human capabilities or for use in rehabilitation applications. It has two active DoF at the knee and ankle joints to support flexion/extension movements. The linear actuators consist of brushless DC motors coupled to planetary roller screws and an optional serial spring. They are equipped with absolute and relative position encoders as well as a force sensor, and can be controlled via the ArmarX software framework (https://armarx.humanoids.kit.edu). Eight additional force sensors distributed over the exoskeleton measure interaction forces between the user and the exoskeleton at the thigh, shank and foot, and can be used for research on intuitive exoskeleton control or to assess the kinematic compatibility of new joint mechanisms.
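One simple way to use the interaction-force sensors for intuitive control is an admittance law: the joint yields in the direction the user pushes, ignores small forces, and respects the actuator speed limit. The gain and dead-band values below are illustrative assumptions, not KIT-EXO-1 parameters; only the 100 mm/s speed limit comes from the specification:

```python
def admittance_velocity(f_interaction, gain=0.002, deadband=2.0, v_max=0.1):
    """Map a measured interaction force [N] to a linear-actuator velocity
    command [m/s]: yield in the push direction, suppress sensor noise
    below the dead-band, and saturate at the actuator speed limit."""
    if abs(f_interaction) < deadband:      # ignore noise / light contact
        return 0.0
    v = gain * f_interaction               # yield proportionally to the push
    return max(-v_max, min(v_max, v))      # clamp to 100 mm/s actuator limit
```

A control loop would evaluate this for each active joint at a fixed rate, combining the eight interaction-force readings into a per-joint force estimate before applying the law.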

Key features

  • Lower limb exoskeleton with 2 DoF
  • Linear actuators with force and position sensing
  • Eight force sensors measure interaction forces between human and exoskeleton
  • Position, velocity, current and force control on joint level
  • CANopen and RS-232 (over USB) communication

Possible applications

  • Exoskeleton control based on actuator and interaction forces
  • Motion classification or prediction with a multi-modal sensor setup
  • Tests of new joint mechanisms for knee and ankle joint
  • Assessment of the kinematic compatibility of new joint mechanisms

Technical specifications in brief

DoA: 2
Interface: CANopen / RS-232 / USB
Power supply: 48 V, 10 A peak
Total Weight: 4 kg
Actuator force: 3000 N
Actuator speed: 100 mm/s
OS: Ubuntu Linux 16.04
Software: ArmarX (https://www.armarx.humanoids.kit.edu)


https://www.terrinet.eu/wp-content/uploads/2018/05/hand.jpg

KIT Prosthetic Hand

The KIT Prosthetic Hand is a five-finger, 3D-printed hand prosthesis with an underactuated mechanism, sensors and an embedded control system. It has an RGB camera integrated in the base of the palm and a colour display on the back of the hand. All functional components are integrated into the hand, which is dimensioned according to a 50th-percentile male human hand. It is accessible via a simple communication interface (serial, directly or via Bluetooth) or controllable via buttons. The camera and display allow for studies on vision-based semi-autonomous grasping and user feedback in prosthetics. As a stand-alone device, the hand allows easy use in different environments and settings.
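Since the hand is reached over a plain serial interface, commands can be framed as small byte packets. The packet layout sketched below (start byte, command id, payload length, payload, XOR checksum) and the command id are purely hypothetical; the device's real protocol is documented with the hand:

```python
def build_packet(cmd_id, payload=b""):
    """Frame a hypothetical serial command packet:
    0xAA | cmd_id | len(payload) | payload | XOR checksum over the middle bytes."""
    body = bytes([cmd_id, len(payload)]) + payload
    checksum = 0
    for b in body:
        checksum ^= b                      # simple XOR integrity check
    return bytes([0xAA]) + body + bytes([checksum])

# e.g. a (hypothetical) "close grasp at 60 % speed" command
pkt = build_packet(0x01, bytes([60]))
```

Such a packet could then be written to the port with, e.g., pySerial's `Serial.write`, either over the direct serial link or the Bluetooth serial profile.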

Key features

  • Adaptive, compliant grasping behaviour
  • Fast, integrated microprocessor (216 MHz)
  • RGB camera and OLED colour display
  • Serial communication and Bluetooth low energy
  • I²C interfaces for additional sensors

Possible applications

  • Grasping and Manipulation
  • Prosthetics
  • Semi-autonomous control
  • Sensor application and sensor fusion
  • Prosthetic user feedback

Technical specifications in brief

DoA: 2
Interface: serial / Bluetooth LE
Power supply: 12 V, 2 A peak
Hook grasp force: 120 N
Finger force: 7.5 – 11.8 N
Full finger flexion time: 1.3 s
Processor: ARM Cortex-M7
Camera: 1.2 MP RGB
Display: OLED, colour


https://www.terrinet.eu/wp-content/uploads/2018/05/nao.png

NAO

NAO is an autonomous, programmable humanoid robot introduced by Aldebaran Robotics (now SoftBank Robotics [1]) in 2006. The robot is commonly used for research and education purposes and is best known for its participation in the RoboCup Standard Platform League [2], a soccer league in which the robots play fully autonomously, making decisions independently of each other while also cooperating. NAO can be programmed graphically via the program Choregraphe, which is commonly used by young students with no programming experience. The robot can also be programmed in Python and C++, which allows a wide range of complex tasks to be fulfilled, and it can be used to demonstrate mathematical modelling for solving complex tasks on humanoid robots.

Key features

  • Easily programmable robot
  • Very stable thanks to its inertial board and the construction of the robot
  • 2 HD cameras, 4 microphones, sonar rangefinder, 2 infrared emitters and receivers, inertial board, 9 tactile sensors, 8 pressure sensors

Possible applications

  • Motion and Task Planning
  • Intuitive programming
  • Human-robot interaction
  • Multi-robot task planning

Technical specifications in brief

Height: 58 cm
Weight: 4.3 kg
Power supply: 48.6 Wh lithium battery (90 min active use)
DoF: 25
Processor: Intel Atom @ 1.6 GHz
Built-in OS: NAOqi 2.0 (Linux-based)
Compatible OS: Windows, Mac OS, Linux
Programming Languages: C++, Python, Java, MATLAB, Urbi, C, .Net
Connectivity: Ethernet, Wi-Fi
Sensors: 2 HD cameras, 4 microphones, sonar rangefinder, 2 infrared emitters and receivers, inertial board, 9 tactile sensors, 8 pressure sensors
