Robotics and Human-Technology Interaction

Research Area Perception

Object Pose Estimation

  • 6D object pose matching
  • Point cloud segmentation
  • Variational encoder/decoder networks
  • Object recognition
  • Transfer learning
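
A core building block of 6D object pose matching is recovering a rigid transform from corresponding 3D points. Below is a minimal sketch using the standard Kabsch/SVD method; the function name `estimate_pose_6d` and the assumption of known correspondences are illustrative, not the group's actual pipeline.

```python
import numpy as np

def estimate_pose_6d(src, dst):
    """Rigid 6D pose (R, t) aligning corresponding points src -> dst,
    i.e. dst_i = R @ src_i + t, via the Kabsch/SVD method."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                       # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

In a full pipeline, correspondences would come from segmentation and feature matching, with the SVD step run inside a RANSAC or ICP loop.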

3D Reconstruction and Next Best View Planning

  • Automatic scanning of objects
  • Texture generation 
  • Interactive perception and next best viewpoint planning
  • Model estimation
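
Next-best-view planning can be sketched as a greedy information-gain search: pick the candidate viewpoint that would observe the most voxels not yet covered. The dictionary-of-sets representation and the function name `next_best_view` are illustrative assumptions, not the planner actually used.

```python
def next_best_view(candidate_views, seen):
    """Greedy next-best-view selection.
    candidate_views: dict mapping view id -> set of voxel ids it would observe.
    seen: set of voxel ids already covered by previous scans.
    Returns the view with the largest number of newly observed voxels."""
    best_view, best_gain = None, -1
    for view, visible in candidate_views.items():
        gain = len(visible - seen)            # voxels this view adds
        if gain > best_gain:
            best_view, best_gain = view, gain
    return best_view, best_gain
```

Real planners additionally weigh travel cost and sensor constraints against the expected gain.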

Bin Picking

  • Object recognition
  • 6D pose estimation
  • Grasp generation
  • Grasp learning
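
Grasp generation typically produces many candidates that must be ranked. A common heuristic for two-finger grippers is the antipodal score: contact normals should oppose each other along the axis between the contact points. The sketch below (function name and scoring are illustrative) returns 1.0 for a perfectly antipodal pair.

```python
import numpy as np

def antipodal_score(p1, n1, p2, n2):
    """Score a two-finger grasp candidate from two surface contacts.
    p1, p2: contact points; n1, n2: unit surface normals.
    Normals aligned with +/- the grasp axis score close to 1.0."""
    axis = np.asarray(p2, float) - np.asarray(p1, float)
    axis /= np.linalg.norm(axis)
    return float(np.dot(n1, axis) + np.dot(n2, -axis)) / 2.0
```

Learned grasp quality networks play a similar role but replace the hand-crafted score with a trained predictor.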

Eye Gaze Tracking and Action Recognition

  • Eye gaze as a key signal indicating availability for communication
  • New eye gaze tracker that estimates gaze from a distance of 2 m with a standard RGB-D camera
  • People tracking and action recognition
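
Once a gaze direction is estimated, using it as an availability cue can be as simple as checking whether the gaze ray points at the robot's camera within an angular tolerance. This sketch assumes head position and gaze direction in the camera frame; the function name and the 10-degree threshold are illustrative.

```python
import numpy as np

def looking_at_camera(head_pos, gaze_dir, max_angle_deg=10.0):
    """True if the gaze ray from head_pos points at the camera origin
    within max_angle_deg (a simple mutual-gaze / availability cue)."""
    to_cam = -np.asarray(head_pos, float)          # camera at the origin
    to_cam /= np.linalg.norm(to_cam)
    g = np.asarray(gaze_dir, float)
    g /= np.linalg.norm(g)
    cos_ang = float(np.clip(np.dot(g, to_cam), -1.0, 1.0))
    return bool(np.degrees(np.arccos(cos_ang)) <= max_angle_deg)
```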

Hand and Object Pose Tracking

  • Hand tracking and pose tracking of grasped objects
  • Collision avoidance with the hand during handover maneuvers
  • Simultaneous tracking of hands and objects
  • Gesture tracking
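
For collision avoidance during handovers, a cheap first check is the clearance between bounding volumes around the tracked hand and the gripper. The sphere approximation, radii, and function name below are illustrative assumptions; real systems use tighter hand models from the tracker.

```python
import numpy as np

def handover_clearance(gripper_pos, hand_pos, r_gripper=0.06, r_hand=0.10):
    """Signed gap between bounding spheres around gripper and hand (meters).
    Negative values indicate a (potential) collision."""
    dist = np.linalg.norm(np.asarray(gripper_pos, float) -
                          np.asarray(hand_pos, float))
    return float(dist - (r_gripper + r_hand))
```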

New Sensors

  • Capacitive proximity sensors for safety in human-robot interaction (HRI)
  • Optical force sensors for robust measurements
  • New wide-field-of-view sensors inspired by the insect eye

Environmental Perception

  • Fast 3D voxel maps for collision avoidance and fast collision-free motion planning
  • Segmentation of environments
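
A voxel map supports the fast collision queries that motion planning needs: points are bucketed into a regular grid and lookups are array indexing. The class below is a minimal dense sketch (names, grid size, and 5 cm resolution are illustrative; it assumes non-negative coordinates, while production maps are often hierarchical or GPU-resident).

```python
import numpy as np

class VoxelMap:
    """Dense 3D occupancy grid for fast collision queries."""

    def __init__(self, shape=(64, 64, 64), resolution=0.05):
        self.occ = np.zeros(shape, dtype=bool)
        self.res = resolution

    def _index(self, point):
        # Assumes points with non-negative coordinates inside the grid.
        return tuple((np.asarray(point, float) / self.res).astype(int))

    def mark_occupied(self, point):
        self.occ[self._index(point)] = True

    def in_collision(self, point):
        return bool(self.occ[self._index(point)])
```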

Contact: Prof. Dr.-Ing. Ulrike Thomas