A Robust Object Recognition and Pose Estimation System
for Robotic Applications

Project Description

This project received the Interpro best R&D project award in May 2003.

The aim of the project is rapid and accurate identification of 2-D objects and estimation of their pose. The system is robust in the sense that it is resilient to problematic environmental influences such as noise introduced by the image acquisition hardware, changes in lighting, and changes in the position (rotation and translation) of an object on the conveyor belt.
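
A common way to obtain a 2-D pose (position plus in-plane rotation) from a segmented object is via image moments. The sketch below illustrates the idea with OpenCV and NumPy; the library choice and function names are illustrative assumptions, not necessarily the method used in the project.

    import cv2
    import numpy as np

    def estimate_pose(binary_mask):
        # binary_mask: single-channel image with the object as non-zero foreground
        m = cv2.moments(binary_mask, binaryImage=True)

        # Centroid of the object (translation part of the pose)
        cx = m["m10"] / m["m00"]
        cy = m["m01"] / m["m00"]

        # Orientation of the principal axis from the second central moments
        theta = 0.5 * np.arctan2(2.0 * m["mu11"], m["mu20"] - m["mu02"])
        return (cx, cy), theta
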
The object recognition system is composed of three main units:
- Image preprocessing unit
- Feature extraction unit
- Classification unit
The image preprocessing unit includes a filtering operation to reduce noise, image segmentation by thresholding, morphological operations, and contour extraction.
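
The sketch below illustrates such a preprocessing chain with OpenCV: Gaussian denoising, Otsu thresholding, morphological opening, and extraction of the largest outer contour. The library choice and parameter values are assumptions for illustration, not the project's actual code.

    import cv2
    import numpy as np

    def preprocess(image_path):
        # Load the image as grayscale
        gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)

        # Reduce acquisition noise with a Gaussian filter
        blurred = cv2.GaussianBlur(gray, (5, 5), 0)

        # Segment the object from the background by Otsu thresholding
        _, binary = cv2.threshold(blurred, 0, 255,
                                  cv2.THRESH_BINARY + cv2.THRESH_OTSU)

        # Morphological opening removes small speckles left after thresholding
        kernel = np.ones((3, 3), np.uint8)
        cleaned = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)

        # Keep the outer contour of the largest connected object
        contours, _ = cv2.findContours(cleaned, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_NONE)
        return max(contours, key=cv2.contourArea)
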
The feature extraction module developed in the project is designed to accommodate various techniques. The techniques currently included in the software are implicit polynomial models, Fourier descriptors, moment invariants, and eigenspace representation. The features computed by this module are stored in a database for each object.
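
As an illustration of two of these feature types, the sketch below computes Hu moment invariants and simple Fourier descriptors from an object contour; the function names and normalisation details are illustrative assumptions, not the project's implementation.

    import cv2
    import numpy as np

    def hu_moment_features(contour):
        # Hu's seven moment invariants are invariant to translation, scale
        # and rotation; a log transform compresses their dynamic range.
        hu = cv2.HuMoments(cv2.moments(contour)).flatten()
        return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

    def fourier_descriptor_features(contour, num_coeffs=16):
        # Treat the boundary as a complex signal x + iy and take its FFT.
        pts = contour.reshape(-1, 2).astype(np.float64)
        spectrum = np.fft.fft(pts[:, 0] + 1j * pts[:, 1])
        # Dropping the DC term gives translation invariance, normalising by
        # the first harmonic gives scale invariance, and taking magnitudes
        # discards the rotation / starting-point phase.
        coeffs = spectrum[1:num_coeffs + 1]
        return np.abs(coeffs) / (np.abs(spectrum[1]) + 1e-30)
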
In the classification unit, the feature vector of the object to be recognised is compared against the records in the database, and the object is assigned the identity of its nearest neighbour in the database.
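
A minimal sketch of this nearest-neighbour matching is given below, assuming the database is a simple mapping from object names to stored feature vectors (an assumption about the storage format, not the project's actual database).

    import numpy as np

    def classify(feature_vector, database):
        # Return the name of the stored object whose feature vector is
        # closest to the query in Euclidean distance.
        best_name, best_dist = None, float("inf")
        for name, stored in database.items():
            dist = np.linalg.norm(np.asarray(feature_vector) - np.asarray(stored))
            if dist < best_dist:
                best_name, best_dist = name, dist
        return best_name, best_dist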

Project Objectives

  • Rapid and accurate identification of 2-D objects, invariant to Euclidean transformations and to lighting conditions
  • Integration of the developed software with Festo robots

Object Recognition Algorithms Used in the Project

  • Multiscale Edge Matching
  • Independent Component Analysis
  • Moment Invariants
  • Fourier Descriptors
  • Implicit Polynomials
  • Eigenimages (see the sketch after this list)
  • SIFT features
  • Curvature Scale Space Techniques
  • Edge Map Correlation
  • Deformable Models
  • Shape Context
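
As an illustration of the eigenimage (eigenspace) representation listed above, the sketch below builds a small eigenspace with a plain NumPy SVD and projects images onto it; array shapes and names are illustrative assumptions, not the project's code.

    import numpy as np

    def build_eigenspace(images, num_components=10):
        # images: (N, H*W) array of flattened, equally sized training images
        mean = images.mean(axis=0)
        centered = images - mean
        # The right singular vectors of the centered data are the eigenimages
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        return mean, vt[:num_components]

    def project(image, mean, eigenimages):
        # Coordinates of one flattened image in the low-dimensional eigenspace
        return eigenimages @ (image - mean)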

Partners

  Festo AG & Co. KG, Germany





[Figure: Invariant to Euclidean transformations]

[Figure: Invariant to lighting conditions]

[Figure: Objects used in the experiments]