Face recognition based on Kinect
In this paper, we present a new algorithm that utilizes low-quality red, green, blue and depth (RGB-D) data from the Kinect sensor for face recognition under challenging conditions. This algorithm extracts multiple features and fuses them at the feature level. A Finer Feature Fusion technique is developed that removes redundant information and retains only the meaningful features for maximum possible class separability. We also introduce a new 3D face database acquired with the Kinect sensor, which has been released to the research community. This database contains over 5,000 facial images (RGB-D) of 52 individuals under varying pose, expression, illumination and occlusions. Under the first three variations and using only the noisy depth data, the proposed algorithm achieves a 72.5 % recognition rate, significantly higher than the 41.9 % achieved by the baseline LDA method. Combined with the texture information, a 91.3 % recognition rate is achieved under illumination, pose and expression variations. These results suggest the feasibility of low-cost 3D sensors for real-time face recognition.
| Main Authors: | Li, B.; Mian, A.; Liu, Wan-Quan; Krishna, Aneesh |
|---|---|
| Format: | Journal Article |
| Published: | Springer-Verlag London Ltd, 2015 |
| DOI: | 10.1007/s10044-015-0456-4 |
| Online Access: | http://hdl.handle.net/20.500.11937/33976 |
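To make the idea of feature-level fusion concrete, the sketch below shows the general pattern described in the abstract: concatenating per-modality RGB (texture) and depth descriptors into a single feature vector and projecting them with LDA, the paper's baseline. This is a minimal illustration only, not the paper's Finer Feature Fusion algorithm; it assumes scikit-learn and uses synthetic stand-in descriptors and labels in place of real feature extraction from the Kinect database.

```python
"""Illustrative sketch of feature-level RGB-D fusion with an LDA baseline.

Not the paper's Finer Feature Fusion; descriptors here are synthetic
placeholders for features extracted from aligned RGB and depth face crops.
"""
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler


def fuse_features(rgb_feats: np.ndarray, depth_feats: np.ndarray) -> np.ndarray:
    """Feature-level fusion: concatenate per-sample RGB and depth descriptors."""
    return np.hstack([rgb_feats, depth_feats])


rng = np.random.default_rng(0)
n_train, n_test, n_ids, dim = 200, 50, 52, 64   # 52 identities, as in the dataset

# Hypothetical pre-extracted descriptors: one row per face image.
rgb_train = rng.normal(size=(n_train, dim))      # stand-in texture descriptors
depth_train = rng.normal(size=(n_train, dim))    # stand-in depth/shape descriptors
y_train = rng.integers(0, n_ids, size=n_train)   # identity labels
rgb_test = rng.normal(size=(n_test, dim))
depth_test = rng.normal(size=(n_test, dim))

# Fuse at the feature level, then learn a discriminative projection + classifier.
clf = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
clf.fit(fuse_features(rgb_train, depth_train), y_train)
predicted_ids = clf.predict(fuse_features(rgb_test, depth_test))
```

In a real system the synthetic arrays would be replaced by descriptors computed from the RGB and (noisy) depth channels of each Kinect face image, and the concatenation step is where a finer fusion scheme would instead select or weight features to discard redundant dimensions before classification.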