Regularised nonnegative shared subspace learning

Joint modeling of related data sources has the potential to improve various data mining tasks such as transfer learning, multitask clustering, and information retrieval. However, diversity among the data sources might outweigh the advantages of joint modeling and result in performance degradation. To this end, we propose a regularized shared subspace learning framework that exploits the mutual strengths of related data sources while remaining immune to the variability of each source. This is achieved by imposing a mutual orthogonality constraint on the constituent subspaces, which segregates the common patterns from the source-specific patterns and thus avoids performance degradation. Our approach is rooted in nonnegative matrix factorization and extends it to enable joint analysis of related data sources. Experiments on three real-world data sets for both retrieval and clustering applications demonstrate the benefits of regularization and validate the effectiveness of the model. Our solution provides a formal framework for jointly analyzing related data sources and is therefore applicable to a wider context in data mining.


Bibliographic Details
Main Authors: Gupta, Sunil, Phung, Dinh, Adams, Brett, Venkatesh, Svetha
Format: Journal Article
Published: Springer 2011
Subjects:
Online Access:http://hdl.handle.net/20.500.11937/6219
_version_ 1848745013414985728
author Gupta, Sunil
Phung, Dinh
Adams, Brett
Venkatesh, Svetha
author_facet Gupta, Sunil
Phung, Dinh
Adams, Brett
Venkatesh, Svetha
author_sort Gupta, Sunil
building Curtin Institutional Repository
collection Online Access
description Joint modeling of related data sources has the potential to improve various data mining tasks such as transfer learning, multitask clustering, and information retrieval. However, diversity among the data sources might outweigh the advantages of joint modeling and result in performance degradation. To this end, we propose a regularized shared subspace learning framework that exploits the mutual strengths of related data sources while remaining immune to the variability of each source. This is achieved by imposing a mutual orthogonality constraint on the constituent subspaces, which segregates the common patterns from the source-specific patterns and thus avoids performance degradation. Our approach is rooted in nonnegative matrix factorization and extends it to enable joint analysis of related data sources. Experiments on three real-world data sets for both retrieval and clustering applications demonstrate the benefits of regularization and validate the effectiveness of the model. Our solution provides a formal framework for jointly analyzing related data sources and is therefore applicable to a wider context in data mining.
first_indexed 2025-11-14T06:10:36Z
format Journal Article
id curtin-20.500.11937-6219
institution Curtin University Malaysia
institution_category Local University
last_indexed 2025-11-14T06:10:36Z
publishDate 2011
publisher Springer
recordtype eprints
repository_type Digital Repository
spelling curtin-20.500.11937-62192017-09-13T16:08:46Z Regularised nonnegative shared subspace learning Gupta, Sunil Phung, Dinh Adams, Brett Venkatesh, Svetha Nonnegative shared subspace learning Auxiliary sources Transfer learning Multi-task clustering Joint modeling of related data sources has the potential to improve various data mining tasks such as transfer learning, multitask clustering, and information retrieval. However, diversity among the data sources might outweigh the advantages of joint modeling and result in performance degradation. To this end, we propose a regularized shared subspace learning framework that exploits the mutual strengths of related data sources while remaining immune to the variability of each source. This is achieved by imposing a mutual orthogonality constraint on the constituent subspaces, which segregates the common patterns from the source-specific patterns and thus avoids performance degradation. Our approach is rooted in nonnegative matrix factorization and extends it to enable joint analysis of related data sources. Experiments on three real-world data sets for both retrieval and clustering applications demonstrate the benefits of regularization and validate the effectiveness of the model. Our solution provides a formal framework for jointly analyzing related data sources and is therefore applicable to a wider context in data mining. 2011 Journal Article http://hdl.handle.net/20.500.11937/6219 10.1007/s10618-011-0244-8 Springer restricted
spellingShingle Nonnegative shared subspace learning
Auxiliary sources
Transfer learning
Multi-task clustering
Gupta, Sunil
Phung, Dinh
Adams, Brett
Venkatesh, Svetha
Regularised nonnegative shared subspace learning
title Regularised nonnegative shared subspace learning
title_full Regularised nonnegative shared subspace learning
title_fullStr Regularised nonnegative shared subspace learning
title_full_unstemmed Regularised nonnegative shared subspace learning
title_short Regularised nonnegative shared subspace learning
title_sort regularised nonnegative shared subspace learning
topic Nonnegative shared subspace learning
Auxiliary sources
Transfer learning
Multi-task clustering
url http://hdl.handle.net/20.500.11937/6219