Learning parts-based representations with nonnegative restricted Boltzmann machine
© 2013 T.D. Nguyen, T. Tran, D. Phung & S. Venkatesh. The success of any machine learning system depends critically on effective representations of data. In many cases, especially those in vision, it is desirable that a representation scheme uncovers the parts-based, additive nature of the data....
| Main Authors: | Nguyen, T.D., Tran, T., Phung, D., Venkatesh, S. |
|---|---|
| Format: | Conference Paper |
| Published: | 2013 |
| Online Access: | http://hdl.handle.net/20.500.11937/51761 |
| author | Nguyen, T.D., Tran, T., Phung, D., Venkatesh, S. |
|---|---|
| building | Curtin Institutional Repository |
| collection | Online Access |
| description | © 2013 T.D. Nguyen, T. Tran, D. Phung & S. Venkatesh. The success of any machine learning system depends critically on effective representations of data. In many cases, especially those in vision, it is desirable that a representation scheme uncovers the parts-based, additive nature of the data. Of current representation learning schemes, restricted Boltzmann machines (RBMs) have proved to be highly effective in unsupervised settings. However, when it comes to parts-based discovery, RBMs do not usually produce satisfactory results. We enhance this capacity of RBMs by introducing nonnegativity into the model weights, resulting in a variant called the nonnegative restricted Boltzmann machine (NRBM). The NRBM not only produces controllable decompositions of data into interpretable parts but also offers a way to estimate the intrinsic nonlinear dimensionality of data. We demonstrate the capacity of our model on well-known datasets of handwritten digits, faces and documents. The decomposition quality on images is comparable with, or better than, that produced by nonnegative matrix factorisation (NMF), and the thematic features uncovered from text are qualitatively interpretable in a similar manner to those of latent Dirichlet allocation (LDA). However, the learnt features, when used for classification, are more discriminative than those discovered by both NMF and LDA, and comparable with those of the RBM. (An illustrative code sketch of the nonnegativity idea appears after this record.) |
| format | Conference Paper |
| id | curtin-20.500.11937-51761 |
| institution | Curtin University Malaysia |
| institution_category | Local University |
| publishDate | 2013 |
| recordtype | eprints |
| repository_type | Digital Repository |
| title | Learning parts-based representations with nonnegative restricted Boltzmann machine |
| url | http://hdl.handle.net/20.500.11937/51761 |
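
The description above explains the core idea: constrain an RBM's weights to be nonnegative so that hidden units learn additive, parts-based features. Below is a minimal, illustrative sketch of that idea in Python/NumPy, assuming nonnegativity is enforced by clipping the weight matrix after each CD-1 update; the paper's actual training procedure may use a different mechanism (e.g. a penalty on negative weights), and the names here (`NRBM`, `cd1_step`) are hypothetical.

```python
# Minimal sketch of a nonnegative RBM (NRBM) trained with CD-1.
# Assumption: nonnegativity is imposed by projecting W onto W >= 0 after each
# update. This is an illustrative stand-in, not the authors' exact training rule.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class NRBM:
    def __init__(self, n_visible, n_hidden, lr=0.05):
        # Small nonnegative initial weights; biases are left unconstrained.
        self.W = rng.uniform(0.0, 0.01, size=(n_visible, n_hidden))
        self.b = np.zeros(n_visible)   # visible bias
        self.c = np.zeros(n_hidden)    # hidden bias
        self.lr = lr

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.c)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b)

    def cd1_step(self, v0):
        # One step of contrastive divergence (CD-1) on a batch of binary vectors.
        h0 = self.hidden_probs(v0)
        h0_sample = (rng.random(h0.shape) < h0).astype(float)
        v1 = self.visible_probs(h0_sample)
        h1 = self.hidden_probs(v1)

        # Gradient estimates from the positive and negative phases.
        batch = v0.shape[0]
        dW = (v0.T @ h0 - v1.T @ h1) / batch
        db = (v0 - v1).mean(axis=0)
        dc = (h0 - h1).mean(axis=0)

        self.W += self.lr * dW
        self.b += self.lr * db
        self.c += self.lr * dc

        # Nonnegativity constraint: project the weights back onto W >= 0.
        np.clip(self.W, 0.0, None, out=self.W)

        # Reconstruction error, handy for monitoring training.
        return np.mean((v0 - v1) ** 2)

# Tiny usage example on random binary "images".
if __name__ == "__main__":
    data = (rng.random((200, 64)) < 0.3).astype(float)
    rbm = NRBM(n_visible=64, n_hidden=16)
    for epoch in range(20):
        err = rbm.cd1_step(data)
    # With W >= 0, each hidden unit's weight column can be read as an additive
    # "part" of the input, similar in spirit to an NMF basis vector.
    print("final reconstruction error:", round(float(err), 4))
```

Because every weight is nonnegative, each hidden unit can only add mass to the reconstruction; this is what makes the learnt weight columns readable as parts, analogous to an NMF basis, while training otherwise follows the usual RBM recipe.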