Exploiting layerwise convexity of rectifier networks with sign constrained weights

© 2018 Elsevier Ltd. By introducing sign constraints on the weights, this paper proposes sign-constrained rectifier networks (SCRNs), whose training can be solved efficiently by well-known majorization–minimization (MM) algorithms. We prove that the proposed two-hidden-layer SCRNs, which have negative weights in the second hidden layer and in the output layer, are capable of separating any number of disjoint pattern sets. Furthermore, the proposed two-hidden-layer SCRNs can decompose the patterns of each class into several clusters so that each cluster is convexly separable from all the patterns of the other classes. This provides a means to learn the pattern structures and analyse the discriminant factors between different classes of patterns. Experimental results are provided to show the benefits of sign constraints in improving classification performance and the efficiency of the proposed MM algorithm.
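
As a rough illustration of the architecture described in the abstract, the sketch below implements, in plain NumPy, the forward pass of a two-hidden-layer rectifier network whose second-hidden-layer and output-layer weights are kept non-positive by clipping. The layer sizes, the names (scrn_forward, W1, W2, w3) and the clipping step are illustrative assumptions only; the paper's majorization–minimization training algorithm and its exact constraint formulation are not reproduced here.

import numpy as np

def relu(x):
    # Rectifier activation used in each hidden layer.
    return np.maximum(x, 0.0)

def scrn_forward(x, W1, b1, W2, b2, w3, b3):
    # Enforce the sign constraints by clipping to non-positive values; a real
    # training procedure (e.g. the paper's MM scheme) would maintain the
    # constraints during optimisation rather than clipping after the fact.
    W2 = np.minimum(W2, 0.0)
    w3 = np.minimum(w3, 0.0)
    h1 = relu(W1 @ x + b1)    # first hidden layer (unconstrained weights)
    h2 = relu(W2 @ h1 + b2)   # second hidden layer (non-positive weights)
    return w3 @ h2 + b3       # output layer (non-positive weights), scalar score

# Example usage with randomly drawn parameters of assumed sizes.
rng = np.random.default_rng(0)
x = rng.standard_normal(4)
W1, b1 = rng.standard_normal((8, 4)), rng.standard_normal(8)
W2, b2 = rng.standard_normal((6, 8)), rng.standard_normal(6)
w3, b3 = rng.standard_normal(6), rng.standard_normal()
print(scrn_forward(x, W1, b1, W2, b2, w3, b3))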


Bibliographic Details
Main Authors: An, Senjian; Boussaid, F.; Bennamoun, M.; Sohel, F.
Format: Journal Article
Published: Pergamon, Elsevier 2018
DOI: 10.1016/j.neunet.2018.06.005
Online Access: http://hdl.handle.net/20.500.11937/69564 (restricted access)