Comparison of Diverse Ensemble Neural Network for Large Data Classification

Mohamad, Mumtazimah and Md Saman, Mohd Yazid (2015) Comparison of Diverse Ensemble Neural Network for Large Data Classification. International Journal of Advances in Soft Computing and Its Application, 7 (3). pp. 67-84. ISSN 2074-8523

This is the latest version of this item.



In large-dataset classification, the number of attributes commonly grows over time, and many dynamic learning strategies, such as ensemble networks and incremental neural networks, have been proposed to cope with this. An ensemble network is a learning paradigm in which many neural networks are jointly used to solve a problem. The relationship between the ensemble and its component neural networks is analyzed in the context of classification within an integrated framework. This analysis reveals that it may be better to use many neural networks than a single incremental neural network. Most ensemble approaches use entirely different classifiers for prediction; an appropriate neural network can then be selected from the set of available ensemble members. To this end, a Distributed Reordering Technique (DRT) is proposed. DRT is an enhanced algorithm based on distributed randomization across different neural networks. Weights are randomly assigned to the networks so that they can evolve, characterizing each network's fitness and its contribution to a better combined result. The integrated ensemble framework is supported by selecting the neural networks that make up the ensemble based on their outputs and weights. The experimental study shows that, compared with ensemble approaches such as Bagging, DRT can generate a neural network with enhanced performance and stronger generalization ability. Furthermore, DRT is practical and relevant for neural-network classifiers in large-scale classification systems and can be applied to data of different large dimensions in future work.
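The abstract describes the general idea of weighting ensemble members and selecting a subset based on their outputs and weights; the paper's actual DRT algorithm is not detailed here. A minimal sketch of that general idea, with stand-in linear classifiers in place of trained neural networks and an assumed average-fitness selection rule, might look like this:

```python
import random

random.seed(42)

def make_member():
    """A stand-in 'neural network': a random linear decision rule on 2-D input."""
    w = [random.uniform(-1, 1) for _ in range(2)]
    b = random.uniform(-1, 1)
    return lambda x: 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

def accuracy(model, data):
    """Fraction of examples the model labels correctly."""
    return sum(model(x) == y for x, y in data) / len(data)

# Toy validation set: the true label is 1 when x0 + x1 > 1.
val = []
for _ in range(200):
    x = (random.random(), random.random())
    val.append((x, 1 if x[0] + x[1] > 1 else 0))

members = [make_member() for _ in range(10)]

# Randomly weight the members, then keep those whose fitness
# (validation accuracy scaled by the random weight) is at least
# the ensemble average -- an assumed selection rule, not the paper's.
weights = [random.random() for _ in members]
fitness = [w * accuracy(m, val) for m, w in zip(members, weights)]
avg = sum(fitness) / len(fitness)
selected = [(m, w) for m, w, f in zip(members, weights, fitness) if f >= avg]

def ensemble_predict(x):
    """Weighted majority vote over the selected members."""
    score = sum(w * (1 if m(x) == 1 else -1) for m, w in selected)
    return 1 if score > 0 else 0

print(len(selected), accuracy(ensemble_predict, val))
```

In practice each member would be a trained neural network and the weights would evolve over training rather than being drawn once, but the selection-then-combination structure is the same.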

Item Type: Article
Keywords: Ensemble Network, Incremental Learning, Large Data, Neural Network.
Subjects: Q Science > QA Mathematics
Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Faculty / Institute: Faculty of Informatics & Computing
Depositing User: Dr Mumtazimah Mohamad
Date Deposited: 27 Dec 2015 03:10
Last Modified: 27 Dec 2015 03:10

Available Versions of this Item

  • Comparison of Diverse Ensemble Neural Network for Large Data Classification. (deposited 27 Dec 2015 03:10)
