Parallel Training for Back Propagation in Character Recognition

Mohamad, Mumtazimah (2012) Parallel Training for Back Propagation in Character Recognition. In: 1st Taibah University International Conference on Computing and Information Technology, 12-14 March 2012, Madinah, Saudi Arabia.

p89-mohamad.pdf - Published Version
Official URL: http://www.taibahu.edu.sa/pages.aspx?pid=7544&ln=e...

Abstract

Artificial neural networks have made character recognition easier and have improved tremendously in accuracy and efficiency. However, gaps and weaknesses remain to be overcome, owing to recognition inaccuracies and the limited attention given to handling large-scale data. Parallelism can be regarded as a practical solution for such large workloads. In pursuit of an optimal training time and generalization ability, addressing the problem of generating a suitable comprehensive classifier reduces the training time while maintaining accuracy at the same time. This paper presents an idea of distributing data across copies of the same neural network structure to measure how much the training time can be reduced for a given recognition accuracy. The MNIST benchmark dataset of handwritten digits is used for recognition. To test the validity of the approach, the data sets are handled on parallel computers in separate training runs of a back-propagation neural network. The results indicate that the proposed algorithm improves the speed-up performance for large-scale neural networks while maintaining their accuracy.
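The sketch below illustrates, under stated assumptions, the kind of data-parallel scheme the abstract describes: the same back-propagation network structure is copied to several workers, each worker trains on its own partition of the data, and the replicas are then combined. It is a minimal sequential simulation, not the paper's implementation; the synthetic 784-dimensional inputs stand in for MNIST, parameter averaging is assumed as the combination rule, and the worker count, layer sizes, learning rate, and epoch count are illustrative choices. A real deployment would run the per-worker training on separate machines (e.g. via MPI or multiprocessing).

```python
import numpy as np

def init_params(n_in, n_hidden, n_out, rng):
    """One-hidden-layer network parameters."""
    return {
        "W1": rng.normal(0, 0.1, (n_in, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": rng.normal(0, 0.1, (n_hidden, n_out)),
        "b2": np.zeros(n_out),
    }

def forward(params, X):
    """tanh hidden layer followed by a softmax output layer."""
    h = np.tanh(X @ params["W1"] + params["b1"])
    logits = h @ params["W2"] + params["b2"]
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return h, e / e.sum(axis=1, keepdims=True)

def train_worker(params, X, y, lr=0.1, epochs=5):
    """Plain batch back-propagation on one worker's data partition."""
    p = {k: v.copy() for k, v in params.items()}
    Y = np.eye(p["W2"].shape[1])[y]            # one-hot targets
    for _ in range(epochs):
        h, probs = forward(p, X)
        n = X.shape[0]
        d_logits = (probs - Y) / n             # softmax cross-entropy gradient
        dW2 = h.T @ d_logits
        db2 = d_logits.sum(axis=0)
        d_h = d_logits @ p["W2"].T * (1 - h ** 2)   # tanh derivative
        dW1 = X.T @ d_h
        db1 = d_h.sum(axis=0)
        p["W1"] -= lr * dW1; p["b1"] -= lr * db1
        p["W2"] -= lr * dW2; p["b2"] -= lr * db2
    return p

def average_params(worker_params):
    """Combine the independently trained replicas by parameter averaging
    (an assumed combination rule, not necessarily the paper's)."""
    return {k: np.mean([wp[k] for wp in worker_params], axis=0)
            for k in worker_params[0]}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-in for MNIST: 784-dimensional inputs, 10 digit classes.
    X = rng.normal(size=(2000, 784))
    y = rng.integers(0, 10, size=2000)
    base = init_params(784, 64, 10, rng)

    n_workers = 4                               # assumed partition count
    splits = np.array_split(np.arange(len(X)), n_workers)
    replicas = [train_worker(base, X[idx], y[idx]) for idx in splits]
    combined = average_params(replicas)

    _, probs = forward(combined, X)
    acc = (probs.argmax(axis=1) == y).mean()
    print(f"accuracy on the synthetic training data: {acc:.3f}")
```

Because each partition is trained independently, the per-worker training time scales with the partition size rather than the full data set, which is the source of the speed-up the abstract refers to; the combination step is what determines whether accuracy is maintained.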

Item Type: Conference or Workshop Item (Poster)
Keywords: MNIST Data Set, Parallel neural network, distributed memory, neural networks
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Faculty / Institute: Faculty of Informatics & Computing
Depositing User: Dr Mumtazimah Mohamad
Date Deposited: 11 May 2014 07:05
Last Modified: 28 Apr 2015 23:42
URI: http://erep.unisza.edu.my/id/eprint/516
