[Solved]: Adapting neural network

Problem Detail: I have on a few occasions trained neural networks (back-propagation networks) on some rather complicated data sets (backgammon positions and OCR). When doing this, it seems that a lot of the work involves trying out different configurations of the network in order to find the optimal configuration for learning. Often there is a compromise between small nets, which are faster to use and train, and bigger nets, which are able to represent more knowledge. So I wonder whether it is possible to make networks that are both fast and big. I'm thinking that a network in which not every neuron is fully connected ought to be faster to compute than a net with full connections on all layers. The training could detect that certain inputs are not needed by certain neurons and remove those connections. In the same way, the training could also add new neurons if some neurons seem to be "overloaded". Has this been tried with any success? Do any classes of networks exist with this kind of behavior?
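To make the "not fully connected" idea concrete, here is a minimal NumPy sketch of a layer whose connectivity is restricted by a binary mask. The sizes, names, and the tanh activation are arbitrary choices for illustration, not part of the question. A connection whose mask entry is zero contributes nothing, and a genuinely sparse implementation could skip those multiplications altogether.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden = 8, 4
weights = rng.normal(size=(n_hidden, n_in))     # dense weight matrix
mask = rng.random((n_hidden, n_in)) > 0.5       # True = connection kept, False = removed

def sparse_forward(x, weights, mask):
    # A masked-out connection behaves as if it had been removed from the net.
    return np.tanh((weights * mask) @ x)

x = rng.normal(size=n_in)
print(sparse_forward(x, weights, mask))
```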

Asked By : Ebbe M. Pedersen

Answered By : vzn

This is roughly an open problem, subject to ongoing research, with various strategies and heuristics known; a key phrase is "neural network architecture". The most basic strategy is to iterate through various network topologies and retrain for each one. Another strategy is to start with a relatively large network, prune connections that have low weight, retrain, and look for improvements; years ago this approach was called "optimal brain damage" in at least one paper [2]. There is also the possibility of using genetic-algorithm-like (GA) search to determine the network structure [3]. Here are some example references; there are many others. Here is part of the abstract from [1]:
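As a sketch of the pruning strategy, the loop below repeatedly removes the surviving connections with the smallest absolute weight and then retrains. This is a simpler magnitude-based heuristic than the saliency criterion of [2], which uses second-derivative information, and the `train_fn`/`eval_fn` callbacks are placeholders for whatever training and validation routine the network already has.

```python
import numpy as np

def prune_and_retrain(weights, mask, train_fn, eval_fn, prune_fraction=0.1, rounds=5):
    """Iteratively remove the smallest-magnitude surviving connections and retrain.

    `weights` and `mask` are arrays of the same shape; `train_fn(weights, mask)`
    returns retrained weights and `eval_fn(weights, mask)` returns a validation
    score recorded after each pruning round.
    """
    history = []
    for _ in range(rounds):
        # Rank surviving connections by |weight| and drop the smallest fraction.
        alive = np.flatnonzero(mask)
        k = max(1, int(prune_fraction * alive.size))
        smallest = alive[np.argsort(np.abs(weights.flat[alive]))[:k]]
        mask.flat[smallest] = 0
        # Retrain the remaining connections and record validation performance.
        weights = train_fn(weights, mask)
        history.append(eval_fn(weights, mask))
    return weights, mask, history
```

One would stop pruning once the validation score starts to degrade, keeping the smallest network that still performs acceptably.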

The use of ANNs requires some critical decisions on the part of the user, which may affect the accuracy of the resulting classification. In this study, determination of the optimum network structure, which is one of the most important attributes of a network, is investigated. The structure of the network has a direct effect on training time and classification accuracy. Although there is some discussion in the literature of the impact of network structure on the performance of the network, there is no certain method or approach to determine the best structure. Investigations of the relationship between the network structure and the accuracy of the classification are reported here, using a MATLAB tool-kit to take the advantage of scientific visualisation.
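The "iterate through topologies" strategy amounts to a search over candidate structures scored on held-out data. Below is a minimal sketch using scikit-learn's MLPClassifier on a small OCR-style data set; the library, the candidate hidden-layer sizes, and the hyperparameters are illustrative assumptions, not taken from [1] (which used a MATLAB tool-kit).

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Small digits data set as a stand-in for an OCR task.
X, y = load_digits(return_X_y=True)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

best = None
for hidden in [(8,), (16,), (32,), (16, 16), (32, 16)]:   # arbitrary candidate topologies
    net = MLPClassifier(hidden_layer_sizes=hidden, max_iter=500, random_state=0)
    net.fit(X_train, y_train)
    score = net.score(X_val, y_val)
    if best is None or score > best[1]:
        best = (hidden, score)

print("best topology:", best)
```

In practice one would also cross-validate and weigh training time against accuracy, since, as the abstract above notes, the structure directly affects both.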

[1] Taskin Kavzoglu, Determining Optimum Structure for Artificial Neural Networks.
[2] Yann Le Cun, John Denker, Sara Solla, Optimal Brain Damage.
[3] Britos et al., Finding Optimal Neural Network Architecture Using Genetic Algorithms.


Question Source : http://cs.stackexchange.com/questions/10937
