Module Version: 0.24

NAME

AI::NNFlex - A base class for implementing neural networks

SYNOPSIS

 use AI::NNFlex;

 my $network = AI::NNFlex->new(config parameter=>value);

 $network->add_layer(   nodes=>x,
                        activationfunction=>'function');

 $network->init(); 

 $network->lesion(      nodes=>PROBABILITY,
                        connections=>PROBABILITY);

 $network->dump_state (filename=>'badgers.wts');

 $network->load_state (filename=>'badgers.wts');

 my $outputsRef = $network->output(layer=>2,round=>1);

DESCRIPTION

AI::NNFlex is a base class for constructing your own neural network modules. To implement a neural network, start with the documentation for AI::NNFlex::Backprop, included in this distribution.

CONSTRUCTOR

AI::NNFlex->new ( parameter => value );

randomweights=>MAXIMUM VALUE FOR INITIAL WEIGHT

fixedweights=>WEIGHT TO USE FOR ALL CONNECTIONS

debug=>[LIST OF CODES FOR MODULES TO DEBUG]

round=>0 or 1; a true value sets the network to round output values to the nearest of 1, -1 or 0

The constructor implements a fairly generalised network object with a number of parameters.

The following parameters are optional: randomweights, fixedweights, debug and round.

(Note: if randomweights is not specified, the network will default to random initial weights between 0 and 1.)
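Putting the constructor parameters together, a call might look like the following sketch (the parameter values are purely illustrative):

 use strict;
 use warnings;
 use AI::NNFlex;

 # Initial weights drawn randomly between 0 and 0.1; output values
 # rounded to the nearest of 1, -1 or 0.
 my $network = AI::NNFlex->new(
     randomweights => 0.1,
     round         => 1,
 );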

METHODS

This is a short list of the main methods implemented in AI::NNFlex.

AI::NNFlex

add_layer

 Syntax:

 $network->add_layer(   nodes=>NUMBER OF NODES IN LAYER,
                        persistentactivation=>RETAIN ACTIVATION BETWEEN PASSES,
                        decay=>RATE OF ACTIVATION DECAY PER PASS,
                        randomactivation=>MAXIMUM STARTING ACTIVATION,
                        threshold=>NYI (not yet implemented),
                        activationfunction=>"ACTIVATION FUNCTION",
                        randomweights=>MAX VALUE OF STARTING WEIGHTS);

add_layer adds whatever parameters you specify as attributes of the layer, so if you want to implement additional parameters, simply use them in your calling code.

add_layer returns success or failure, and if successful adds a layer object to the $network->{'layers'} array. This layer object contains an attribute $layer->{'nodes'}, which is an array of the nodes in the layer.
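For example, the following sketch builds a small three-layer network. The activation function names used here ('tanh' and 'linear') are assumed to be those shipped in AI::NNFlex::Mathlib; treat the exact choices as illustrative:

 # 2 input nodes, 2 tanh hidden nodes, 1 linear output node
 $network->add_layer( nodes => 2, activationfunction => 'linear' );
 $network->add_layer( nodes => 2, activationfunction => 'tanh' );
 $network->add_layer( nodes => 1, activationfunction => 'linear' );

 $network->init();    # wire up the layers once they are all defined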

init

 Syntax:

 $network->init();

Initialises connections between nodes and sets initial weights. The base AI::NNFlex init method implements connections backwards and forwards from each node in each layer to each node in the preceding and following layers.

init adds connection attributes to each node. The connections to easterly nodes are not used in feedforward networks. Init also implements the bias node if specified in the network config.

connect

 Syntax:

 $network->connect(fromlayer=>1,tolayer=>0);
 $network->connect(fromnode=>'1,1',tonode=>'0,0');

Connect allows you to manually create connections between layers or nodes, including recurrent connections back to the same layer/node. Node indices must be given as LAYER,NODE, numbered from 0.

Weight assignments for the connection are calculated based on the network wide weight policy (see INIT).
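As a sketch, assuming the layers have already been added and init() called, recurrent connections could be created like this:

 # Recurrent connection: layer 1 feeds back onto itself
 $network->connect( fromlayer => 1, tolayer => 1 );

 # Single connection from node 0 of layer 1 back to node 0 of layer 0
 $network->connect( fromnode => '1,0', tonode => '0,0' );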

lesion

 $network->lesion (nodes=>PROBABILITY,connections=>PROBABILITY)

 Damages the network.

PROBABILITY

A value between 0 and 1, denoting the probability of a given node or connection being damaged.

Note: this method may be called on a per network, per node or per layer basis using the appropriate object.
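For instance, to damage a whole trained network so that each node has a 10% chance and each connection a 20% chance of being lesioned (the probabilities here are purely illustrative):

 $network->lesion( nodes => 0.1, connections => 0.2 );

The same call made on a layer or node object restricts the damage to that layer or node.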

EXAMPLES

See the code in ./examples. For any given version of NNFlex, xor.pl will contain the latest functionality.

PREREQS

None. NNFlex should run on any version of Perl 5.

ACKNOWLEDGEMENTS

Phil Brierley, for his excellent free Java code, which solved my backprop problem.

Dr Martin Le Voi, for help with concepts of NN in the early stages

Dr David Plaut, for help with the project that this code was originally intended for.

Graciliano M. Passos for suggestions & improved code (see SEE ALSO).

Dr Scott Fahlman, whose very readable paper 'An empirical study of learning speed in backpropagation networks' (1988) has driven many of the improvements made so far.

SEE ALSO

 AI::NNFlex::Backprop
 AI::NNFlex::Feedforward
 AI::NNFlex::Mathlib
 AI::NNFlex::Dataset

 AI::NNEasy - Developed by Graciliano M. Passos
 (Shares some common code with NNFlex)

TODO

 Lots of things:

 clean up the perldocs some more
 write gamma modules
 write BPTT modules
 write a perceptron learning module
 speed it up
 write a tk gui

CHANGES

v0.11 introduces the lesion method, PNG support in the draw module, and datasets.

v0.12 fixes a bug in reinforce.pm and adds a reflector in feedforward->run to make $network->run($dataset) work.

v0.13 introduces the momentum learning algorithm and fixes a bug that allowed training to proceed even if the node activation function module can't be loaded.

v0.14 fixes momentum and backprop so they are no longer nailed to tanh hidden units only.

v0.15 fixes a bug in feedforward, and reduces the debug overhead.

v0.16 changes some underlying addressing of weights, to simplify and speed things up.

v0.17 is a bugfix release, plus some cleaning of the UI.

v0.20 changes AI::NNFlex to be a base class, and ships three different network types (i.e. training algorithms). Backprop & momentum are both networks of the feedforward class, and inherit their 'run' method from feedforward.pm. 0.20 also fixes a whole raft of bugs and 'not nices'.

v0.21 cleans up the perldocs more, and makes nnflex more distinctly a base module. There are quite a number of changes in Backprop in the v0.21 distribution.

v0.22 introduces the ::connect method, to allow creation of recurrent connections, and manual control over connections between nodes/layers.

v0.23 includes a Hopfield module in the distribution.

v0.24 fixes a bug in the bias weight calculations.

COPYRIGHT

Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.

CONTACT

 charlesc@nnflex.g0n.net