
Search results for "ai::neuralnet"

AI::NeuralNet::Mesh - An optimized, accurate neural network Mesh. River stage zero No dependents

AI::NeuralNet::Mesh is an optimized, accurate neural network Mesh. It was designed with accuracy and speed in mind. This network model is very flexible. It will allow for classic binary operation or any range of integer or floating-point inputs you ca...

JBRYAN/AI-NeuralNet-Mesh-0.44 - 14 Sep 2000 20:56:21 UTC - Search in distribution

AI::NeuralNet::Simple - An easy-to-use backprop neural net. River stage zero No dependents

The Disclaimer Please note that the following information is terribly incomplete. That's deliberate. Anyone familiar with neural networks is going to laugh themselves silly at how simplistic the following information is and the astute reader will not...

OVID/AI-NeuralNet-Simple-0.11 - 18 Nov 2006 15:53:01 UTC - Search in distribution

AI::NeuralNet::FastSOM - Perl extension for fast Kohonen Maps River stage zero No dependents

A drop-in replacement for Robert Barta's AI::NeuralNet::SOM. See those docs for details....

JRM/AI-NeuralNet-FastSOM-0.19 - 03 Dec 2016 19:54:12 UTC - Search in distribution

AI::NeuralNet::Kohonen - Kohonen's Self-organising Maps River stage zero No dependents

An illustrative implementation of Kohonen's Self-organising Feature Maps (SOMs) in Perl. It's not fast - it's illustrative. In fact, it's slow: but it is illustrative.... Have a look at AI::NeuralNet::Kohonen::Demo::RGB for an example of visualisatio...

LGODDARD/AI-NeuralNet-Kohonen-0.142 - 08 Aug 2006 13:58:55 UTC - Search in distribution
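The core of a Kohonen map is a simple competitive update: find the node whose weights best match the input (the best-matching unit), then pull that node and its grid neighbours toward the input. A minimal one-dimensional sketch of that rule — hypothetical from-scratch code, not the AI::NeuralNet::Kohonen interface:

```perl
use strict;
use warnings;

# A tiny 1-D self-organising map over scalar inputs. All names here are
# illustrative; this is not the AI::NeuralNet::Kohonen API.
my @nodes  = (0.10, 0.35, 0.60, 0.90);        # codebook weights
my @inputs = (0.0, 0.25, 0.5, 1.0) x 50;      # training data, repeated
my ($rate, $radius) = (0.5, 1);

for my $x (@inputs) {
    # best-matching unit: the node closest to the input
    my ($bmu) = sort { abs($nodes[$a] - $x) <=> abs($nodes[$b] - $x) }
                0 .. $#nodes;
    # pull the BMU and its grid neighbours toward the input
    for my $i (0 .. $#nodes) {
        next if abs($i - $bmu) > $radius;
        $nodes[$i] += $rate * ($x - $nodes[$i]);
    }
    $rate *= 0.99;                             # decay the learning rate
}
printf "%.2f %.2f %.2f %.2f\n", @nodes;        # nodes spread over [0, 1]
```

Because each update is a convex step toward an input in [0, 1], the codebook weights stay in that range; the neighbourhood (`$radius`) is what makes nearby grid nodes end up with nearby weights.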

AI::NeuralNet::BackProp - A simple back-prop neural net that uses the Delta rule and Hebb's rule. River stage zero No dependents

AI::NeuralNet::BackProp implements a neural network similar to a feed-forward, back-propagation network; learning via a mix of a generalization of the Delta rule and a dissection of Hebb's rule. The actual neurons of the network are implemented via the A...

JBRYAN/AI-NeuralNet-BackProp-0.89 - 17 Aug 2000 07:21:47 UTC - Search in distribution

AI::NeuralNet::Hopfield - A simple Hopfield Network Implementation. River stage zero No dependents

LEPREVOST/AI-NeuralNet-Hopfield-0.1 - 05 Mar 2013 04:27:59 UTC - Search in distribution

AI::NeuralNet::SOM::Rect - Perl extension for Kohonen Maps (rectangular topology) River stage zero No dependents

DRRHO/AI-NeuralNet-SOM-0.07 - 24 May 2008 07:07:15 UTC - Search in distribution

AI::NeuralNet::Kohonen::Visual - Tk-based Visualisation River stage zero No dependents

Provides Tk-based visualisation routines for "AI::NeuralNet::Kohonen". Replaces the earlier "AI::NeuralNet::Kohonen::Demo::RGB". This is a sub-class of "AI::NeuralNet::Kohonen" that implements extra methods to make use of Tk. This module is itself in...

LGODDARD/AI-NeuralNet-Kohonen-Visual-0.3 - 05 May 2006 20:42:16 UTC - Search in distribution

AI::NeuralNet::Kohonen::Demo::RGB - Colour-based demo River stage zero No dependents

A sub-class of "AI::NeuralNet::Kohonen" that implements extra methods to make use of Tk in a very slow demonstration of how a SOM can collapse a three-dimensional space (RGB colour values) into a two-dimensional space (the display). See SYNOPSIS. The...

LGODDARD/AI-NeuralNet-Kohonen-Demo-RGB-0.123 - 14 Mar 2003 11:28:04 UTC - Search in distribution

AI::PSO - Module for running the Particle Swarm Optimization algorithm River stage zero No dependents

Particle Swarm Optimization is an optimization algorithm designed by Russell Eberhart and James Kennedy from Purdue University. The algorithm itself is based on the emergent behavior among societal groups ranging from the marching of ant...

KYLESCH/AI-PSO-0.86 - 25 Nov 2006 03:49:50 UTC - Search in distribution
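The mechanics behind that emergent behavior are compact: each particle keeps a velocity, its own best-seen position, and the swarm's global best, and on every step is pulled toward both with random strengths. A hypothetical minimal sketch (this is not the AI::PSO interface; function and parameter names are illustrative):

```perl
use strict;
use warnings;

sub fitness { my $x = shift; return ($x - 3) ** 2 }   # minimise (x-3)^2

my ($w, $c1, $c2) = (0.7, 1.4, 1.4);    # inertia, cognitive and social pull
my @pos  = map { $_ * 2 } -5 .. 5;      # 11 particles spread over [-10, 10]
my @vel  = (0) x @pos;
my @best = @pos;                        # each particle's personal best
my ($gbest) = sort { fitness($a) <=> fitness($b) } @pos;

for (1 .. 200) {
    for my $i (0 .. $#pos) {
        # velocity: keep some momentum, then pull toward both bests
        $vel[$i] = $w * $vel[$i]
                 + $c1 * rand() * ($best[$i] - $pos[$i])
                 + $c2 * rand() * ($gbest    - $pos[$i]);
        $pos[$i] += $vel[$i];
        # record improvements
        $best[$i] = $pos[$i] if fitness($pos[$i]) < fitness($best[$i]);
        $gbest    = $pos[$i] if fitness($pos[$i]) < fitness($gbest);
    }
}
printf "best x = %.3f\n", $gbest;       # settles near the optimum at 3
```

The inertia weight `$w` below 1 damps the swarm so it contracts onto the best region instead of oscillating forever; `$c1`/`$c2` trade off each particle's own memory against the swarm's.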

AI::NNEasy - Define, learn and use easy Neural Networks of different types using a portable code in Perl and XS. River stage zero No dependents

The main purpose of this module is to create easy Neural Networks with Perl. The module was designed to be extensible to multiple network types, learning algorithms and activation functions. This architecture was first based on the module AI::NNFlex,...

GMPASSOS/AI-NNEasy-0.06 - 17 Jan 2005 02:25:07 UTC - Search in distribution

AI::Perceptron - example of a node in a neural network. River stage zero No dependents

This module is meant to show how a single node of a neural network works. Training is done by the *Stochastic Approximation of the Gradient-Descent* model....

SPURKIS/AI-Perceptron-1.0 - 10 Oct 2003 15:48:24 UTC - Search in distribution
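That single node can be sketched from scratch: threshold a weighted sum to get an output, then nudge each weight by the prediction error. The following is a hypothetical illustration of the classic perceptron rule, not the AI::Perceptron API:

```perl
use strict;
use warnings;

# Fire (1) when the weighted sum of inputs plus bias clears zero.
sub predict {
    my ($w, $b, $x) = @_;
    my $sum = $b;
    $sum += $w->[$_] * $x->[$_] for 0 .. $#$x;
    return $sum >= 0 ? 1 : 0;
}

# Error-driven updates: w <- w + eta * (target - output) * x
sub train {
    my ($examples, $eta, $epochs) = @_;
    my @w = (0) x @{ $examples->[0][0] };
    my $b = 0;
    for (1 .. $epochs) {
        for my $ex (@$examples) {
            my ($x, $t) = @$ex;
            my $err = $t - predict(\@w, $b, $x);
            next unless $err;                  # correct, nothing to learn
            $w[$_] += $eta * $err * $x->[$_] for 0 .. $#$x;
            $b += $eta * $err;
        }
    }
    return (\@w, $b);
}

# Logical AND is linearly separable, so the rule converges.
my @and = ([[0, 0], 0], [[0, 1], 0], [[1, 0], 0], [[1, 1], 1]);
my ($w, $b) = train(\@and, 1, 10);
printf "%d,%d -> %d\n", @{ $_->[0] }, predict($w, $b, $_->[0]) for @and;
# 1,1 -> 1; all other inputs -> 0
```

An integer learning rate keeps the trace exact; a single node like this can only learn linearly separable functions, which is why XOR needs the multi-layer nets listed above.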

12 results (0.057 seconds)