Imran's personal blog

January 28, 2016

Notes on Encog

Filed under: Uncategorized — ipeerbhai @ 11:46 pm

I've been learning Heaton Research's Encog machine learning framework.  This post is a set of simple notes on how to use it.  As a fan of Windows, Visual Studio, and C#, I wanted a framework that was easy to learn and use with that stack.

Step 1: Prepare data.

Use any class that implements IMLData.  Here's a quick snippet using the BasicMLData type.

Encog.ML.Data.IMLData myData = new Encog.ML.Data.Basic.BasicMLData(new double[] { 0.0, 1.0 });
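
To peek at what you just wrapped, IMLData exposes a Count property and an indexer, at least in the Encog 3.x builds I've used; check your version if this doesn't compile:

for (int i = 0; i < myData.Count; i++)
{
    Console.WriteLine("element {0} = {1}", i, myData[i]); // read each value back out of the IMLData wrapper
}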

We can also use data sets. Here's a snippet that builds the XOR input matrix and its labeled output vector.

double[][] matrixXorInputs =
{
    new[] { 0.0, 0.0 },
    new[] { 0.0, 1.0 },
    new[] { 1.0, 0.0 },
    new[] { 1.0, 1.0 }
};

double[][] vectorLabeledOutputs =
{
    new[] { 0.0 },
    new[] { 1.0 },
    new[] { 1.0 },
    new[] { 0.0 }
};

Encog.ML.Data.Basic.BasicMLDataSet myDataset = new Encog.ML.Data.Basic.BasicMLDataSet(matrixXorInputs, vectorLabeledOutputs);
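
The data set is enumerable, so you can loop over it pair by pair. As far as I can tell, each IMLDataPair exposes Input and Ideal properties; here's a quick sketch assuming that's the case:

// Print each input vector and its labeled (ideal) output.
foreach (Encog.ML.Data.IMLDataPair pair in myDataset)
{
    Console.WriteLine("input: {0}, {1} -> ideal: {2}", pair.Input[0], pair.Input[1], pair.Ideal[0]);
}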

 

Step 2. Prepare the network.

This gets a little more complicated.  Neural networks have layers: an input layer, hidden layer(s), and an output layer.  You define each layer, then initialize the network with random weights.  Remember neural network theory: neural networks are searches and combinations over a topology.  Each layer allows a new set of combinations; you can roughly map the concept to the size of an exponent, with more layers increasing the maximum dimensionality of your search space.  Here's the C# code for a simple feed-forward network in low-dimensional space.

Encog.Neural.Networks.BasicNetwork myNetwork = new Encog.Neural.Networks.BasicNetwork(); // Create the network, then configure it.

myNetwork.AddLayer(new Encog.Neural.Networks.Layers.BasicLayer(null, true, 2)); // an input layer with 2 inputs and a single bias neuron

myNetwork.AddLayer(new Encog.Neural.Networks.Layers.BasicLayer(new Encog.Engine.Network.Activation.ActivationSigmoid(), true, 2)); // hidden layer — Sigmoid activation function per neuron, 1 bias neuron in layer, 2 neurons.

myNetwork.AddLayer(new Encog.Neural.Networks.Layers.BasicLayer(new Encog.Engine.Network.Activation.ActivationSigmoid(), false, 1)); // output layer — Sigmoid activation function per neuron, 0 bias neuron in layer, 1 neuron.

// Now, tell encog that I’m done declaring the net structure, and initialize the net.

myNetwork.Structure.FinalizeStructure();
myNetwork.Reset(); //random weights to start.
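
As a quick sanity check, you can ask the finalized network how many inputs and outputs it thinks it has; the counts should match the layers declared above (InputCount and OutputCount come from Encog's regression interfaces, as far as I can tell):

// Should report 2 inputs and 1 output for the network declared above.
Console.WriteLine("inputs: {0}, outputs: {1}", myNetwork.InputCount, myNetwork.OutputCount);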

Step 3.  Train the network.

You can use any class that implements IMLTrain.  Here's a simple resilient propagation trainer:

Encog.ML.Train.IMLTrain myTrainer = new Encog.Neural.Networks.Training.Propagation.Resilient.ResilientPropagation(myNetwork, myDataset);
do
{
    myTrainer.Iteration(); // run one iteration of resilient propagation over the training set.
}
while (myTrainer.Error > 0.01);
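
If you want to watch convergence, the same loop with an epoch counter and an error printout looks like this (just a sketch; the 0.01 cutoff is the same arbitrary threshold as above):

int epoch = 1;
do
{
    myTrainer.Iteration();
    Console.WriteLine("epoch {0}, error {1}", epoch, myTrainer.Error); // watch whether the error is actually dropping
    epoch++;
} while (myTrainer.Error > 0.01);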

Step 4. Use the trained network to evaluate unknown inputs.

double[] output = new double[] { 0.0 };
myNetwork.Compute(new double[] { 0.0, 0.0 }, output); // The input to the evaluator is 0, 0.  Use whatever input you want.
Console.WriteLine("output is {0}", output[0]);
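
To evaluate the whole XOR table at once, you can feed each pair from the data set back through the network. I'm assuming here that Compute(IMLData) returns an IMLData holding the output vector, which is what I've seen in Encog 3.x:

// Compare the network's actual output against the labeled (ideal) output for every row.
foreach (Encog.ML.Data.IMLDataPair pair in myDataset)
{
    Encog.ML.Data.IMLData result = myNetwork.Compute(pair.Input);
    Console.WriteLine("{0}, {1} -> actual {2:0.00}, ideal {3}", pair.Input[0], pair.Input[1], result[0], pair.Ideal[0]);
}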

Usage notes:

Training can get into oscillations where it simply never converges.  The training time is highly variable: sometimes the network trains in a few iterations, and sometimes it takes a while.  In quick sample runs of the above code, I could train in 60 iterations or never train at all.  This is due to the learning rate; neural networks seem to suffer the same issues as any gradient-descent-based regression algorithm.
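
One workaround I use for the never-trains case is to cap the epochs per attempt and re-randomize the weights with Reset() before trying again with a fresh trainer. This is just a sketch; the 1000-epoch and 5-attempt limits are arbitrary numbers I picked, so tune them for your problem:

bool trained = false;
for (int attempt = 0; attempt < 5 && !trained; attempt++)
{
    myNetwork.Reset(); // new random starting weights for each attempt
    Encog.ML.Train.IMLTrain trainer = new Encog.Neural.Networks.Training.Propagation.Resilient.ResilientPropagation(myNetwork, myDataset);
    for (int epoch = 0; epoch < 1000; epoch++)
    {
        trainer.Iteration();
        if (trainer.Error <= 0.01)
        {
            trained = true;
            break;
        }
    }
}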

 
