ANN
0.1.1.5
A library containing multiple neural network models written in C
Feedforward neural networks are used for classification and regression.
First, include the headers.
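A minimal sketch; the exact header path is an assumption here, so check the library's include directory for the real file names:

```c
/* Hypothetical header path; adjust to match your installation. */
#include <ann/ann.h>
```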
With the following method, you can create complex neural networks with partial or total connections between neuron layers.
Now you can create a new PCFNN.
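For example, assuming a constructor named `PCFNN_NETWORK_new` (the exact name may differ; check the headers):

```c
struct PCFNN_NETWORK *net = PCFNN_NETWORK_new();
```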
For the moment the network is completely empty; you need to add some layers. So let's create a new input layer with 42 neurons.
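A sketch of that step, with an assumed constructor name:

```c
/* Hypothetical constructor: an input layer holding 42 neurons. */
struct PCFNN_LAYER *l1 = PCFNN_LAYER_new_input(42);
```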
and 1 hidden layer,
and 1 output layer.
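Since the prose below configures the neuron counts of these layers when they are linked, empty layer constructors are assumed here (names illustrative only):

```c
/* Hypothetical constructors; neurons are added later when the layers are linked. */
struct PCFNN_LAYER *l2 = PCFNN_LAYER_new();  /* hidden layer */
struct PCFNN_LAYER *l3 = PCFNN_LAYER_new();  /* output layer */
```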
Then you can add those new layers to the network (add them in the order in which the layers are bonded).
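Something along these lines, with an assumed `PCFNN_NETWORK_addl` helper:

```c
/* Hypothetical function name: register layers in bond order. */
PCFNN_NETWORK_addl(net, l1);
PCFNN_NETWORK_addl(net, l2);
PCFNN_NETWORK_addl(net, l3);
```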
Now you can configure the hidden layer and the output layer. We will link all neurons from the input layer l1, with an offset of 0 neurons, to the 64 neurons in l2, also with an offset of 0 neurons, and we will use the provided initialization function and the sigmoid activation function to create the new neurons in the hidden layer l2.
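A sketch of that call; the function name, parameter order, and the initializer/activation identifiers are all assumptions, so consult the headers for the real signature:

```c
/* Hypothetical signature: link 42 neurons of l1 (offset 0) to 64 new neurons
   in l2 (offset 0), using the provided initializer and sigmoid activation. */
PCFNN_LAYER_connect(l1, l2, 42, 64, 0, 0, f_init_rand_norm, f_act_sigmoid);
```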
Then we can connect the hidden layer l2 to the output layer l3; we want 2 neurons in l3.
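Under the same assumed signature as above:

```c
/* Hypothetical: link the 64 neurons of l2 to 2 new neurons in l3. */
PCFNN_LAYER_connect(l2, l3, 64, 2, 0, 0, f_init_rand_norm, f_act_sigmoid);
```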
Now the neural network is well configured. So we can build it!
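Assuming a build function named along these lines:

```c
/* Hypothetical finalizer: allocates and wires everything configured above. */
PCFNN_NETWORK_build(net);
```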
And that's all!
With the following method, you can create a fully connected neural network from an array of integers.
For example, we want 1 input layer with 2 neurons, 1 hidden layer with 2 neurons and 1 output layer with 1 neuron: a XOR neural network. With this information, we can build the following array.
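For instance (using `int` here; use whatever integer type the builder actually expects):

```c
/* One entry per layer: 2 inputs, 2 hidden neurons, 1 output. */
int layers[] = {2, 2, 1};
```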
We will use the sigmoid activation function and the default initializer.
Call this function and your neural network is ready!
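A sketch of such a call; the builder's name and parameters are assumptions, so check the headers for the exact API:

```c
/* Hypothetical one-shot builder: 3 layers, sigmoid activation,
   NULL selects the default initializer. */
int layers[] = {2, 2, 1};
struct PCFNN_NETWORK *net = PCFNN_NETWORK_build_from_array(layers, 3, f_act_sigmoid, NULL);
```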
Create an array with the input data. The size of this array must be equal to the input layer size; here 42.
Initialize this array, then call the network's feed-forward function.
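With an assumed function name:

```c
/* Hypothetical: run one forward pass over the input buffer. */
PCFNN_NETWORK_feedforward(net, input);
```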
To get the output of the network, use this function:
The size of the output array is equal to the output layer size. This pointer must be freed after use.
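A sketch, assuming an accessor named `PCFNN_NETWORK_get_output`:

```c
/* Hypothetical accessor; the caller owns the returned buffer. */
double *out = PCFNN_NETWORK_get_output(net);
/* ... use out[0], out[1], ... */
free(out);
```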
Imagine you want to train the network and you have 100 items in your dataset, so the size of your dataset is 100. You can then initialize two arrays of 100 double pointers:
The size of each array in input must equal the input layer size (here 42), and the size of each array in target must equal the output layer size (here 2). Now initialize each array with your dataset.
To train the network, we will use a learning rate of 0.1 and a momentum of 0.8, with 20000 epochs and a batch size of 2, and we want the dataset to be shuffled after each epoch. We also want to use the last 25% (1/4) of our dataset to validate the training. For the cost function, we will use the quadratic loss function.
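A sketch of that training call; the function name, parameter order, and the loss-function identifier are all assumptions, so check the headers for the real signature:

```c
/* Hypothetical signature; parameters follow the description above. */
double *validate = PCFNN_NETWORK_train(net, input, target,
                                       100,    /* dataset size */
                                       0.25,   /* validation split: last 25% */
                                       0.1,    /* learning rate */
                                       0.8,    /* momentum */
                                       20000,  /* epochs */
                                       2,      /* batch size */
                                       1,      /* shuffle after each epoch */
                                       f_cost_quadratic_loss);
```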
When the training is done, you get back the validate pointer: an array containing the average loss of each output neuron over the validation set.
If you no longer need the network, you can call
This function will free the network and all layers linked to the network for you!
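With an assumed name:

```c
/* Hypothetical: frees the network and every layer attached to it. */
PCFNN_NETWORK_free(net);
```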
If you want to save the configuration of the PCFNN, you can use the following function:
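For example (function name and file-path argument are assumptions):

```c
/* Hypothetical: serialize the network configuration to a file. */
PCFNN_NETWORK_save_conf(net, "network.conf");
```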
If you want to load it again, initialize the network and use the following function:
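A sketch of that sequence, with assumed names:

```c
/* Hypothetical: create an empty network, then restore the saved configuration. */
struct PCFNN_NETWORK *net = PCFNN_NETWORK_new();
PCFNN_NETWORK_load_conf(net, "network.conf");
```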
Read the documentation!