ANN  0.1.1.5
A library containing multiple neural network models written in C
tools.c File Reference
#include <stdlib.h>
#include <math.h>
#include <time.h>
#include "ANN/tools.h"

Functions

double f_init_rand_norm ()
 Weight and bias initialization function for hidden and output layers. More...
 
double f_init_input ()
 Weight and bias initialization function for the input layer. More...
 
double f_act_sigmoid (double n)
 Sigmoid activation function (for feedforward algorithm) More...
 
double f_act_sigmoid_de (double n)
 Derivative of the sigmoid activation function (for backpropagation algorithm) More...
 
double f_act_input (double n)
 Activation function for the input layer (for feedforward algorithm) More...
 
double f_act_input_de (double n __attribute__((unused)))
 Derivative of the input layer activation function (for backpropagation algorithm) More...
 
double f_act_relu (double n)
 ReLU activation function (for feedforward algorithm) More...
 
double f_act_relu_de (double n)
 Derivative of the ReLU activation function (for backpropagation algorithm) More...
 
double f_act_softplus (double n)
 SoftPlus activation function (for feedforward algorithm) More...
 
double f_act_softplus_de (double n)
 Derivative of the SoftPlus activation function (for backpropagation algorithm) More...
 
double f_act_elu (double n)
 ELU activation function (for feedforward algorithm) More...
 
double f_act_elu_de (double n)
 Derivative of the ELU activation function (for backpropagation algorithm) More...
 
double f_act_swish (double n)
 Swish activation function (for feedforward algorithm) More...
 
double f_act_swish_de (double n)
 Derivative of the Swish activation function (for backpropagation algorithm) More...
 
double f_cost_quadratic_loss (double o, double t)
 Quadratic cost function. More...
 
double f_cost_quadratic_loss_de (double o, double t)
 Derivative of the quadratic cost function (for backpropagation algorithm) More...
 

Function Documentation

◆ f_act_elu()

double f_act_elu (double n)

ELU activation function (for feedforward algorithm)

Parameters
[in]  n  activation sum
Returns
double

Definition at line 59 of file tools.c.
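
A minimal sketch of the conventional ELU formula this function presumably implements; the coefficient alpha = 1.0 is an assumption, not taken from tools.c:

#include <math.h>

/* ELU: identity for positive inputs, smooth exponential saturation toward -alpha for negative ones */
double f_act_elu(double n)
{
    const double alpha = 1.0; /* assumed coefficient */
    return n > 0.0 ? n : alpha * (exp(n) - 1.0);
}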

◆ f_act_elu_de()

double f_act_elu_de (double n)

Derivative of the ELU activation function (for backpropagation algorithm)

Parameters
[in]  n  activation sum
Returns
double

Definition at line 64 of file tools.c.
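
A hedged sketch of the standard ELU derivative, again assuming alpha = 1.0; the actual tools.c implementation may instead reuse f_act_elu:

#include <math.h>

/* ELU derivative: 1 for positive inputs, alpha * exp(n) otherwise
   (equivalently f_act_elu(n) + alpha for n <= 0) */
double f_act_elu_de(double n)
{
    const double alpha = 1.0; /* assumed coefficient */
    return n > 0.0 ? 1.0 : alpha * exp(n);
}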


◆ f_act_input()

double f_act_input (double n)

Activation function for the input layer (for feedforward algorithm)

Parameters
[in]  n  activation sum
Returns
double

Definition at line 29 of file tools.c.
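
Input-layer activations are typically the identity; a minimal sketch under that assumption (not confirmed by tools.c):

/* Assumed identity activation: the input value is passed through unchanged */
double f_act_input(double n)
{
    return n;
}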

◆ f_act_input_de()

double f_act_input_de (double n __attribute__((unused)))

Derivative of the input layer activation function (for backpropagation algorithm)

Definition at line 34 of file tools.c.
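
The unused parameter indicates a constant return value; assuming the input activation is the identity, its derivative would simply be 1:

/* Derivative of the (assumed) identity input activation: constant 1 */
double f_act_input_de(double n __attribute__((unused)))
{
    return 1.0;
}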

◆ f_act_relu()

double f_act_relu (double n)

ReLU activation function (for feedforward algorithm)

Parameters
[in]  n  activation sum
Returns
double

Definition at line 39 of file tools.c.
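
A minimal sketch of the standard ReLU formula, max(0, n):

/* ReLU: passes positive inputs through and clamps negative inputs to 0 */
double f_act_relu(double n)
{
    return n > 0.0 ? n : 0.0;
}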

◆ f_act_relu_de()

double f_act_relu_de (double n)

Derivative of the ReLU activation function (for backpropagation algorithm)

Parameters
[in]  n  activation sum
Returns
double

Definition at line 44 of file tools.c.
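
A sketch of the conventional ReLU derivative; the value chosen at n == 0 is a convention, not taken from tools.c:

/* ReLU derivative: 1 for positive inputs, 0 otherwise */
double f_act_relu_de(double n)
{
    return n > 0.0 ? 1.0 : 0.0;
}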

◆ f_act_sigmoid()

double f_act_sigmoid (double n)

Sigmoid activation function (for feedforward algorithm)

Parameters
[in]  n  activation sum
Returns
double

Definition at line 19 of file tools.c.
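
A minimal sketch of the logistic sigmoid this function presumably computes:

#include <math.h>

/* Logistic sigmoid: maps any real input into the open interval (0, 1) */
double f_act_sigmoid(double n)
{
    return 1.0 / (1.0 + exp(-n));
}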

◆ f_act_sigmoid_de()

double f_act_sigmoid_de (double n)

Derivative of the sigmoid activation function (for backpropagation algorithm)

Parameters
[in]  n  activation sum
Returns
double

Definition at line 24 of file tools.c.
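
A sketch of the textbook sigmoid derivative expressed in terms of the pre-activation n; tools.c likely reuses f_act_sigmoid here, but that is an assumption:

#include <math.h>

/* Sigmoid derivative: s * (1 - s), where s is the sigmoid of n */
double f_act_sigmoid_de(double n)
{
    double s = 1.0 / (1.0 + exp(-n)); /* i.e. f_act_sigmoid(n) */
    return s * (1.0 - s);
}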


◆ f_act_softplus()

double f_act_softplus (double n)

SoftPlus activation function (for feedforward algorithm)

Parameters
[in]  n  activation sum
Returns
double

Definition at line 49 of file tools.c.
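
A minimal sketch of the standard SoftPlus formula, log(1 + exp(n)):

#include <math.h>

/* SoftPlus: a smooth approximation of ReLU */
double f_act_softplus(double n)
{
    return log(1.0 + exp(n));
}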

◆ f_act_softplus_de()

double f_act_softplus_de (double n)

Derivative of the SoftPlus activation function (for backpropagation algorithm)

Parameters
[in]  n  activation sum
Returns
double

Definition at line 54 of file tools.c.
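
The derivative of SoftPlus is the logistic sigmoid; a sketch under that standard identity (whether tools.c calls f_act_sigmoid directly is an assumption):

#include <math.h>

/* d/dn log(1 + exp(n)) = 1 / (1 + exp(-n)), i.e. the sigmoid of n */
double f_act_softplus_de(double n)
{
    return 1.0 / (1.0 + exp(-n));
}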


◆ f_act_swish()

double f_act_swish (double n)

Swish activation function (for feedforward algorithm)

Parameters
[in]  n  activation sum
Returns
double

Definition at line 69 of file tools.c.
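
A minimal sketch of the usual Swish definition, n * sigmoid(n):

#include <math.h>

/* Swish: n * sigmoid(n), written here as n / (1 + exp(-n)) */
double f_act_swish(double n)
{
    return n / (1.0 + exp(-n));
}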


◆ f_act_swish_de()

double f_act_swish_de (double n)

Derivative of the Swish activation function (for backpropagation algorithm)

Parameters
[in]  n  activation sum
Returns
double

Definition at line 74 of file tools.c.
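
A sketch of the Swish derivative obtained from the product rule; the intermediate sigmoid value s is a local helper for illustration, not necessarily how tools.c structures it:

#include <math.h>

/* d/dn [n * sigmoid(n)] = sigmoid(n) + n * sigmoid(n) * (1 - sigmoid(n)) */
double f_act_swish_de(double n)
{
    double s = 1.0 / (1.0 + exp(-n)); /* sigmoid(n) */
    return s + n * s * (1.0 - s);
}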


◆ f_cost_quadratic_loss()

double f_cost_quadratic_loss (double o, double t)

Quadratic cost function.

Parameters
[in]  o  output
[in]  t  target
Returns
double

Definition at line 82 of file tools.c.
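
A minimal sketch of a quadratic (squared-error) loss for a single output/target pair; the 1/2 scaling factor is a common convention and an assumption here:

/* Quadratic loss: 0.5 * (o - t)^2 for one output o and target t */
double f_cost_quadratic_loss(double o, double t)
{
    return 0.5 * (o - t) * (o - t);
}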

◆ f_cost_quadratic_loss_de()

double f_cost_quadratic_loss_de (double o, double t)

Derivative of the quadratic cost function (for backpropagation algorithm)

Parameters
[in]  o  output
[in]  t  target
Returns
double

Definition at line 87 of file tools.c.
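
Assuming the 0.5 * (o - t)^2 form sketched above, the derivative with respect to the output o reduces to the plain error term:

/* d/do [0.5 * (o - t)^2] = o - t */
double f_cost_quadratic_loss_de(double o, double t)
{
    return o - t;
}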

◆ f_init_input()

double f_init_input ()

Weight and bias initialization function for the input layer.

Returns
1

Definition at line 12 of file tools.c.
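
Since the documented return value is 1, a sketch is straightforward:

/* Input layer weights and biases are fixed to 1, as documented above */
double f_init_input(void)
{
    return 1.0;
}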

◆ f_init_rand_norm()

double f_init_rand_norm ()

Weight and bias initialization function for hidden and output layers.

Returns
a double between -1 and 1

Definition at line 7 of file tools.c.
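
The documented return value is a double between -1 and 1, and the <stdlib.h>/<time.h> includes suggest rand()-based sampling; a sketch assuming a uniform draw (the distribution and the seeding location are assumptions):

#include <stdlib.h>

/* Uniform draw in [-1, 1]; assumes the RNG was seeded elsewhere, e.g. with srand(time(NULL)) */
double f_init_rand_norm(void)
{
    return (double)rand() / (double)RAND_MAX * 2.0 - 1.0;
}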