ANN
0.1.1.5
A library containing multiple neural network models written in C
Some useful functions. More...
Macros
  #define F_ACT_ELU_ALPHA 0.01
  #define F_COST_QUADRATIC_CONSTANT 1/2
Functions
  double f_init_rand_norm ()
      Weight and bias initialization function for the hidden and output layers.
  double f_init_input ()
      Weight and bias initialization function for the input layer.
  double f_act_sigmoid (double n)
      Sigmoid activation function (for the feedforward algorithm).
  double f_act_sigmoid_de (double n)
      Derivative of the sigmoid activation function (for the backpropagation algorithm).
  double f_act_input (double n)
      Activation function for the input layer (for the feedforward algorithm).
  double f_act_input_de (double n)
      Derivative of the activation function for the input layer (for the backpropagation algorithm).
  double f_act_relu (double n)
      ReLU activation function (for the feedforward algorithm).
  double f_act_relu_de (double n)
      Derivative of the ReLU activation function (for the backpropagation algorithm).
  double f_act_softplus (double n)
      SoftPlus activation function (for the feedforward algorithm).
  double f_act_softplus_de (double n)
      Derivative of the SoftPlus activation function (for the backpropagation algorithm).
  double f_act_elu (double n)
      ELU activation function (for the feedforward algorithm).
  double f_act_elu_de (double n)
      Derivative of the ELU activation function (for the backpropagation algorithm).
  double f_act_swish (double n)
      Swish activation function (for the feedforward algorithm).
  double f_act_swish_de (double n)
      Derivative of the Swish activation function (for the backpropagation algorithm).
  double f_cost_quadratic_loss (double o, double t)
      Quadratic cost function.
  double f_cost_quadratic_loss_de (double o, double t)
      Derivative of the quadratic cost function (for the backpropagation algorithm).
Detailed Description

Some useful functions, such as activation functions, cost functions, and weight/bias initialization functions.
Definition in file tools.h.
Macro Definition Documentation

#define F_ACT_ELU_ALPHA 0.01
#define F_COST_QUADRATIC_CONSTANT 1/2
Function Documentation

double f_act_elu (double n)

double f_act_elu_de (double n)
double f_act_input (double n)

double f_act_input_de (double n)

Derivative of the activation function for the input layer (for the backpropagation algorithm).

Parameters:
    [in]  n  activation sum
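The documented parameter is the activation sum; what the input layer does with it is not shown here. A common choice is the identity activation, so the following is a plausible sketch under that assumption (the real bodies are in the library's source):

```c
/* Assumed: input layer passes its activation sum through unchanged */
double f_act_input(double n)
{
    return n;
}

/* Derivative of the identity function is the constant 1 */
double f_act_input_de(double n)
{
    (void)n; /* the derivative does not depend on n */
    return 1.0;
}
```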
double f_act_relu (double n)

double f_act_relu_de (double n)
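A sketch of the standard ReLU pair these signatures suggest (assumed bodies, matching the textbook definition rather than the library's source):

```c
/* ReLU: max(0, n) */
double f_act_relu(double n)
{
    return n > 0.0 ? n : 0.0;
}

/* Derivative of ReLU: 1 for positive n, 0 otherwise */
double f_act_relu_de(double n)
{
    return n > 0.0 ? 1.0 : 0.0;
}
```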
double f_act_sigmoid (double n)

double f_act_sigmoid_de (double n)
double f_act_softplus (double n)

double f_act_softplus_de (double n)
double f_act_swish (double n)

double f_act_swish_de (double n)
double f_cost_quadratic_loss (double o, double t)
double f_cost_quadratic_loss_de (double o, double t)
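A sketch of what a quadratic cost pair with these signatures conventionally computes, for output o and target t (assumed bodies). Note the sketch spells the cost constant as 1.0 / 2.0: the header's integer expression 1/2 would truncate to 0 under C integer division if used directly in integer context.

```c
/* C = 1/2 * (o - t)^2; the 1/2 makes the derivative come out clean */
double f_cost_quadratic_loss(double o, double t)
{
    return (1.0 / 2.0) * (o - t) * (o - t);
}

/* dC/do = o - t; the 1/2 cancels the factor of 2 from differentiation */
double f_cost_quadratic_loss_de(double o, double t)
{
    return o - t;
}
```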
double f_init_input ()