Matrices and vectors in MQL5: Activation functions

MetaTrader 5 examples | 7 November 2023, 09:43

Plotting the graphs

Pressing the ESC key terminates the script. The script itself is attached below. A similar script was written by the author of the article "Backpropagation Neural Networks using MQL5 Matrices". Following his approach, we plot the graph of each activation function's derivative alongside the graph of the activation function itself.

Exponential Linear Unit (ELU) activation function

```
if(x >= 0)
   f = x;
else
   f = alpha*(exp(x) - 1);
```

```
if(x >= 0)
   d = 1;
else
   d = alpha*exp(x);
```

```
vector_a.Activation(vector_c,AF_ELU);      // call with the default parameter
```

`   vector_a.Activation(vector_c,AF_ELU,3.0);  // call with alpha=3.0`

Exponential activation function

```
f = exp(x);
```

`vector_a.Activation(vector_c,AF_EXP);`

Gaussian Error Linear Unit (GELU) activation function

```
f = 0.5*x*(1 + tanh(sqrt(M_2_PI)*(x + 0.044715*pow(x,3))));
```

```
double x_3 = pow(x,3);
double tmp = cosh(0.0356774*x_3 + 0.797885*x);
d = 0.5*tanh(0.0356774*x_3 + 0.797885*x) + (0.0535161*x_3 + 0.398942*x)/(tmp*tmp) + 0.5;
```

`vector_a.Activation(vector_c,AF_GELU);`

Hard Sigmoid activation function

```
if(x < -2.5)
   f = 0;
else
{
   if(x > 2.5)
      f = 1;
   else
      f = 0.2*x + 0.5;
}
```

```
if(x < -2.5)
   d = 0;
else
{
   if(x > 2.5)
      d = 0;
   else
      d = 0.2;
}
```

`vector_a.Activation(vector_c,AF_HARD_SIGMOID);`

Linear activation function

`   f = alpha*x + beta`

`   d = alpha`

`vector_a.Activation(vector_c,AF_LINEAR);         // call with default parameters`

`vector_a.Activation(vector_c,AF_LINEAR,2.0,5.0);  // call with alpha=2.0 and beta=5.0`

Leaky Rectified Linear Unit (LReLU) activation function

```
if(x >= 0)
   f = x;
else
   f = alpha * x;
```

```
if(x >= 0)
   d = 1;
else
   d = alpha;
```

```
vector_a.Activation(vector_c,AF_LRELU);      // call with the default parameter
```

`vector_a.Activation(vector_c,AF_LRELU,0.1);  // call with alpha=0.1`

Rectified Linear Unit (ReLU) activation function

```
if(alpha==0)
{
   if(x > 0)
      f = x;
   else
      f = 0;
}
else
{
   if(x >= max_value)
      f = x;
   else
      f = alpha * (x - threshold);
}
```

```
if(alpha==0)
{
   if(x > 0)
      d = 1;
   else
      d = 0;
}
else
{
   if(x >= max_value)
      d = 1;
   else
      d = alpha;
}
```

`vector_a.Activation(vector_c,AF_RELU);         // call with default parameters`

`vector_a.Activation(vector_c,AF_RELU,2.0,0.5);      // call with alpha=2.0 and max_value=0.5`

Scaled Exponential Linear Unit (SELU) activation function

```
if(x >= 0)
   f = scale * x;
else
   f = scale * alpha * (exp(x) - 1);
```

where scale = 1.05070098 and alpha = 1.67326324.

```
if(x >= 0)
   d = scale;
else
   d = scale * alpha * exp(x);
```

`vector_a.Activation(vector_c,AF_SELU);`

Sigmoid activation function

`   f = 1 / (1 + exp(-x));`

`   d = exp(x) / pow(exp(x) + 1, 2);`

`vector_a.Activation(vector_c,AF_SIGMOID);`

Softplus activation function

`   f = log(exp(x) + 1);`

`   d = exp(x) / (exp(x) + 1);`

`vector_a.Activation(vector_c,AF_SOFTPLUS);`

Softsign activation function

`   f = x / (|x| + 1)`

`   d = 1 / (|x| + 1)^2`

`vector_a.Activation(vector_c,AF_SOFTSIGN);`

Swish activation function

`   f = x / (1 + exp(-x*beta));`

```
double tmp = exp(beta*x);
d = tmp*(beta*x + tmp + 1) / pow(tmp+1, 2);
```

`vector_a.Activation(vector_c,AF_SWISH);         // call with the default parameter`

`vector_a.Activation(vector_c,AF_SWISH,2.0);     // call with beta = 2.0`

`vector_a.Activation(vector_c,AF_SWISH,0.5);   // call with beta=0.5`

Hyperbolic Tangent (TanH) activation function

`   f = tanh(x);`

`   d = 1 / pow(cosh(x),2);`

`vector_a.Activation(vector_c,AF_TANH);`

Thresholded Rectified Linear Unit (TReLU) activation function

```
if(x > theta)
   f = x;
else
   f = 0;
```

```
if(x > theta)
   d = 1;
else
   d = 0;
```

`vector_a.Activation(vector_c,AF_TRELU);         // call with default parameter`

```
vector_a.Activation(vector_c,AF_TRELU,0.0);     // call with theta = 0.0
```

`vector_a.Activation(vector_c,AF_TRELU,2.0);      // call with theta = 2.0`

Parametric Rectified Linear Unit (PReLU) activation function

```
if(x[i] >= 0)
   f[i] = x[i];
else
   f[i] = alpha[i]*x[i];
```

```
if(x[i] >= 0)
   d[i] = 1;
else
   d[i] = alpha[i];
```

```
vector alpha=vector::Full(vector_a.Size(),0.1);
vector_a.Activation(vector_c,AF_PRELU,alpha);
```

The special Softmax function

```
sum_exp = Sum(exp(vector_a))
f = exp(x) / sum_exp
```

`   d = f*(1 - f)`

Conclusion
