basecls.layers.activations

basecls.layers.activations.activation(name, **kwargs)[source]

Helper for building an activation layer.

Parameters

name (Union[str, Callable]) – activation name; supports "elu", "gelu", "hsigmoid", "hswish", "leaky_relu", "relu", "relu6", "prelu", "silu", and "tanh".

Return type

Module

Returns

An activation module.
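
A minimal usage sketch, assuming basecls and MegEngine are installed; forwarding extra keyword arguments to the underlying module's constructor is an assumption based on the **kwargs in the signature:

    import megengine as mge
    from basecls.layers.activations import activation

    # Build an activation module by name, then apply it to a tensor.
    act = activation("relu")
    x = mge.tensor([-1.0, 0.0, 2.0])
    y = act(x)  # [0.0, 0.0, 2.0]

    # Presumably **kwargs reach the module constructor, e.g. alpha for ELU:
    elu = activation("elu", alpha=0.5)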

class basecls.layers.activations.ELU(alpha=1.0, name=None)[source]

Bases: Module

ELU activation function.

\[\begin{split}\text{ELU}(x) = \begin{cases} x, & \text{if } x > 0, \\ \alpha \left( \exp(x) - 1 \right), & \text{if } x \le 0 \end{cases}\end{split}\]
Parameters

alpha (float) – the \(\alpha\) value for the ELU formulation. Default: 1.0

forward(x)[source]
Return type

Tensor
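
A quick NumPy check of the piecewise definition above (a self-contained sketch; elu_ref is a hypothetical reference, not part of basecls):

    import numpy as np

    def elu_ref(x, alpha=1.0):
        # x                     if x > 0
        # alpha * (exp(x) - 1)  if x <= 0
        return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

    elu_ref(np.array([-1.0, 0.0, 2.0]))  # approx. [-0.632, 0.0, 2.0]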

class basecls.layers.activations.HSigmoid(name=None)[source]

Bases: Module

Hard sigmoid activation function.

\[\begin{split}\text{HSigmoid}(x) = \begin{cases} 0 & \text{if } x \le -3, \\ 1 & \text{if } x \ge 3, \\ x / 6 + 1 / 2 & \text{otherwise} \end{cases}\end{split}\]
forward(x)[source]
Return type

Tensor
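
The three cases above collapse into a single clip, which a NumPy sketch makes explicit (hsigmoid_ref is hypothetical, not part of basecls):

    import numpy as np

    def hsigmoid_ref(x):
        # x/6 + 1/2 clipped to [0, 1]; yields 0 for x <= -3 and 1 for x >= 3
        return np.clip(x / 6.0 + 0.5, 0.0, 1.0)

    hsigmoid_ref(np.array([-4.0, 0.0, 3.0]))  # [0.0, 0.5, 1.0]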

class basecls.layers.activations.HSwish(name=None)[source]

Bases: Module

Hard swish activation function.

\[\begin{split}\text{HSwish}(x) = \begin{cases} 0 & \text{if } x \le -3, \\ x & \text{if } x \ge 3, \\ x (x + 3) / 6 & \text{otherwise} \end{cases}\end{split}\]
forward(x)[source]
Return type

Tensor
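
The piecewise formula above is exactly x times the hard sigmoid, which the following NumPy sketch exploits (hswish_ref is hypothetical, not part of basecls):

    import numpy as np

    def hswish_ref(x):
        # HSwish(x) = x * HSigmoid(x) = x * clip(x/6 + 1/2, 0, 1)
        return x * np.clip(x / 6.0 + 0.5, 0.0, 1.0)

    hswish_ref(np.array([-4.0, 1.0, 4.0]))  # approx. [0.0, 0.667, 4.0]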

class basecls.layers.activations.ReLU6(name=None)[source]

Bases: Module

ReLU6 activation function.

\[\text{ReLU6}(x) = \min \left( \max(0, x), 6 \right)\]
forward(x)[source]
Return type

Tensor
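
The definition is a plain clamp, as a NumPy sketch shows (relu6_ref is hypothetical, not part of basecls):

    import numpy as np

    def relu6_ref(x):
        # min(max(0, x), 6): ReLU with its output capped at 6
        return np.minimum(np.maximum(x, 0.0), 6.0)

    relu6_ref(np.array([-2.0, 3.0, 8.0]))  # [0.0, 3.0, 6.0]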

class basecls.layers.activations.Tanh(name=None)[source]

Bases: Module

Tanh activation function.

\[\text{Tanh}(x) = \tanh(x) = \frac{\exp(x) - \exp(-x)}{\exp(x) + \exp(-x)}\]
forward(x)[source]
Return type

Tensor
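
Evaluating the exponential form above and comparing it against np.tanh confirms the two agree (tanh_ref is hypothetical, not part of basecls):

    import numpy as np

    def tanh_ref(x):
        # (exp(x) - exp(-x)) / (exp(x) + exp(-x))
        e_pos, e_neg = np.exp(x), np.exp(-x)
        return (e_pos - e_neg) / (e_pos + e_neg)

    np.allclose(tanh_ref(np.array([-1.0, 0.0, 1.0])), np.tanh([-1.0, 0.0, 1.0]))  # True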