A comprehensive overview and survey of activation functions (AFs) in neural networks for deep learning is presented, covering different classes of AFs such as Logistic Sigmoid and Tanh based, ReLU based, ELU based, and Learning based.
Authors
S. Dubey
S. Singh
B. Chaudhuri
This survey also summarizes the AFs with brief highlights and important discussions to depict their suitability for different types of data (refer to Table 6).
This survey is compared with existing surveys and performance analyses to show its importance (refer to Table 7).
This survey enriches the reader with state-of-the-art AFs, analyzed from various perspectives.
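To make the four AF classes named in the overview concrete, a minimal sketch follows showing one representative function from each class. PyTorch is assumed purely for illustration; the survey itself is framework-agnostic, and PReLU stands in here for the broader family of learning-based AFs it discusses.

```python
import torch
import torch.nn as nn

x = torch.linspace(-3.0, 3.0, steps=7)

# Logistic Sigmoid / Tanh based: squashing, saturating nonlinearities.
sigmoid_out = torch.sigmoid(x)   # 1 / (1 + exp(-x))
tanh_out = torch.tanh(x)         # (exp(x) - exp(-x)) / (exp(x) + exp(-x))

# ReLU based: piecewise linear, non-saturating for positive inputs.
relu_out = torch.relu(x)         # max(0, x)

# ELU based: smooth exponential saturation for negative inputs.
elu = nn.ELU(alpha=1.0)
elu_out = elu(x)                 # x if x > 0 else alpha * (exp(x) - 1)

# Learning based: the nonlinearity itself carries trainable parameters,
# e.g. PReLU learns the negative-side slope during training.
prelu = nn.PReLU(num_parameters=1, init=0.25)
prelu_out = prelu(x)             # x if x > 0 else a * x, with slope a learned
```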