A Performance Comparison of Super Resolution Model with Different Activation Functions


KIPS Transactions on Software and Data Engineering, Vol. 9, No. 10, pp. 303-308, Oct. 2020
https://doi.org/10.3745/KTSDE.2020.9.10.303
Keywords: Super Resolution, Performance Comparison, EDSR, Activation Function
Abstract

The ReLU (Rectified Linear Unit) function has been the dominant standard activation function in most deep artificial neural network models since it was proposed. Subsequently, the Leaky ReLU, Swish, and Mish activation functions were introduced as replacements for ReLU and showed improved performance over ReLU in image classification tasks. This motivated us to examine whether similar performance gains could be achieved by replacing ReLU with other activation functions in the super resolution task. In this paper, we compare performance by changing the activation function in the EDSR model, which has shown stable performance in super resolution. For ×2 upscaling, the original activation function, ReLU, performed similarly to or better than the other activation functions tested. For ×4 upscaling, Leaky ReLU and Swish showed slight improvements over ReLU: in terms of PSNR and SSIM, which quantitatively evaluate image quality, Leaky ReLU yielded average improvements of 0.06% and 0.05%, and Swish yielded 0.06% and 0.03%, respectively. For ×8 upscaling, Mish showed a slight average improvement over ReLU, with average PSNR and SSIM gains of 0.06% and 0.02%, respectively. In conclusion, Leaky ReLU and Swish outperformed ReLU for super resolution at ×4 upscaling, and Mish outperformed ReLU at ×8 upscaling. In future work, comparative experiments replacing the activation function with Leaky ReLU, Swish, and Mish in other super resolution models are needed to determine whether similar improvements can be obtained.
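To make the comparison concrete, the sketch below shows the four activation functions discussed in the abstract and an EDSR-style residual block in which the activation is pluggable, together with a simple PSNR helper. This is a minimal illustrative sketch in PyTorch, not the authors' implementation; the channel width, kernel size, and 0.1 residual scaling are assumptions consistent with the commonly published EDSR design.

```python
# Minimal sketch (not the authors' code): the four activation functions
# compared in the paper and an EDSR-style residual block with a pluggable
# activation. Hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


def swish(x):
    # Swish: x * sigmoid(x)
    return x * torch.sigmoid(x)


def mish(x):
    # Mish: x * tanh(softplus(x))
    return x * torch.tanh(F.softplus(x))


ACTIVATIONS = {
    "relu": F.relu,
    "leaky_relu": lambda x: F.leaky_relu(x, negative_slope=0.01),
    "swish": swish,
    "mish": mish,
}


class ResidualBlock(nn.Module):
    """EDSR-style residual block: conv -> activation -> conv, no batch norm."""

    def __init__(self, channels=64, activation="relu", res_scale=0.1):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.act = ACTIVATIONS[activation]
        self.res_scale = res_scale

    def forward(self, x):
        out = self.conv2(self.act(self.conv1(x)))
        return x + out * self.res_scale


def psnr(sr, hr, max_val=1.0):
    # Peak signal-to-noise ratio: 10 * log10(MAX^2 / MSE).
    mse = F.mse_loss(sr, hr)
    return 10 * torch.log10(max_val ** 2 / mse)


if __name__ == "__main__":
    x = torch.randn(1, 64, 48, 48)
    for name in ACTIVATIONS:
        block = ResidualBlock(activation=name)
        print(name, block(x).shape)
```

In an experiment of this kind, the rest of the EDSR architecture and the training setup would be held fixed while only the `activation` keyword is varied, so that any PSNR/SSIM difference can be attributed to the activation function.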



Cite this article
[IEEE Style]
Y. Yoo, D. Kim, J. Lee, "A Performance Comparison of Super Resolution Model with Different Activation Functions," KIPS Transactions on Software and Data Engineering, vol. 9, no. 10, pp. 303-308, 2020. DOI: https://doi.org/10.3745/KTSDE.2020.9.10.303.

[ACM Style]
Youngjun Yoo, Daehee Kim, and Jaekoo Lee. 2020. A Performance Comparison of Super Resolution Model with Different Activation Functions. KIPS Transactions on Software and Data Engineering, 9, 10, (2020), 303-308. DOI: https://doi.org/10.3745/KTSDE.2020.9.10.303.