Parametric Activation Functions for Improved Identity Verification in IoT Devices: A Deep Learning Approach
Subject Areas: International Journal of Mathematical Modelling & Computations
Saeed Sarabadan 1*, Seyed Mohammad Ali Mousavi 2
1 - Department of Mathematics, Imam Hossein Comprehensive University, Tehran, Iran
2 - Department of Mathematics, Imam Hossein University, Tehran, Iran
Keywords: Deep neural network, Machine learning, Activation functions, Network performance improvement, Internet of Things, Radio frequency fingerprint
Abstract:
One of the most important decisions in designing the architecture of a deep neural network is the choice of suitable activation functions for the hidden layers. The activation function plays a central role in the backpropagation algorithm, since the gradient of the cost function is computed from the activation function's output. In this paper, we model a deep neural network for an applied problem in the Internet of Things (IoT): identifying unauthorized devices and preventing them from joining an IoT network, using experimental data recorded in a smart home. The method relies on the radio frequency fingerprint of a radio device connected to the IoT network. The dataset consists of 8000 samples from 15 test series, collected with a HackRF One radio receiver in the smart home across 4 different connected devices. Finally, we evaluated the performance of different activation functions in the hidden layers of the network, and the most effective activation function was then selected for the final network. The Python code for the network architecture is available on GitHub: https://github.com/SaeidSarabadan/RF_with_-ANN.
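To make the setup concrete, the following is a minimal sketch of such a classifier in Keras. It is not the authors' released code (see the GitHub link above): the input width, layer sizes, optimizer, and the synthetic stand-in data are all illustrative assumptions, and a trainable parametric activation (PReLU) stands in for the parametric activation functions studied in the paper.

```python
# Minimal sketch (illustrative assumptions, not the authors' released code):
# a small feedforward classifier with a trainable parametric activation
# (PReLU) in the hidden layers, distinguishing 4 IoT devices by
# RF-fingerprint feature vectors.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

NUM_FEATURES = 128  # assumed width of one RF-fingerprint feature vector
NUM_DEVICES = 4     # the study classifies 4 connected devices

model = tf.keras.Sequential([
    layers.Input(shape=(NUM_FEATURES,)),
    layers.Dense(64),
    layers.PReLU(),  # parametric activation: the negative slope is learned
    layers.Dense(32),
    layers.PReLU(),
    layers.Dense(NUM_DEVICES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Random placeholder data standing in for the 8000-sample smart-home dataset.
X = np.random.randn(8000, NUM_FEATURES).astype("float32")
y = np.random.randint(0, NUM_DEVICES, size=8000)
model.fit(X, y, epochs=5, batch_size=64, validation_split=0.2)
```

Swapping `layers.PReLU()` for other hidden-layer activations (ReLU, ELU, Swish, and so on) and comparing validation accuracy is the kind of experiment the abstract describes.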