Information and Communication Technology
Browsing Information and Communication Technology by Subject "Activation functions"
Item The influence of non-learnable activation functions on the positioning performance of deep learning-based fingerprinting models trained with small CSI sample sizes (Springer, 2022) Lutakamale, Albert Selebea; Manyesela, Yona Zakaria
Activation functions, the mathematical 'gates' between the input feeding the current neuron and its output passed to the next layer, are crucial in the training of deep learning models. They largely determine a model's output, accuracy, and computational efficiency, and in some cases they strongly affect whether a model converges and how quickly it does so. Choosing appropriate activation functions is therefore essential for training deep learning-based fingerprint positioning models with small CSI sample sizes while still achieving satisfactory positioning results. In this paper, we explore several non-learnable activation functions and conduct a comprehensive analysis of their influence on the positioning performance of deep learning fingerprint-based positioning models trained with small CSI sample sizes. We then propose an improved model-training approach with a view to getting the best out of those activation functions.
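To illustrate what the abstract means by non-learnable activation functions, here is a minimal Python sketch (not taken from the paper) of three fixed, parameter-free activations commonly compared in such studies; the function names and sample inputs are purely illustrative.

```python
import math

# Non-learnable activations: fixed functions with no trainable
# parameters, applied element-wise between network layers.

def relu(x):
    """ReLU: passes positive inputs through, gates negatives to zero."""
    return max(0.0, x)

def sigmoid(x):
    """Sigmoid: squashes any real input into the interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    """Tanh: squashes any real input into the interval (-1, 1)."""
    return math.tanh(x)

# Illustrative inputs showing how each 'gate' shapes a neuron's output.
for f in (relu, sigmoid, tanh):
    print(f.__name__, [round(f(v), 4) for v in (-2.0, 0.0, 2.0)])
```

Because these functions have no parameters of their own, only their shape (saturation, zero-gating, output range) influences gradient flow, which is why the choice among them can matter so much when training data is scarce.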