• Article index: Activation function

      • Open access article

        1 - Option pricing with artificial neural network in a time dependent market
        Mehran Araghi, Elham Dastranj, Abdolmajid Abdolbaghi Ataabadi, Hossein Sahebi Fard
        In this article, the pricing of option contracts is discussed using the Mikhailov and Nogel model and the artificial neural network method. The purpose of this research is to investigate and compare the performance of various activation functions available in artificial neural networks for pricing option contracts. The Mikhailov and Nogel model is a time-dependent model. In the design of the artificial neural network required for this research, the parameters of the Mikhailov and Nogel model were used as network inputs, together with 700 daily stock-option prices from the Tehran Stock Exchange market (in 2021) as the network output. The first 600 data points are used for training and the remaining data for comparison and evaluation. Pricing is first carried out with 4 commonly used activation functions, and the results of each are then compared with the real prices on the Tehran Stock Exchange to determine which provides the more accurate forecast. The results of this research show that, among the activation functions considered, the ReLU activation function performs better than the others.
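The abstract names only ReLU among the four activation functions compared; as a minimal sketch, the snippet below defines ReLU alongside three other common choices (sigmoid, tanh, softplus, which are assumptions here, not necessarily the paper's four) and evaluates them on a small grid:

```python
import numpy as np

# Four commonly used activation functions. Only ReLU is named in the
# abstract; the other three are typical choices, not the paper's list.
def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    return np.tanh(x)

def softplus(x):
    # log(1 + e^x), a smooth approximation of ReLU
    return np.log1p(np.exp(x))

x = np.linspace(-2.0, 2.0, 5)
for f in (relu, sigmoid, tanh, softplus):
    print(f.__name__, np.round(f(x), 3))
```

Swapping the activation in the hidden layer of an otherwise fixed network, then comparing predicted option prices against market prices, is the kind of head-to-head comparison the abstract describes.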
      • Open access article

        2 - ‎Fuzzy Ordinary and Fractional General Sigmoid Function Activated‎ ‎Neural Network Approximation
        George Anastassiou
        Here we study the univariate fuzzy ordinary and fractional quantitative approximation of fuzzy real-valued functions on a compact interval by quasi-interpolation fuzzy neural network operators based on a general sigmoid activation function. These approximations are derived by establishing fuzzy Jackson-type inequalities involving the fuzzy moduli of continuity of the function, or of the right and left Caputo fuzzy fractional derivatives of the function involved. The approximations are fuzzy pointwise and fuzzy uniform. The related feed-forward fuzzy neural networks have one hidden layer. We study in particular the fuzzy integer-derivative and merely fuzzy continuous cases. Our fuzzy fractional approximation result using higher-order fuzzy differentiation converges better than in the merely fuzzy continuous case.
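The crisp (non-fuzzy) version of such an operator can be sketched as follows. A bell-shaped density is built from a sigmoid, and the quasi-interpolation operator is a weighted average of samples of f; this particular construction (the density `phi` and the normalized sum) is an assumption for illustration, not necessarily the paper's exact operator:

```python
import numpy as np

def sigma(x):
    # logistic sigmoid
    return 1.0 / (1.0 + np.exp(-x))

def phi(x):
    # bell-shaped density derived from the sigmoid; an illustrative
    # construction, not claimed to be the paper's specific choice
    return 0.5 * (sigma(x + 1.0) - sigma(x - 1.0))

def quasi_interpolant(f, x, n):
    # One-hidden-layer quasi-interpolation operator on [0, 1]:
    #   G_n(f)(x) = sum_k f(k/n) phi(n*x - k) / sum_k phi(n*x - k)
    k = np.arange(0, n + 1)
    w = phi(n * x - k)
    return np.dot(f(k / n), w) / w.sum()

# The approximation error shrinks as n grows, in line with
# Jackson-type bounds via the modulus of continuity of f.
approx = quasi_interpolant(np.sin, 0.5, 50)
```

In the fuzzy setting the samples f(k/n) are fuzzy numbers and the error bounds involve fuzzy moduli of continuity, but the operator shape is analogous.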
      • Open access article

        3 - MCSM-DEEP: A Multi-Class Soft-Max Deep Learning Classifier for Image Recognition
        Aref Safari
        Convolutional neural networks show outstanding performance in many image-processing problems, such as image recognition, object detection, and image segmentation. Semantic segmentation is a very challenging task that requires recognizing and understanding what is in the image at the pixel level. The goal of this research is to build on the known mathematical properties of the soft-max function and demonstrate how they can be exploited to establish convergence of the learning algorithm in a simple supervised image-recognition application. To this end, we use results from convex analysis, combined with a hierarchical architecture, to derive additional properties of the soft-max function not yet covered in the existing literature on multi-class classification problems. The proposed MCSM-DEEP model achieves an average accuracy of 90.25% across different layer settings, with a 95% confidence interval under the best initial settings of the deep convolutional layers, applied to the MNIST dataset. The results show that the regularized networks not only provide better segmentation results, owing to the regularization effect, than the original ones, but also exhibit a certain robustness to noise.
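The soft-max function at the heart of such a multi-class classifier maps a vector of logits to a probability distribution. A minimal sketch (the max-subtraction trick is the standard numerically stable form, not a detail taken from this abstract):

```python
import numpy as np

def softmax(z):
    # Subtracting the max leaves the result unchanged (softmax is
    # shift-invariant) but prevents overflow in exp for large logits.
    e = np.exp(z - np.max(z, axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

logits = np.array([2.0, 1.0, 0.1])
p = softmax(logits)
# p is a probability vector: entries are positive and sum to 1,
# and the largest logit receives the largest probability.
```

Shift-invariance and the convexity of the log-sum-exp normalizer are exactly the kind of properties from convex analysis that convergence arguments for soft-max classifiers lean on.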