Adv. Math. Fin. App., 2024, 9(2), P. 723-736
Advances in Mathematical Finance & Applications, www.amfa.iau-arak.ac.ir
Print ISSN: 2538-5569  Online ISSN: 2645-4610
Original Research
Option Pricing with Artificial Neural Network in a Time Dependent Market
Mehran Araghi a, Elham Dastranj b,*, Abdolmajid Abdolbaghi Ata'abadi c, Hossein Sahebi Fard d
a,b,d Department of Mathematics, Faculty of Mathematical Sciences, Shahrood University of Technology, Shahrood, Semnan, Iran
c Department of Management, Faculty of Industrial Engineering and Management, Shahrood University of Technology, Shahrood, Semnan, Iran
Article Info Article history: Received 2023-03-09 Accepted 2024-02-23
Keywords: Option Pricing, Mikhailov and Nogel Model, Artificial Neural Network, Activation Function
Abstract
In this article, the pricing of option contracts is discussed using the Mikhailov and Nogel model and the artificial neural network method. The purpose of this research is to investigate and compare the performance of various activation functions used in artificial neural networks for the pricing of option contracts. The Mikhailov and Nogel model is a time-dependent extension of the Heston model. In the design of the artificial neural network required for this research, the parameters of the Mikhailov and Nogel model are used as network inputs, and 700 daily stock option prices from the Tehran Stock Exchange (in 2021) as the network output. The first 600 data points are used for training and the remaining data for comparison and evaluation. Pricing is first carried out with 4 commonly used activation functions, and the results of each are then compared with the real prices of the Tehran Stock Exchange to determine which one provides a more accurate forecast. The results of this research show that, among the activation functions considered, the ReLU activation function performs better than the others.
1 Introduction
In financial mathematics, option pricing is one of the most important topics. An option is a contract between the holder and the writer of the option, under which the holder buys from the writer the right to buy or sell a certain asset at a certain time and at a certain price. In an option contract, each party grants the other something: the holder pays the writer an amount called the premium, which is in fact the option price, and the writer grants the holder the right to buy or sell the underlying asset at a certain time and at a certain price. That is, for the contract to be fair, the holder must pay the option price to the writer at the time of signing the contract. The problem of option pricing in the trading of options is therefore investigated under different models [6].
Fischer Black, Myron Scholes, and Robert Merton initiated a major leap forward in the pricing of option contracts in the early 1970s. The result of their work was the Black-Scholes model, which had a significant impact on option pricing and risk hedging. The Black-Scholes model played a pivotal role in the success of financial engineering in the 1980s and 1990s and boosted the derivatives pricing market, but it lost some of its effectiveness in October 1987, when the European and American stock markets crashed. In this model, volatility is estimated from past changes in the price of the underlying asset, even though the goal is to measure future volatility [8]. To address these criticisms, newer models such as the Heston model treat volatility as a stochastic process, so that option contracts can be priced more accurately. Heston (1993) introduced his model to analyze currency option contracts; it is highly scalable and applicable to more complex situations. Another strength of the Heston model is its mean-reverting volatility process, which reflects the mean-reversion property of volatility in financial markets. Under this feature, the return of an asset always moves towards the average market return, or more precisely, towards the long-term average return of the same asset; returns lower or higher than the average eventually revert to it. In this model, stock price volatility is not constant and is driven by a stochastic process. Christoffersen (2009) added another stochastic process to the Heston model and improved it: the double Heston model has two stochastic volatility processes and, compared to the standard Heston model, offers more flexibility for extreme prices. By making the parameters of the model time-dependent, Mikhailov and Nogel improved the Heston model so that short-term option prices are closer to the real market.
The neural network researchers McCulloch and Pitts (1943) studied the internal communication ability of a neuron model; the result of their work was a computational model based on a simple pseudo-neuronal element. At that time, other scientists such as Donald Hebb were also working on adaptation rules in neuronal systems, and Hebb (1949) proposed a learning rule for adapting the connections between artificial neurons. Werbos (1974) described the back-propagation algorithm in his doctoral dissertation, and Rumelhart, Hinton, and Williams (1986) rediscovered and popularized the technique. Nowadays, artificial neural networks are widely studied with the aim of achieving human-like efficiency. These networks are composed of linear and nonlinear computing elements operating in parallel. Among the most important features of artificial neural networks are the abilities to learn, to generalize, and to process in parallel, so they can learn complex tasks that are difficult for rule-based systems. Artificial neural networks are inspired by the human brain, in which billions of neurons carry out learning and allow humans to predict future phenomena from past experience; an artificial neural network is a method for simulating such neurons. To investigate the pricing of options using the Mikhailov and Nogel model and artificial neural networks, articles close to this subject were studied, covering topics such as artificial neural networks, option pricing, the Heston and Mikhailov-Nogel models, or combinations of them. In recent years, several financial studies have addressed option pricing. In [2], hypothetical options are priced under the fractional Heston model in Iran's gold market. In [3], European options with transaction costs are priced under some Black-Scholes markets with gold as the underlying asset.
In [13], European option pricing is derived when a zero-coupon bond is the underlying asset in a market under the Hull-White model. In [15], an optimized artificial neural network is trained on a dataset created by a complex financial model and tested against the Black-Scholes model, the Heston model, and Brent's iterative root-finding method, showing that artificial neural networks can significantly decrease computation time. Combining artificial neural networks with the Black-Scholes and Heston models, [7] concludes that artificial neural networks can be considered an efficient alternative to existing quantitative models for option pricing. The results in [11] indicate that Bitcoin options are systematically overpriced by classical methods, while neural network models significantly improve price prediction. In the present research, a number of daily option prices were extracted from the Tehran stock market, and an artificial neural network was designed using the parameters of the Mikhailov and Nogel model as inputs. After the initial training of the network, the extracted options were priced using 4 activation functions, and their results were compared with the actual values to determine which activation function performs best. This paper consists of 6 sections. In Section 2, the European call option is presented; in Section 3, the Mikhailov and Nogel model, as a time-dependent Heston model, and its parameter estimation are presented. In Section 4, the artificial neural network and the considered activation functions are discussed. The results of option pricing with the neural network in a real market, together with an error analysis, are examined in Section 5. The paper is concluded in Section 6.
2 European call option
Option contracts are usually divided into two categories: call options and put options. The price specified in the contract for buying or selling in the future is called the strike price ($E$), and the time set for the execution of the contract is called the maturity date ($T$). The value of each call option contract on an underlying asset with price $S$ at maturity is [6]
$$C = \max(S - E, 0).$$
The value of each put option contract is obtained from
$$P = \max(E - S, 0).$$
Options are also divided into two types, European and American, in terms of exercise time: a European option can only be exercised at maturity, while an American option can also be exercised before the maturity date [1]. In this research, European options are used for pricing.
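The call and put payoff rules above can be sketched directly, using the paper's notation ($S$ for the underlying price, $E$ for the strike):

```python
# Payoff of European call and put options at maturity.

def call_payoff(S: float, E: float) -> float:
    """Value of a call option at maturity: max(S - E, 0)."""
    return max(S - E, 0.0)

def put_payoff(S: float, E: float) -> float:
    """Value of a put option at maturity: max(E - S, 0)."""
    return max(E - S, 0.0)

print(call_payoff(120, 100))  # in the money: 20.0
print(put_payoff(120, 100))   # out of the money: 0.0
```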
3 Mikhailov and Nogel model
In the Heston model, the price process of the underlying asset, namely $S_t$, at moment $t$ is
$$dS_t = r S_t\,dt + \sqrt{v_t}\, S_t\, dW_t^1.$$
For the volatility in this model, another stochastic differential equation is considered:
$$dv_t = \kappa(\theta - v_t)\,dt + \sigma \sqrt{v_t}\, dW_t^2,$$
where $S_t$ is the underlying asset price process, $v_t$ is the volatility process of the underlying asset price, and $W_t^1$ and $W_t^2$ are Brownian motions with correlation coefficient $\rho$. Here $\theta$ is the long-term mean, $\kappa$ is the rate of reversion to the mean, and $\sigma$ is the volatility of the volatility process. The Mikhailov and Nogel model is the Heston model with time-dependent parameters and is expressed as follows [9]:
$$dS_t = r S_t\,dt + \sqrt{v_t}\, S_t\, dW_t^1,$$
$$dv_t = \kappa(t)\big(\theta(t) - v_t\big)\,dt + \sigma(t) \sqrt{v_t}\, dW_t^2,$$
where $S_t$ is the underlying asset price process, $v_t$ is its volatility process, $r$ is the risk-free interest rate, $\kappa(t)$ is the rate of reversion to the mean, $\theta(t)$ is the long-term mean of the volatility process, and $\sigma(t)$ is the volatility of the volatility process. Also, $W_t^1$ and $W_t^2$ are two Brownian motions with correlation coefficient $\rho(t)$, that is,
$$dW_t^1\, dW_t^2 = \rho(t)\,dt.$$
The partial differential equation related to the Mikhailov and Nogel model for an option value $U(S, v, t)$ is
$$\frac{\partial U}{\partial t} + \frac{1}{2} v S^2 \frac{\partial^2 U}{\partial S^2} + \rho(t)\sigma(t) v S \frac{\partial^2 U}{\partial S \partial v} + \frac{1}{2}\sigma(t)^2 v \frac{\partial^2 U}{\partial v^2} + r S \frac{\partial U}{\partial S} + \kappa(t)\big(\theta(t) - v\big)\frac{\partial U}{\partial v} - r U = 0.$$
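A path of the time-dependent Heston dynamics can be simulated with a simple Euler-Maruyama scheme. This is a minimal sketch: the parameter curves `kappa`, `theta`, `sigma`, and `rho` below are illustrative assumptions, not calibrated values from the paper.

```python
import numpy as np

# Euler-Maruyama sketch of the Mikhailov-Nogel (time-dependent Heston) dynamics:
#   dS_t = r S_t dt + sqrt(v_t) S_t dW^1_t
#   dv_t = kappa(t) (theta(t) - v_t) dt + sigma(t) sqrt(v_t) dW^2_t
# with corr(dW^1, dW^2) = rho(t).

def simulate(S0=100.0, v0=0.04, r=0.03, T=1.0, n=250, seed=0):
    kappa = lambda t: 2.0 + t      # assumed time-dependent parameters
    theta = lambda t: 0.04
    sigma = lambda t: 0.3
    rho   = lambda t: -0.5
    rng = np.random.default_rng(seed)
    dt = T / n
    S, v = S0, v0
    for i in range(n):
        t = i * dt
        z1 = rng.standard_normal()
        # correlate the second Brownian increment with the first
        z2 = rho(t) * z1 + np.sqrt(1 - rho(t) ** 2) * rng.standard_normal()
        S += r * S * dt + np.sqrt(max(v, 0.0)) * S * np.sqrt(dt) * z1
        v += kappa(t) * (theta(t) - v) * dt + sigma(t) * np.sqrt(max(v, 0.0)) * np.sqrt(dt) * z2
    return S, v

S_T, v_T = simulate()
print(S_T, v_T)
```

Averaging the discounted payoff over many such paths gives a Monte Carlo price against which the network's output could be checked.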
3.1 Parameter estimation
At first, the following points should be taken into account. The initial value of the volatility process, $v_0$, is always positive; the rate of reversion to the mean $\kappa$, the long-term mean $\theta$, and the volatility of the volatility process $\sigma$ all have positive values; and Feller's condition, expressed as $2\kappa\theta \geq \sigma^2$, must hold [12].
Theorem. The MLE estimators for the parameters $\kappa$, $\theta$, and $\sigma$ in each time step are obtained in closed form from the discretely observed volatility process; the explicit expressions are given in [12].
4 Artificial neural network
An artificial neural network is built as a method to simulate neurons, or a network of neurons, in the brain. It is generally made of three main parts: an input layer, hidden layers, and an output layer. In the artificial neural network system, the processing elements are called artificial neurons or nodes. In this model, the flow of information from input to output units is one-way: data passes through multiple nodes without any feedback of information.
Fig. 1: Simple simulation of an artificial neural network [4]
First, the primary signals or data enter the input layer; they are then multiplied by weights and passed to the nodes of the next layer, where the weighted signals arriving at each node are added together and weighted again, and so on until the final output. In this process, the system seeks the optimal weights so that the output of the system is as close to the expected output as possible. This operation is performed over a large number of training data to obtain the final optimal weights. The components of the artificial neural network are: the input, hidden, and output layers, weights, biases, the activation function, and the cost function.
Fig. 2: Mathematical scheme of a neuron [5]
In the artificial neural network of Figure 2, the value of the output layer is calculated from
$$y = \varphi\left(\sum_{i=1}^{n} w_i x_i + b\right),$$
where $x_i$ are the inputs, $w_i$ the corresponding weights, $b$ the bias, and $\varphi$ the activation function. In this system, the bias is an additional parameter of the artificial neural network: it is added to the weighted sum of the inputs and helps the model shift the activation function towards the positive or negative side. The role of the bias is similar to that of the constant term in a linear function [4].
4.1 Feed forward
In the feed-forward propagation stage, the information flow is forward: data enters the network through the input layer, calculations are applied to it using the activation functions in the hidden layers, and the output of each layer is passed to the next layer until the output of the network is determined in the last layer. In this stage, the activation function acts as a gate that sends the inputs of each layer on to the next layer [4].
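The feed-forward pass described above can be sketched in a few lines. The 9-input/1-output layout mirrors the paper's setup (9 model parameters in, one price out); the weights here are random placeholders, not trained values, and the 30-node hidden layer is an assumption.

```python
import numpy as np

# Minimal feed-forward pass: each layer computes activation(W x + b)
# and passes the result on to the next layer.

def relu(x):
    return np.maximum(0.0, x)

def forward(x, layers):
    """layers: list of (W, b, activation) tuples applied in order."""
    for W, b, act in layers:
        x = act(W @ x + b)
    return x

rng = np.random.default_rng(1)
layers = [
    (rng.standard_normal((30, 9)) * 0.1, np.zeros(30), relu),        # hidden layer
    (rng.standard_normal((1, 30)) * 0.1, np.zeros(1), lambda z: z),  # linear output
]
price = forward(rng.standard_normal(9), layers)
print(price.shape)  # (1,)
```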
4.2 Back propagation
The purpose of the back-propagation step is to reduce the value of the cost function by adjusting the weights and biases of the network. The gradients of the cost function determine the size of these adjustments, depending on the activation function, the weights, the biases, and the other relevant quantities.
Fig. 3: Computation method of an artificial neural network [4]
4.3 Loss function
The task of the cost function, as an important part of the artificial neural network system, is to measure the error of the prediction against the actual value and to correct the existing weights. The cost function for artificial neural networks is defined as follows:
$$J(\Theta) = -\frac{1}{m}\sum_{i=1}^{m}\sum_{t=1}^{K}\left[ y_t^{(i)} \log\big(h_\Theta(x^{(i)})_t\big) + \big(1 - y_t^{(i)}\big)\log\big(1 - h_\Theta(x^{(i)})_t\big)\right] + \frac{\lambda}{2m}\sum_{l=1}^{L-1}\sum_{i=1}^{s_l}\sum_{j=1}^{s_{l+1}} \big(\Theta_{ji}^{(l)}\big)^2,$$
where $L$ is the total number of layers in the neural network, $s_l$ is the number of nodes in the $l$th layer (not including the bias node), $K$ is the number of output classes, $\Theta^{(l)}$ is the weight matrix of the $l$th layer, $y_t^{(i)}$ is the actual value of the $t$th node in the output layer, and $h_\Theta(x^{(i)})_t$ is the prediction of the artificial neural network at the $t$th node of the output layer. If we consider a simple classification with a single class ($K = 1$) and ignore regularization, the cost function reduces to
$$J(\Theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[ y^{(i)} \log\big(h_\Theta(x^{(i)})\big) + \big(1 - y^{(i)}\big)\log\big(1 - h_\Theta(x^{(i)})\big)\right].$$
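The simplified single-class cost (no regularization) is the binary cross-entropy averaged over the $m$ training examples. A minimal sketch:

```python
import numpy as np

# Binary cross-entropy cost over m examples; predictions are clipped
# away from 0 and 1 to avoid log(0).

def cross_entropy_cost(y_true, y_pred, eps=1e-12):
    y_pred = np.clip(y_pred, eps, 1 - eps)
    m = len(y_true)
    return -np.sum(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred)) / m

y_true = np.array([1.0, 0.0, 1.0])
print(cross_entropy_cost(y_true, np.array([0.9, 0.1, 0.8])))  # accurate predictions: small cost
print(cross_entropy_cost(y_true, np.array([0.1, 0.9, 0.2])))  # poor predictions: larger cost
```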
4.4 Activation function
The function placed on a neuron to act on the neuron's inputs and produce the desired output is called the activation function of that neuron. Activation functions are a main component of artificial neural networks: without them, an artificial neural network behaves like a linear function and is of no use for complex models, whereas artificial neural networks are meant to solve problems that are nonlinear and complex in nature. The activation function, also called the transfer function, is therefore necessary and unavoidable. The values obtained by multiplying the weights with the input features are added together, and the resulting value is passed to the activation function. The activation function makes the linear combination of inputs nonlinear and maps the input values into an interval determined by the type of activation function. In other words, the activation function decides which neurons are active and which are inactive. There are many activation functions for neural networks; some of them are discussed here [14].
(a) Sigmoid (b) Hyperbolic Tangent
(c) ReLU (d) SELU
Fig. 4: Activation functions
4.4.1 Sigmoid
This nonlinear activation function transforms its input to a value between 0 and 1. The larger the input value, the closer the output of this function is to 1, while for very small (large negative) input values, the output of the sigmoid function approaches zero. The sigmoid function is monotonic, but its derivative is not. It is one of the most widely used nonlinear activation functions. Its mathematical form is
$$f(x) = \frac{1}{1 + e^{-x}}.$$
4.4.2 Hyperbolic Tangent
This function is very similar to the sigmoid activation function, and its curve is s-shaped. The only difference from the sigmoid function is its output range: it maps its input to the interval between -1 and 1. Its mathematical form is
$$f(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}.$$
4.4.3 ReLU (Rectified Linear Unit)
The ReLU function is very popular in the field of deep learning and is used most of the time. Although it looks like a linear function, it has a derivative (everywhere except at zero) and can be used in the back-propagation step. This function does not activate all nodes at the same time: a node is inactive when the input value of this function is less than zero. Its mathematical form is
$$f(x) = \max(0, x).$$
4.4.4 SELU (Scaled Exponential Linear Unit)
The SELU function performs network self-normalization; in other words, it preserves the mean and variance of each layer. It uses both positive and negative input values to shift the mean, whereas negative values are discarded by the ReLU function, and the gradient is used to adjust the variance. Its mathematical form is
$$f(x) = \lambda \begin{cases} x, & x > 0, \\ \alpha\,(e^{x} - 1), & x \leq 0, \end{cases}$$
where $\lambda \approx 1.0507$ and $\alpha \approx 1.6733$ are the standard self-normalizing constants.
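The four activation functions compared in this paper can be sketched together; the SELU constants $\lambda \approx 1.0507$ and $\alpha \approx 1.6733$ are the standard self-normalizing values.

```python
import numpy as np

# The four activation functions compared in the paper.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    return np.tanh(x)

def relu(x):
    return np.maximum(0.0, x)

def selu(x, lam=1.0507, alpha=1.6733):
    # lam * x for positive inputs, scaled exponential for negative ones
    return np.where(x > 0, lam * x, lam * alpha * (np.exp(x) - 1.0))

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))  # values in (0, 1)
print(relu(x))     # [0. 0. 2.]
```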
In an artificial neural network, the more hidden layers and nodes, the more refined the resulting weights and the more accurate the prediction; but if their number is too high, the generalization power of the network decreases. There is no exact formula for the number of hidden layers and the number of nodes in each layer, and they are found by trial and error, but the following rule of thumb is often used for the number of hidden-layer nodes:
$$N_h = \frac{N_s}{\alpha\,(N_i + N_o)},$$
where $N_h$ is the number of hidden-layer neurons, $N_s$ is the number of samples in the training data, $N_i$ and $N_o$ are the numbers of input and output neurons respectively, and $\alpha$ is a scaling factor.
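The node-count rule of thumb can be applied to this paper's setup (600 training samples, 9 model-parameter inputs, 1 price output); the scaling factor $\alpha$, typically between 2 and 10, is chosen by the modeller.

```python
# Rule-of-thumb hidden-node count: N_h = N_s / (alpha * (N_i + N_o)).

def hidden_nodes(n_samples: int, n_in: int, n_out: int, alpha: float) -> int:
    return round(n_samples / (alpha * (n_in + n_out)))

for alpha in (2, 5, 10):
    print(alpha, hidden_nodes(600, 9, 1, alpha))
# alpha = 2 -> 30 nodes, alpha = 5 -> 12, alpha = 10 -> 6
```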
5 Approximate Solutions in Real Market
In this research, 700 real stock option prices from the Tehran stock market were first prepared, and the first 600 of them were used to train the artificial neural network. These data are placed in the output layer, and the parameters of the Mikhailov and Nogel model are placed in the input layer. Thus, for each output datum, 9 parameters of the Mikhailov and Nogel model must be calculated to complete one training step. The calculated parameters of the Mikhailov and Nogel model are shown in Table 1.
Table 1: Parameters of Model [12]
Parameter | Definition
 | time to maturity 0.004
$E$ | strike price
$S$ | daily final price
 | time to maturity 0.0008
$v$ | volatility process of the underlying asset price
$\rho$ | correlation between $W^1$ and $W^2$
Table 2: Comparison of activation functions [Researcher's findings]
Real price | ReLU | Error | SELU | Error | Tanh | Error | Sigmoid | Error
274 | 275.5446 | 1.5446 | 322.2493 | 48.2493 | 300.1324 | 26.1324 | 332.9395 | 58.9395
131 | 94.3548 | 36.6451 | 78.1182 | 52.8817 | 95.2562 | 35.7437 | 93.3235 | 37.6765
549 | 550.1474 | 1.1474 | 542.9693 | 6.0306 | 540.9936 | 8.0064 | 523.3686 | 25.6314
630 | 651.4770 | 21.4770 | 651.7065 | 21.7065 | 679.0558 | 49.0558 | 665.3094 | 35.3094
484 | 510.0044 | 26.0044 | 516.8265 | 32.8265 | 546.3494 | 62.3494 | 519.6682 | 35.6682
191 | 142.1177 | 48.8822 | 139.5809 | 51.4190 | 125.2397 | 65.7602 | 104.8571 | 86.1429
628 | 572.8290 | 55.1710 | 560.9797 | 67.0203 | 576.7271 | 51.2729 | 582.2881 | 45.7119
711 | 698.7699 | 12.2301 | 731.2762 | 20.2762 | 725.7449 | 14.7449 | 750.2635 | 39.2635
192 | 187.0587 | 4.9413 | 154.5762 | 37.4237 | 171.0093 | 20.9906 | 200.6189 | 8.6189
292 | 295.0389 | 3.0389 | 302.2323 | 10.2323 | 283.6824 | 8.3175 | 326.0997 | 34.0997
86 | 52.7241 | 33.2758 | 73.0421 | 12.9579 | 82.7172 | 3.2827 | 108.2782 | 22.2782
373 | 374.5111 | 1.5111 | 409.5271 | 36.5271 | 392.4873 | 19.4873 | 408.4270 | 35.4270
200 | 168.9358 | 31.0641 | 165.4225 | 34.5774 | 182.0275 | 17.9725 | 198.4610 | 1.5389
174 | 176.3338 | 2.3338 | 176.7212 | 2.7212 | 183.1125 | 9.1125 | 177.2652 | 3.2652
As can be seen in Table 2, the first column gives the real price of the call option sheets available in the market. The next columns give the price predictions of the same call option sheets made by each activation function, each followed by its error (the difference between the prediction and the actual value in the first column). Then, using equations (21) and (22), the total error of each activation function was calculated (Table 2 lists 14 of the 106 test cases).
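The per-function totals can be aggregated, for example, as mean absolute error and root-mean-square error; the paper's own totals use its equations (21) and (22), which may differ. The values below are the first four rows of Table 2.

```python
# Aggregate error of two activation functions over a sample of Table 2 rows.
real = [274, 131, 549, 630]
relu_pred = [275.5446, 94.3548, 550.1474, 651.477]
selu_pred = [322.2493, 78.1182, 542.9693, 651.7065]

def mae(y, p):
    """Mean absolute error."""
    return sum(abs(a - b) for a, b in zip(y, p)) / len(y)

def rmse(y, p):
    """Root-mean-square error."""
    return (sum((a - b) ** 2 for a, b in zip(y, p)) / len(y)) ** 0.5

print(round(mae(real, relu_pred), 4))  # ReLU MAE on this sample
print(round(mae(real, selu_pred), 4))  # SELU MAE (larger)
```

On this small sample, ReLU already shows the smaller aggregate error, consistent with the paper's conclusion.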
6 Conclusions
In this research, using stock option contracts of the Tehran Stock Exchange and the Mikhailov and Nogel model for option pricing, 600 data points were prepared for training the artificial neural network, and the 4 considered activation functions were trained on them. Then, 100 further option contracts were collected to compare the predictive power of these 4 functions; the input parameters for these 100 call option contracts were calculated using the Mikhailov and Nogel model and supplied to the network. Comparing the predictions of these functions with the actual market values, as shown in Table 2 and Figure 8, the ReLU function has the better performance and the more accurate prediction. According to the results obtained in this research, as well as the results in "Neural network models for Bitcoin option pricing" and "Provide an Improved Factor Pricing Model Using Neural Networks and the Gray Wolf Optimization Algorithm", it can be concluded that pricing option sheets or common stock sheets using the neural network method and the ReLU activation function is much more accurate and faster than the classical methods, and its use is recommended for financial market participants [11], [17].
References
[1] Björk, T., Arbitrage theory in continuous time, Oxford University Press, 2009.
[2] Dastranj, E., Sahebi Fard, H., Abdolbaghi, A., Power option pricing under the unstable conditions (Evidence of power option pricing under fractional Heston model in the Iran gold market), Physica A: Statistical Mechanics and its Applications, 2020; 537:122690.
[3] Dastranj, E., Sahebi Fard, H., Exact solutions and numerical simulation for Bakstein-Howison model, Computational Methods for Differential Equations, 2022; 10(2):461-474.
[4] François-Lavet, V., Henderson, P., Islam, R., Bellemare, M.G., Pineau, J., An introduction to deep reinforcement learning, Foundations and Trends in Machine Learning, 2018; 11(3-4): 219-354. doi:10.1561/2200000071
[5] Haykin, S., Neural networks and learning machines, Pearson Education India, 2009.
[6] Hull, J., Risk management and financial institutions, Vol. 733, John Wiley & Sons, 2012.
[7] Kaustubh, Y., Formulation of a rational option pricing model using artificial neural networks, SoutheastCon, IEEE, 2021. doi:10.20944/preprints202012.0426.v1
[8] MacBeth, J.D., Merville, L.J., An empirical examination of the Black-Scholes call option pricing model, The journal of finance, 1979; 34(5), 1173-1186.
[9] Mikhailov, S., Nogel, U., Heston’s stochastic volatility model: Implementation, calibration and some extensions, Wilmott magazine, 2004.
[10] Mitra, S.K., An option pricing model that combines neural network approach and Black-Scholes formula, Global Journal of Computer Science and Technology, 2012; 12(4), P. 7-15.
[11] Pagnottoni, P., Neural network models for Bitcoin option pricing, Frontiers in Artificial Intelligence, 2019, 2, 5.
[12] Ren, P., Parametric estimation of the Heston model under the indirect observability framework, Diss. University of Houston, 2014.
[13] Sahebi Fard, H., Dastranj, E., Abdolbaghi Ataabadi, A., Analytical and numerical solutions for the pricing of a combination of two financial derivatives in a market under Hull-White model, Advances in Mathematical Finance and Applications, 2022; 7(4), 1013-1023.doi: 10.22034/amfa.2021.1902303.1447
[14] Sharma, S., Sharma, S., Athaiya, A., Activation functions in neural networks, Towards Data Sci, 2017; 6(12): 310-316.
[15] Shuaiqiang, L., Oosterlee, C.W., Bohte, S.M., Pricing options and computing implied volatilities using neural networks, Risks, 2019; 7(1):16.
[16] Vasilev, G.S., Time-dependent Heston model, arXiv preprint arXiv, 2014, 1402.5679.
[17] Zohrabi, A., Tehrani, R., Souri, A., & Sadeghi Sharif, S.J., Provide an improved factor pricing model using neural networks and the gray wolf optimization algorithm, Advances in Mathematical Finance and Applications, 2023, 8(1).
* Corresponding author. Tel.: 09112433899 E-mail address: elham.dastranj@shahroodut.ac.ir
© 2024. All rights reserved.