  • List of Articles


      • Open Access Article

        1 - Implementing a Realtime Localization System based on Zigbee protocol
        Omid Pakdel Azar
        A Real-Time Localization System (RTLS) is one of several technologies for tracking objects and target locations. In this paper, a novel positioning system based on Ultra-Wideband technology is presented. The proposed system utilizes the TDOA algorithm: the distance between the source and the target node is estimated by measuring the times at which a message is sent and received. Furthermore, a Zigbee protocol is used to implement the communication link between the different nodes of the system. The software libraries are implemented independently. The results show that the positioning system achieves a ranging accuracy of 20 cm, and that the data rate of the Zigbee link is 72 bytes per second. This error can be greatly reduced by interpolation and by calibrating for specific environments, and it is negligible compared with the 10 to 100 meter error of expensive GPS-based modules.
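The message-exchange ranging step described in the abstract can be sketched as follows. This is a minimal illustration of two-way time-of-flight ranging, not the paper's actual firmware; the function names and the reply-delay parameter are assumptions.

```python
# Illustrative two-way time-of-flight ranging sketch (names are hypothetical).
C = 299_792_458.0  # speed of light in m/s

def estimate_distance(t_send, t_receive, t_reply=0.0):
    """Estimate anchor-to-tag distance from one round-trip message exchange.

    t_send    -- time the ranging message left the anchor (s)
    t_receive -- time the reply arrived back at the anchor (s)
    t_reply   -- known processing delay at the tag (s)
    """
    time_of_flight = (t_receive - t_send - t_reply) / 2.0  # one-way flight time
    return C * time_of_flight
```

In practice the 20 cm accuracy reported above corresponds to resolving flight times on the order of a nanosecond, which is why UWB hardware timestamps are required.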
      • Open Access Article

        2 - QoS Management Solution in Software Defined Networking using Ryu Controller
        Shiva Karimi Amir Joz Ashoori
        Introduction: Enterprise networks are becoming larger and more dynamic due to the widespread deployment of virtualization technologies. Consequently, the explosion of new network applications and services has strained traditional networking architectures in terms of scalability, agility, and efficient traffic management. Software Defined Networking (SDN) is a novel approach to building networks in which the control logic is decoupled from data forwarding, enabling programmability and ease of configuration across the entire network. SDN's centralized control provides a global view of all network resources and their performance, which enables the development of new service models. This paper demonstrates an SDN implementation on a sample data-center network topology using Mininet and the Ryu controller, followed by policy-based network management and a differentiated-services mechanism that guarantees QoS for different classes of traffic. The proposed framework is a foundation for developing an enterprise-level network control and management product.
        Method: The approach of this paper is a software-based architecture implemented on a data-center topology. It manages and guarantees quality of service using policy-oriented network management and differentiated-services methods. The presented framework is an extensible infrastructure for addressing the challenge of dynamic, agile management in the networks of data centers and of virtualization and cloud-computing service providers.
        Findings: In the implementation, the server node h1r1 listens on UDP ports 5001, 5002, and 5003. The client node h1r4 sends 1 Mbps of UDP traffic to port 5001, 300 Kbps to port 5002, and 600 Kbps to port 5003 of the h1r1 server. The results obtained with the IPerf3/JPerf tool show that traffic marked AF41 and sent to port 5003 is guaranteed a minimum bandwidth of 500 Kbps, and that traffic marked AF31 and sent to port 5002 is guaranteed a minimum bandwidth of 200 Kbps. While AF-class traffic is being sent, the bandwidth of the best-effort traffic sent to port 5001 is limited.
        Conclusion: Guaranteeing full quality of service for all types of applications is not possible with the current best-effort network architecture. Different applications have different service needs that require dynamic management of network resources. This article presented an SDN-based solution for QoS management that uses a differentiated-services model. The differentiated-services mechanism allocates resources by traffic class, and all flows belonging to a class are routed equally. The simulation results show that the introduced framework performs well in meeting the needs of traffic flows and in making maximal use of network resources.
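The per-class guarantees in the experiment above can be sketched as a simple admission policy. This is a toy model of the differentiated-services idea, not the paper's Ryu application: the DSCP values are the standard codepoints, while the policy table and function names are illustrative assumptions mirroring the reported minimum rates.

```python
# Standard DSCP codepoints for the classes used in the experiment.
AF41, AF31, BEST_EFFORT = 34, 26, 0

# Hypothetical policy table: DSCP mark -> guaranteed minimum rate (Kbps).
MIN_RATE_KBPS = {AF41: 500, AF31: 200, BEST_EFFORT: 0}

def allocate(link_kbps, flows):
    """Grant bandwidth to flows: AF minimums first, leftover shared after.

    flows -- list of (dscp, demand_kbps); returns {flow_index: granted_kbps}.
    """
    grants = {}
    remaining = link_kbps
    # Pass 1: reserve each flow's guaranteed minimum, capped by its demand.
    for i, (dscp, demand) in enumerate(flows):
        g = min(demand, MIN_RATE_KBPS.get(dscp, 0), remaining)
        grants[i] = g
        remaining -= g
    # Pass 2: hand out leftover capacity up to each flow's full demand.
    for i, (dscp, demand) in enumerate(flows):
        extra = min(demand - grants[i], remaining)
        grants[i] += extra
        remaining -= extra
    return grants
```

With a 1000 Kbps bottleneck and the three flows from the experiment, the AF41 and AF31 flows keep their guaranteed 500 and 200 Kbps while the best-effort flow is squeezed into the remainder, which is the qualitative behavior the Findings describe.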
      • Open Access Article

        3 - The effect of the organizational approach in knowledge sharing on the performance of information security management
        Edris Abbaszadeh Mohamadreza Sanaei Reza Ehtesham Rasii
        Introduction: The rapid movement of countries toward the information society has driven vast growth in information systems and services and the emergence of a new, information-based type of organization: the virtual organization. Given the role of information as a valuable commodity in these organizations, and the security risks and threats that arise in the virtual environment through Internet connectivity, protecting this information is essential. Achieving this goal requires designing an information security management system that can identify and manage the threats the organization is exposed to, protect its information assets against attacks, and continuously improve the organization's information security. Considering the importance of information in every organization, it is vital to use information security management systems to establish, implement, control, review, maintain, and improve information security.
        Method: This study fills a gap in the previous literature by considering organizational methods in the study of information security management. A framework was developed based on the one proposed in recent work by Perez Gonzalez et al., which examined three organizational factors: information security knowledge sharing, observability, and training. In this study, that framework was modified to include two additional organizational factors: commitment and information security learning.
        Results: This study identifies the need for managers and decision-makers to consider the role of employees in information security. Managers can influence employees' levels of motivation, loyalty, and innovative risk-taking to create an ethical relationship within the organization.
        Discussion: This study investigated and analyzed the effects of organizational information security measures (information security knowledge sharing, learning, security observability, security training, and organizational commitment) on the information security management performance of small and medium-sized companies. The findings show that information security knowledge sharing, learning, and security observability have a significant positive effect on the information security performance of small and medium-sized companies. In addition, the study clarifies the importance of information security learning, and of recording acquired knowledge, for organizational commitment.
      • Open Access Article

        4 - Intrusion detection system in cloud computing using a heterogeneity detection technique
        Ali Ghaffari Rozbeh Hossinnezhad
        Introduction: The distributed structure of cloud computing makes it an attractive target for cyberattacks by intruders. In this paper, a method for embedding an intrusion detection system in cloud computing is presented using an anomaly detection approach. By studying how parameters are checked, and the combined role of those parameters in detecting intrusion in the cloud, a method for detecting suspicious behavior is provided. The most practical way to detect an intrusion is to use supervised methods to learn the parameters of normal customer behavior. Deviant behavior is therefore treated as suspicious behavior; this approach was implemented, investigated, and compared with an initial simulation that identifies abnormal behavior in different behavioral areas with a neural network.
        Method: The basis of anomaly detection in this article is to examine user behavior and exploit the input-reconstruction capability of recurrent neural networks (RNNs). During training, the network's weights are adjusted to minimize the mean squared error, so that the network learns to reproduce common, recurring patterns well. After training, such networks cannot accurately reproduce input patterns that differ significantly from the training samples, so they can identify anomalies in the tested sets. Accordingly, RNNs are used here to model normal behavior.
        Findings: The simulation results show that the proposed method, based on a recurrent neural network, improves the false-positive rate, false-negative rate, and detection accuracy compared with a classification method.
        Discussion: In this article, the detection of deviant behavior as suspicious behavior was implemented, investigated, and compared with an initial simulation identifying abnormal behavior in different behavioral fields. The results confirm that the recurrent-neural-network-based method improves the false-positive rate, false-negative rate, and detection accuracy compared with the classification method.
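The reconstruction-error principle in the Method section can be illustrated with a toy sketch. The paper trains an RNN; here a linear autoencoder (truncated PCA in plain numpy) stands in for the trained network, since the principle is the same: inputs resembling the normal training data reconstruct well, while anomalous inputs leave a large residual.

```python
import numpy as np

# Toy reconstruction-error anomaly detector. The linear autoencoder is a
# stand-in for the paper's RNN; all names and the threshold are illustrative.

def fit(normal, k=1):
    """Learn a k-dimensional linear reconstruction model from normal data."""
    mean = normal.mean(axis=0)
    _, _, vt = np.linalg.svd(normal - mean, full_matrices=False)
    return mean, vt[:k]  # mean and the top-k principal directions

def reconstruction_error(x, model):
    """How badly the learned model reproduces input x."""
    mean, basis = model
    centred = x - mean
    recon = centred @ basis.T @ basis  # project onto the learned subspace
    return float(np.linalg.norm(centred - recon))

def is_anomalous(x, model, threshold):
    return reconstruction_error(x, model) > threshold
```

A point lying on the pattern learned from normal data reconstructs almost perfectly, while a deviating point produces a large error and is flagged, exactly as the abstract describes for the trained network.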
      • Open Access Article

        5 - Improving the security of image watermarking based on the combination of discrete wavelet transform, singular value decomposition and discrete cosine transform methods
        Hossein Nematizadeh Mohammad Tahghighi Sharabian
        Introduction: One way to ensure information security is encryption, but encryption methods cannot hide the existence of information. To hide information, algorithms called steganography have been created. Steganography is the practice of hiding important data inside an ordinary file or message in order to prevent detection by others; at the destination, the secret information is extracted back to its original form. Steganography can be combined with encryption as an additional step to hide or protect data, and it can conceal almost any type of digital content, including text, images, video, or audio. Often the content to be hidden is encrypted before the steganography process to provide an extra layer of protection. Steganography focuses on keeping information hidden, while cryptography is concerned with controlling access to information. Depending on the type of cover signal and the insertion algorithm, different steganography methods have been presented, which differ in hiding capacity and security. The ease and undetectability of copying and altering digital data have also led to illegal reproduction and redistribution of digital media, so investigating this problem is of great importance.
        Method: This article uses a new non-blind image-hiding method that is resistant to affine transformations and ordinary image manipulation. The proposed method is based on an additive discrete wavelet transform and singular value decomposition. After applying the RWDT to the cover and hidden images, SVD is applied to their LL subbands; the singular values of the cover image are then modified using the singular values of the watermark image.
        Result: Analytical studies on the extracted watermarked image show that the method is robust against salt-and-pepper attacks with a density of 0.1 and Gaussian noise attacks with a rate of 0.01, and that the watermarked image is recovered well.
        Discussion: One of the most common transform-domain watermarking techniques is modifying the coefficients obtained from the singular value decomposition (SVD) of the image. The proposed algorithm performs well against rotation and cropping attacks, whereas steganography based on multiple SVDs performs poorly against these two attacks.
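The singular-value embedding step described in the Method section can be sketched in plain numpy. This is a minimal illustration under stated assumptions: a one-level Haar average stands in for the wavelet LL subband, and the embedding strength `alpha` and image sizes are illustrative, not the paper's parameters.

```python
import numpy as np

# Sketch of additive SVD watermark embedding on an LL subband (non-blind:
# extraction needs the cover image's singular values).

def haar_ll(img):
    """One-level Haar LL subband: averages of 2x2 pixel blocks."""
    return (img[0::2, 0::2] + img[0::2, 1::2] +
            img[1::2, 0::2] + img[1::2, 1::2]) / 4.0

def embed(cover_ll, mark_ll, alpha=0.05):
    """Add the watermark's singular values to the cover's singular values."""
    u, s, vt = np.linalg.svd(cover_ll)
    _, s_mark, _ = np.linalg.svd(mark_ll)
    s_marked = s + alpha * s_mark              # additive rule on singular values
    return u @ np.diag(s_marked) @ vt, s       # watermarked LL + cover spectrum

def extract(marked_ll, s_cover, alpha=0.05):
    """Recover the watermark's singular values (non-blind extraction)."""
    _, s_marked, _ = np.linalg.svd(marked_ll)
    return (s_marked - s_cover) / alpha
```

Because the singular values carry much of the image's energy and change little under small distortions, perturbing them slightly hides the mark while keeping the cover visually intact, which is the robustness property the Result section reports.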
      • Open Access Article

        6 - Improving the Quality of Mammogram Images by Image Processing Techniques
        Mahdi Hariri Hassan Najafy
        Abstract: Given the prevalence of breast cancer, its early detection from mammogram images using computerized methods is considered an effective way to reduce patient mortality. In this research, a method based on image-processing techniques is presented to improve the quality of mammography images, with the aim of building a medical diagnosis system. The method has a pre-processing stage, consisting of equalizing the image dimensions and adjusting the image histogram, and a feature-extraction stage using Contourlet and Curvelet transforms, which provides three categories of features (morphological and histological, statistical, and frequency) to improve diagnostic accuracy. The proposed method was implemented on the MIAS dataset, and a subset of the extracted features was selected as input to the classifier. Comparing its performance across different classifiers, the method achieves an accuracy of 86.3%, a better result than other methods.
        Method: This research seeks a method that can improve the accuracy of the final diagnosis. After the pre-processing stage, which includes rescaling and histogram adjustment, highlighting transforms in the frequency domain such as the Curvelet and Contourlet are used to emphasize and better differentiate regions containing masses for decision-making. Local features based on image zoning are used for image segmentation, and these methods also increase the contrast of mammogram images with respect to their surroundings. The improvement methods used in this research rely on features based on the wavelet domain.
        Results: The input image undergoes feature extraction, and three main categories of features, frequency, morphological, and histological, are extracted from it. This process comprises size equalization, histogram adjustment, the Contourlet transform, and the Curvelet transform. Given the sensitivity of such systems, features were extracted at various levels using known matrices such as the gray-level co-occurrence matrix. The classification results evaluate the method: the best result on the dataset was an accuracy of 86.3%, a good improvement on the tested dataset.
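The histogram-adjustment pre-processing step described above can be sketched with classic histogram equalization in plain numpy. This is a generic illustration of the technique, not the paper's exact procedure; the function name and the 8-bit assumption are illustrative.

```python
import numpy as np

# Sketch of histogram equalization for an 8-bit grayscale image: spread the
# grey levels over the full [0, 255] range to raise contrast.

def equalize_histogram(img):
    """Equalize an 8-bit image (assumes img is uint8 and not constant)."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]        # first non-zero cumulative count
    if cdf[-1] == cdf_min:           # constant image: nothing to equalize
        return img.copy()
    # Lookup table mapping each grey level to its equalized value.
    lut = np.clip(np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255),
                  0, 255).astype(np.uint8)
    return lut[img]
```

On a low-contrast mammogram, where intensities cluster in a narrow band, this mapping stretches that band across the full dynamic range, which is what makes masses easier to distinguish in the later feature-extraction stage.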