  • List of Articles


      • Open Access Article

        1 - A Review and Evaluation of Statistical Process Control Methods in Monitoring Process Mean and Variance Simultaneously
        Ahmad Ostadsharifmemar, Seyed Taghi Akhavan Niaki
        In this paper, first the available single-charting methods that have been proposed to detect simultaneous shifts in the mean and variance of a single process are reviewed. Then, by designing proper simulation studies, these methods are evaluated in terms of the in-control and out-of-control average run length (ARL) criteria. The results of these simulation experiments show that the EWMA and EWMS methods are quite capable of detecting large shifts in the mean and variance. However, while the two EWMV procedures under study do not perform well, the Max-Min EWMA and Max-EWMA perform very well in all scenarios of mean and variance shifts.
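        The abstract refers to EWMA-type charts for detecting shifts in the process mean. As a point of orientation only, the following is a minimal sketch of a standard EWMA chart for the mean with time-varying control limits; the smoothing constant, limit width, and data are illustrative choices, not values from the paper.

        ```python
        # Minimal EWMA chart for the process mean (a sketch, assuming the
        # in-control mean mu0 and standard deviation sigma0 are known).
        # lam and L are illustrative values, not the paper's settings.
        import numpy as np

        def ewma_chart(x, mu0, sigma0, lam=0.2, L=3.0):
            """Return the EWMA statistics and an out-of-control flag per point."""
            z = np.empty(len(x))
            signals = np.zeros(len(x), dtype=bool)
            prev = mu0                                   # chart starts at the in-control mean
            for t, xt in enumerate(x):
                prev = lam * xt + (1 - lam) * prev
                z[t] = prev
                # time-varying variance of the EWMA statistic
                var_t = sigma0**2 * lam / (2 - lam) * (1 - (1 - lam) ** (2 * (t + 1)))
                signals[t] = abs(prev - mu0) > L * np.sqrt(var_t)
            return z, signals

        rng = np.random.default_rng(0)
        data = np.concatenate([rng.normal(0, 1, 50), rng.normal(1.0, 1, 20)])  # mean shift at t = 50
        z, sig = ewma_chart(data, mu0=0.0, sigma0=1.0)
        print("first signal at observation:", int(np.argmax(sig)) if sig.any() else None)
        ```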
      • Open Access Article

        2 - Scheduling in Container Terminals using Network Simplex Algorithm
        Hassan Rashidi
        In the static scheduling problem, where the situation does not change over time, the challenge is to solve large problems in a short time. In this paper, the static scheduling problem of automated guided vehicles in a container terminal is solved by the Network Simplex Algorithm (NSA). The algorithm is based on a graph model and performs at least 100 times faster than the traditional simplex algorithm for linear programs. Many random data sets are generated and fed to the model for 50 vehicles. The results show that NSA is fast and efficient, and it is found that, in practice, NSA takes polynomial time to solve problems in this application.
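        For readers unfamiliar with the solver named in the abstract, the sketch below shows a tiny vehicle-to-job assignment expressed as a minimum-cost flow problem and solved with a network simplex implementation. networkx is used purely for illustration; the node names, costs, and the two-vehicle instance are hypothetical and far simpler than the paper's AGV model.

        ```python
        # A tiny vehicle-to-job assignment cast as a minimum-cost flow problem
        # and solved with the network simplex algorithm.
        import networkx as nx

        G = nx.DiGraph()
        # supply nodes (vehicles) and demand nodes (container jobs)
        G.add_node("v1", demand=-1)
        G.add_node("v2", demand=-1)
        G.add_node("j1", demand=1)
        G.add_node("j2", demand=1)
        # arc weights are hypothetical travel/handling costs
        G.add_edge("v1", "j1", weight=4, capacity=1)
        G.add_edge("v1", "j2", weight=9, capacity=1)
        G.add_edge("v2", "j1", weight=7, capacity=1)
        G.add_edge("v2", "j2", weight=3, capacity=1)

        cost, flow = nx.network_simplex(G)
        print("total cost:", cost)  # 7: v1 -> j1 and v2 -> j2
        print("assignment:", {u: [v for v, f in d.items() if f] for u, d in flow.items()})
        ```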
      • Open Access Article

        3 - Fuzzy Particle Swarm Optimization Algorithm for a Supplier Clustering Problem
        Esmaeil Mehdizadeh, Reza Tavakkoli Moghaddam
        This paper presents a fuzzy decision-making approach to deal with a supplier clustering problem in a supply chain system. During recent years, determining suitable suppliers in the supply chain has become a key strategic consideration. However, the nature of these decisions is usually complex and unstructured. In general, many quantitative and qualitative factors, such as quality, price, flexibility, and delivery performance, must be considered to determine suitable suppliers. The aim of this study is to present a new approach that uses a particle swarm optimization (PSO) algorithm for clustering suppliers under fuzzy environments and classifying them into smaller groups with similar characteristics. Our numerical analysis indicates that the proposed PSO improves the performance of the fuzzy c-means (FCM) algorithm.
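        Since the abstract compares a PSO-based clustering approach against fuzzy c-means, the following minimal fuzzy c-means sketch may help clarify the baseline being improved upon. The alternating membership/center updates are the standard FCM iterations; the supplier data and parameter values are hypothetical, and this is not the paper's implementation.

        ```python
        # Minimal fuzzy c-means sketch: alternate between updating cluster
        # centers and fuzzy memberships until the iteration budget is spent.
        import numpy as np

        def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
            """Return cluster centers and the fuzzy membership matrix U (n x c)."""
            rng = np.random.default_rng(seed)
            n = X.shape[0]
            U = rng.dirichlet(np.ones(c), size=n)               # random initial memberships
            for _ in range(iters):
                W = U ** m
                centers = (W.T @ X) / W.sum(axis=0)[:, None]    # weighted cluster centers
                d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
                U = 1.0 / ((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0))).sum(axis=2)
            return centers, U

        rng = np.random.default_rng(1)
        # two hypothetical supplier groups described by (price score, quality score)
        X = np.vstack([rng.normal([1, 1], 0.2, size=(20, 2)),
                       rng.normal([4, 3], 0.2, size=(20, 2))])
        centers, U = fuzzy_c_means(X, c=2)
        print(np.round(centers, 2))
        ```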
      • Open Access Article

        4 - Application of Rough Set Theory in Data Mining for Decision Support Systems (DSSs)
        Mohammad Hossein Fazel Zarandi, Abolfazl Kazemi
        Decision support systems (DSSs) are prevalent information systems for decision making in many competitive business environments. In a DSS, the decision-making process is intimately related to factors that determine the quality of information systems and their related products. Traditional approaches to data analysis usually cannot be implemented in sophisticated companies, where managers need DSS tools for rapid decision making. In traditional approaches to decision making, scientific expertise together with statistical techniques is usually needed to support the managers. However, these approaches are not able to handle the huge amounts of real data, and the processes are usually very slow. Recently, several innovative facilities have been presented for the decision-making process in enterprises. New techniques for developing huge databases, together with heuristic models, have enhanced the capabilities of DSSs to support managers at all levels of organizations. Today, data mining and knowledge discovery are considered the main module in the development of advanced DSSs. In this research, we use rough set theory for data mining in the decision-making process of a DSS. The proposed approach concentrates on individual objects rather than the population of objects. Finally, rules extracted from a data set and the corresponding features (attributes) are considered in modeling the data mining task.
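        The abstract relies on rough set theory for rule extraction. The toy sketch below illustrates the basic rough-set notions of lower and upper approximations of a decision class under an indiscernibility relation; the information table, attribute values, and decisions are hypothetical.

        ```python
        # Rough-set lower/upper approximation of a decision class on a toy table.
        from collections import defaultdict

        # toy information table: object -> (condition attribute values, decision)
        objects = {
            "o1": (("high", "yes"), "buy"),
            "o2": (("high", "yes"), "buy"),
            "o3": (("high", "no"),  "buy"),
            "o4": (("low",  "no"),  "hold"),
            "o5": (("high", "no"),  "hold"),
        }

        # indiscernibility classes: objects with identical condition attributes
        blocks = defaultdict(set)
        for name, (conds, _) in objects.items():
            blocks[conds].add(name)

        target = {name for name, (_, d) in objects.items() if d == "buy"}

        lower, upper = set(), set()
        for block in blocks.values():
            if block <= target:      # block entirely inside the class
                lower |= block
            if block & target:       # block overlapping the class
                upper |= block

        print("lower approximation:", lower)   # objects certainly in 'buy'
        print("upper approximation:", upper)   # objects possibly in 'buy'
        ```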
      • Open Access Article

        5 - The preemptive resource-constrained project scheduling problem subject to due dates and preemption penalties: An integer programming approach
        Behrouz Afshar Nadjafi, Shahram Shadrokh
        Extensive research has been devoted to the resource-constrained project scheduling problem. However, little attention has been paid to problems where a certain time penalty must be incurred if activity preemption is allowed. In this paper, we consider the project scheduling problem of minimizing the total cost subject to resource constraints, earliness-tardiness penalties, and preemption penalties, where a constant setup penalty is incurred each time an activity is restarted after being preempted. We propose a solution method based on a pure integer formulation of the problem. Finally, some test problems are solved with LINGO version 8 and computational results are reported.
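        The objective described in the abstract combines earliness-tardiness penalties with a constant setup penalty per resumption of a preempted activity. The snippet below only evaluates that cost for a given, hypothetical preemptive schedule of a single activity; it is not the paper's integer programming formulation, and all rates are illustrative.

        ```python
        # Evaluate earliness-tardiness plus preemption setup cost for one activity.
        def schedule_cost(pieces, due_date, earliness_rate, tardiness_rate, setup_penalty):
            """pieces: list of (start, end) intervals in which the activity is processed."""
            finish = max(end for _, end in pieces)
            earliness = max(due_date - finish, 0)
            tardiness = max(finish - due_date, 0)
            restarts = len(pieces) - 1                 # every resumption incurs one setup
            return (earliness_rate * earliness
                    + tardiness_rate * tardiness
                    + setup_penalty * restarts)

        # the activity is split into two pieces, i.e. preempted once
        print(schedule_cost(pieces=[(0, 3), (5, 7)], due_date=6,
                            earliness_rate=1.0, tardiness_rate=2.0, setup_penalty=4.0))
        # finish = 7, tardiness = 1, one restart -> cost = 2 + 4 = 6
        ```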
      • Open Access Article

        6 - Practical common weights scalarizing function approach for efficiency analysis
        Alireza Alinezhad, Reza Kiani Mavi, Majid Zohrehbandian, Ahmad Makui
        A characteristic of Data Envelopment Analysis (DEA) is that it allows individual decision making units (DMUs) to select the factor weights that are the most advantageous for them in calculating their efficiency scores. This flexibility in selecting the weights, on the other hand, hinders the comparison of DMUs on a common basis. To deal with this difficulty and assess all the DMUs on the same scale, this paper proposes a multiple objective linear programming (MOLP) approach based on a scalarizing function for generating a common set of weights within the DEA framework. This is an advantage of the proposed approach over the general approaches in the literature, which are based on multiple objective nonlinear programming.
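        As background to the common-weights idea in the abstract, the sketch below solves the standard multiplier-form DEA problem in which each DMU selects its own most favourable weights. The single-input/single-output data are hypothetical, scipy is used only for illustration, and the paper's MOLP common-weights model is built on top of this basic formulation rather than reproduced here.

        ```python
        # Multiplier-form (CCR) DEA: each DMU picks weights u, v maximising its score.
        import numpy as np
        from scipy.optimize import linprog

        # one input, one output for five hypothetical DMUs
        x = np.array([2.0, 3.0, 4.0, 5.0, 6.0])   # inputs
        y = np.array([1.0, 3.0, 3.5, 4.0, 4.5])   # outputs

        def ccr_efficiency(k):
            """Efficiency of DMU k under its own most favourable weights (u, v)."""
            c = [-y[k], 0.0]                                   # maximise u * y_k
            A_ub = np.column_stack([y, -x])                    # u * y_j - v * x_j <= 0 for all j
            b_ub = np.zeros(len(x))
            A_eq = [[0.0, x[k]]]                               # v * x_k = 1 (normalisation)
            b_eq = [1.0]
            res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                          bounds=[(0, None)] * 2)
            return -res.fun

        print([round(ccr_efficiency(k), 3) for k in range(len(x))])
        ```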
      • Open Access Article

        7 - Using neural network to estimate Weibull parameters
        Babak Abbasi, Behrouz Afshar Nadjafi
        As is well known, estimating the parameters of the three-parameter Weibull distribution is a complicated task and sometimes a contentious area, with several methods vying for recognition. The Weibull distribution appears frequently in reliability studies and has many applications in engineering; however, estimating its parameters with classical methods is difficult. The distribution has three parameters, but for simplicity one parameter is often dropped so that the remaining ones can be estimated easily. When the three-parameter distribution is of interest, classical estimation procedures such as maximum likelihood estimation (MLE) become quite tedious. In this paper, to take advantage of the application of artificial neural networks (ANNs) in statistics, we propose using a simple neural network to estimate the three parameters of the Weibull distribution simultaneously. Similar to the method of moments, the trained neural network accurately estimates the Weibull parameters based on the mean, standard deviation, median, skewness, and kurtosis of the sample. To assess the power of the proposed method, we carry out a simulation study and compare the results of the proposed method with the true values of the parameters.
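        The following is a minimal sketch of the idea described in the abstract: train a neural network to map sample summary statistics (mean, standard deviation, median, skewness, kurtosis) onto the three Weibull parameters. The network architecture, parameter ranges, and sample sizes are illustrative assumptions, not those of the paper.

        ```python
        # Train a small MLP to recover (shape, scale, location) of a three-parameter
        # Weibull distribution from sample summary statistics.
        import numpy as np
        from scipy.stats import skew, kurtosis
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)

        def make_example(n=200):
            shape = rng.uniform(0.8, 4.0)
            scale = rng.uniform(0.5, 3.0)
            loc = rng.uniform(0.0, 2.0)
            sample = loc + scale * rng.weibull(shape, size=n)
            feats = [sample.mean(), sample.std(), np.median(sample),
                     skew(sample), kurtosis(sample)]
            return feats, [shape, scale, loc]

        X, Y = zip(*(make_example() for _ in range(3000)))
        net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
        net.fit(np.array(X), np.array(Y))

        feats, true_params = make_example()
        print("true:", np.round(true_params, 2),
              "estimated:", np.round(net.predict([feats])[0], 2))
        ```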
      • Open Access Article

        8 - Estimation of the cost function of a two-echelon inventory system with lost sales using regression
        Mehdi Seif Barghy, Maghsoud Amiri, Mostafa Heidari
        Multi-echelon inventory systems are one of the most important and attractive areas in modelling supply chain management, and several studies have addressed them. In this paper, the total cost function of a two-echelon inventory system with a central warehouse and many identical retailers, controlled by a continuous-review inventory policy, is estimated. We assume that demand at the retailers follows independent Poisson processes, that unsatisfied demand at the retailers is lost, and that unsatisfied retailer orders are backordered at the central warehouse. The transportation time from the warehouse to each retailer and the lead time of the warehouse orders are assumed to be constant. The accuracy of the estimation is assessed by simulation. The main contribution of this paper is the use of statistical tools in the analysis of two-echelon inventory systems.
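        As a heavily simplified, single-location illustration of the paper's idea of estimating an inventory cost function by regression on simulated data, the sketch below simulates the average per-period cost for several base-stock levels under Poisson demand with lost sales and fits a quadratic curve to the results. All cost rates and the demand rate are hypothetical, and the paper's two-echelon model is far richer than this.

        ```python
        # Simulate average per-period inventory cost over a grid of base-stock
        # levels and regress a quadratic cost function on the observations.
        import numpy as np

        rng = np.random.default_rng(1)
        holding, lost_sale, demand_rate = 1.0, 9.0, 5.0   # hypothetical rates

        def simulated_cost(S, periods=20000):
            demand = rng.poisson(demand_rate, size=periods)
            cost = holding * np.maximum(S - demand, 0) + lost_sale * np.maximum(demand - S, 0)
            return cost.mean()

        levels = np.arange(2, 13)
        costs = np.array([simulated_cost(S) for S in levels])

        # fit c(S) ~ a*S^2 + b*S + c by least squares
        coeffs = np.polyfit(levels, costs, deg=2)
        print("fitted coefficients:", np.round(coeffs, 3))
        print("estimated minimiser:", round(-coeffs[1] / (2 * coeffs[0]), 2))
        ```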