  • List of Articles


      • Open Access Article

        1 - A two-sided Bernoulli-based CUSUM control chart with autocorrelated observations
        S. M. T. Fatemi Ghomi F. Sogandi
        Usually, in monitoring a proportion p, the binary observations are considered independent; however, in many real cases there is a continuous stream of autocorrelated binary observations, for which a two-state Markov chain model with first-order dependence is applied. On the other hand, the Bernoulli CUSUM control chart, which is not robust to autocorrelation, can be applied as a two-sided control chart able to detect either increases or decreases in the process parameter. In this paper, a two-sided Bernoulli-based CUSUM control chart is proposed based on a log-likelihood-ratio statistic, using a Markov chain model and an average run length relationship. The average run length relationship is set using the corresponding upper and lower Bernoulli CUSUM charts. Simulation studies and numerical results show the superior performance of the proposed monitoring scheme.
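As an illustration of the two-sided idea, the sketch below runs upper and lower Bernoulli CUSUM charts on log-likelihood-ratio increments for the independent-observation case. This is a simplification, not the paper's Markov-chain-based statistic; the thresholds and out-of-control proportions are hypothetical.

```python
import numpy as np

def bernoulli_cusum_2sided(x, p0, p1_up, p1_dn, h_up, h_dn):
    """Two-sided Bernoulli CUSUM: run upper and lower log-likelihood-ratio
    charts in parallel and signal when either exceeds its control limit."""
    # Per-observation log-likelihood-ratio increment against a shifted p1
    def llr(xi, p1):
        return xi * np.log(p1 / p0) + (1 - xi) * np.log((1 - p1) / (1 - p0))

    s_up = s_dn = 0.0
    for t, xi in enumerate(x, start=1):
        s_up = max(0.0, s_up + llr(xi, p1_up))  # detects increases in p
        s_dn = max(0.0, s_dn + llr(xi, p1_dn))  # detects decreases in p
        if s_up > h_up or s_dn > h_dn:
            return t  # run length: first signal time
    return None  # no signal within the stream
```

Each chart resets at zero and accumulates evidence for its own direction of shift; the two-sided run length is the first time either chart crosses its limit.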
      • Open Access Article

        2 - Application of PROMETHEE method for green supplier selection: a comparative result based on preference functions
        Lazim Abdullah Waimun Chan Alireza Afshari
        The PROMETHEE is a significant method for evaluating alternatives with respect to criteria in multi-criteria decision-making problems. It is characterized by many types of preference functions that are used for assigning the differences between alternatives in judgements. This paper proposes a preference ranking of green suppliers using PROMETHEE under the usual criterion preference function. Comparative results are presented to check the effect of different preference functions on the final preference. Seven economic and environmental criteria, four suppliers and five decision makers were the main structures in the green supplier selection problem. Data were collected via personal communication with the decision makers using a five-point Likert scale. The algorithm of PROMETHEE under the usual criterion function was implemented, and the results show that supplier A1 is the most preferred alternative. Comparative results also show that supplier A1 remains the most preferred alternative despite the differences in the preference functions used.
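A minimal sketch of PROMETHEE II with the usual criterion preference function, which assigns full preference to any positive pairwise difference on a criterion. The score matrix and weights in the usage test are illustrative, not the paper's data.

```python
import numpy as np

def promethee_usual(scores, weights):
    """PROMETHEE II with the usual criterion preference function.

    scores  : (n_alternatives, n_criteria) matrix, higher = better
    weights : criterion weights summing to 1
    Returns the net outranking flows phi (higher = more preferred).
    """
    n, _ = scores.shape
    pi = np.zeros((n, n))  # aggregated preference index pi(a, b)
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            d = scores[a] - scores[b]
            # Usual criterion: any positive difference gives full preference
            pi[a, b] = np.sum(weights * (d > 0))
    phi_plus = pi.sum(axis=1) / (n - 1)   # leaving (positive) flow
    phi_minus = pi.sum(axis=0) / (n - 1)  # entering (negative) flow
    return phi_plus - phi_minus           # net flow
```

Ranking the alternatives by descending net flow gives the final preference order; swapping in another preference function only changes how `pi[a, b]` is computed from `d`.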
      • Open Access Article

        3 - Parametric optimization of Nd:YAG laser microgrooving on aluminum oxide using integrated RSM-ANN-GA approach
        Salila Ranjan Dixit Sudhansu Ranjan Das Debabrata Dhupal
        Nowadays, in highly competitive precision industries, the micromachining of advanced engineering materials is in great demand, as it has extensive applications in the fields of automobile, electronic, biomedical and aerospace engineering. The present work addresses a modeling and optimization study on the dimensional deviations of a square-shaped microgroove in laser micromachining of aluminum oxide (Al2O3) ceramic material with a pulsed Nd:YAG laser, considering air pressure, lamp current, pulse frequency, pulse width and cutting speed as process parameters. Thirty-two sets of laser microgrooving trials based on a central composite design (CCD) of experiments (DOE) are performed, and response surface methodology (RSM), an artificial neural network (ANN) and a genetic algorithm (GA) are subsequently applied for mathematical modeling and multi-response optimization. The predictive ANN model based on a 5-8-8-3 architecture gave the minimum error (MSE = 0.000099) and proved highly reliable, with a percentage error of less than 3% in comparison with the experimental data set. The ANN model combined with GA leads to minimum deviations of upper width, lower width and depth of −0.0278 mm, 0.0102 mm and −0.0308 mm, respectively, corresponding to the optimum laser microgrooving process parameters: air pressure of 1.2 kgf/cm2, lamp current of 19.5 A, pulse frequency of 4 kHz, pulse width of 6% and cutting speed of 24 mm/s. Finally, the results have been verified by performing a confirmatory test.
      • Open Access Article

        4 - Dynamic configuration and collaborative scheduling in supply chains based on scalable multi-agent architecture
        Fu-Shiung Hsieh
        Due to diversified and frequently changing customer demands, technological advances and global competition, manufacturers rely on collaboration with their business partners to share costs, risks and expertise. How to take advantage of advances in technology to effectively support operations and create competitive advantage is critical for manufacturers to survive. To respond to these challenges, development of a dynamic scheme to better manage collaborative workflows is urgent. In this paper, we study how to develop a flexible and scalable framework, based on multi-agent systems (MAS), to dynamically and coherently configure workflows that can meet order requirements. Configuring and scheduling collaborative workflows is a challenging problem due to the computational complexity involved, the distributed architecture and the dependencies among different partners’ workflows. To achieve flexibility and reduce the cost and time involved in configuring a supply chain network, we propose an approach that combines MAS, the contract net protocol, workflow models and automated transformation of the workflow models to dynamically formulate the scheduling problem. To attain scalability, we develop a solution algorithm that solves the optimization problem by a collaborative and distributed computation scheme. We implement a software system based on industrial standards, including FIPA and the Petri net workflow specification model. In addition, we illustrate the effectiveness and analyze the scalability of our approach with examples. Our approach facilitates collaboration between partners and provides a scalable solution for the increasing size of supply chain networks.
      • Open Access Article

        5 - Cat swarm optimization for solving the open shop scheduling problem
        Abdelhamid Bouzidi Mohammed Essaid Riffi Mohammed Barkatou
        This paper aims to prove the efficiency of an adapted computational-intelligence algorithm based on the behavior of cats, the cat swarm optimization algorithm, for solving the open shop scheduling problem, an NP-hard problem whose importance appears in several industrial and manufacturing applications. The cat swarm optimization algorithm was applied to solve some benchmark instances from the literature. The computational results, and the comparison of the relative percentage deviation of the proposed metaheuristic with others existing in the literature, show that the cat swarm optimization algorithm yields good results in reasonable execution time.
      • Open Access Article

        6 - Combining data envelopment analysis and multi-objective model for the efficient facility location–allocation decision
        Jae-Dong Hong Ki‑Young Jeong
        This paper proposes an innovative procedure for finding efficient facility location–allocation (FLA) schemes, integrating data envelopment analysis (DEA) and a multi-objective programming (MOP) model methodology. FLA decisions provide a basic foundation for designing an efficient supply chain network in many practical applications. The procedure proposed in this paper can be applied to FLA problems where various conflicting performance measures are considered. The procedure requires that the conflicting performance measures be classified as inputs to be minimized or outputs to be maximized. Solving an MOP problem generates diverse alternative FLA schemes along with their multi-objective values. DEA evaluates these schemes to generate a relative efficiency score for each scheme. Then, using stratification DEA, all of these FLA schemes are stratified into several levels, from the most efficient to the most inefficient. A case study is presented to demonstrate the effectiveness and efficiency of the proposed integrated method. We observe that the combined approach performs well and provides many insights to academics as well as practitioners and researchers.
      • Open Access Article

        7 - Robustness-based portfolio optimization under epistemic uncertainty
        Md. Asadujjaman Kais Zaman
        In this paper, we propose formulations and algorithms for robust portfolio optimization under both aleatory uncertainty (i.e., natural variability) and epistemic uncertainty (i.e., imprecise probabilistic information) arising from interval data. Epistemic uncertainty is represented using two approaches: (1) a moment bounding approach and (2) a likelihood-based approach. This paper first proposes a nested robustness-based portfolio optimization formulation using the moment bounding representation of epistemic uncertainty. The nested robust portfolio formulation is simple to implement; however, the computational cost is often high due to the epistemic analysis performed inside the optimization loop. A decoupled approach is then proposed to un-nest the robustness-based portfolio optimization from the analysis of the epistemic variables to achieve computational efficiency. This paper also proposes a single-loop robust portfolio optimization formulation using the likelihood-based representation of epistemic uncertainty that completely separates the epistemic analysis from the portfolio optimization framework and thereby achieves further computational efficiency. The proposed robust portfolio optimization formulations are tested on real market data from five S&P 500 companies, and the performance of the robust optimization models is discussed empirically based on portfolio return and risk.
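A minimal sketch of the worst-case robustness idea for interval epistemic uncertainty: for a long-only portfolio, the worst-case mean return over interval-bounded means is attained at the lower bounds, so one can optimize a mean-variance objective against those bounds. This is a simplification of the paper's moment bounding and likelihood-based formulations; the function and parameter names are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def robust_portfolio(mu_lo, cov, risk_aversion):
    """Worst-case mean-variance portfolio under interval uncertainty in the
    mean returns. With w >= 0, the worst case over [mu_lo, mu_hi] is attained
    at the lower bounds, so the robust problem optimizes against mu_lo:
        max_w  w @ mu_lo - risk_aversion * w @ cov @ w,  sum(w) = 1, w >= 0.
    """
    n = len(mu_lo)

    def neg_objective(w):
        return -(w @ mu_lo) + risk_aversion * (w @ cov @ w)

    cons = [{"type": "eq", "fun": lambda w: w.sum() - 1.0}]  # fully invested
    res = minimize(neg_objective, np.full(n, 1.0 / n),
                   bounds=[(0.0, 1.0)] * n, constraints=cons)
    return res.x
```

Replacing `mu_lo` with point estimates recovers the nominal (non-robust) mean-variance portfolio, which makes the price of robustness easy to compare empirically.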
      • Open Access Article

        8 - Practical benchmarking in DEA using artificial DMUs
        Hosein Didehkhani Farhad Hosseinzadeh Lotfi Soheil Sadi-Nezhad
        Data envelopment analysis (DEA) is one of the most efficient tools for efficiency measurement and can be employed as a benchmarking method with multiple inputs and outputs. However, DEA does not provide any suggestions for improving efficient units, nor does it provide any benchmark or reference point for these efficient units. The impracticability of these benchmarks under environmental conditions is another challenge of benchmarking by DEA. The current study attempts to extend the basic models for benchmarking of efficient units under practical conditions. To this end, we construct the practical production possibility set (PPPS) by employing the concept of artificial decision-making units and adding these units to the production possibility set (PPS) such that the artificial units satisfy all environmental constraints. Then, the theorems related to the PPPS and their proofs are provided. Moreover, as a secondary result of this study, efficient units can be ranked according to their practical efficiency scores.
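For context, the standard input-oriented CCR efficiency score, the building block that benchmarking models of this kind extend, can be computed as a linear program. This is a generic sketch of the basic envelopment model, not the paper's PPPS model.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU `o` (envelopment form).

    X : (n_dmus, n_inputs), Y : (n_dmus, n_outputs).
    min theta  s.t.  X^T lam <= theta * x_o,  Y^T lam >= y_o,  lam >= 0.
    Decision variables: [theta, lam_1, ..., lam_n].
    """
    n, m = X.shape
    _, s = Y.shape
    c = np.r_[1.0, np.zeros(n)]                   # minimize theta
    # Input constraints:  X^T lam - theta * x_o <= 0
    A_in = np.c_[-X[o].reshape(m, 1), X.T]
    b_in = np.zeros(m)
    # Output constraints: -Y^T lam <= -y_o  (i.e. Y^T lam >= y_o)
    A_out = np.c_[np.zeros((s, 1)), -Y.T]
    b_out = -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun  # efficiency score theta* in (0, 1]
```

A DMU with `theta* = 1` lies on the efficient frontier; the optimal `lam` identifies its reference (benchmark) units, which is exactly where artificial DMUs would be injected to keep benchmarks practical.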
      • Open Access Article

        9 - A mathematical model of the effect of subsidy transfer in cooperative advertising using differential game theory
        Peter E. Ezimadu
        This work deals with subsidy transfer from a manufacturer to a retailer through the distributor in cooperative advertising. While the retailer engages in local advertising, the manufacturer indirectly participates in retail advertising through an advertising subsidy which is given to the distributor, who in turn transfers it to the retailer. The manufacturer is the Stackelberg game leader; the distributor is the first follower, while the retailer is the last follower. The work employs a differential game to model the effect of the subsidy on the individual and channel payoffs, and models the awareness share dynamics using Sethi’s sales-advertising model. It obtains Stackelberg equilibria characterising four game scenarios: no subsidy from either the manufacturer or the distributor; withholding of the manufacturer’s subsidy by the distributor; provision of a subsidy by the distributor in the absence of the manufacturer’s participation; and the participation of both the manufacturer and the distributor in retail advertising. It shows that in the absence of a subsidy from the manufacturer, the distributor should intervene by providing a subsidy to the retailer. However, if this is impossible, the distributor should avoid withholding the subsidy meant for retail advertising. The players’ payoffs as well as the channel payoff are worst with non-participation of both the manufacturer and the distributor, and best with the transfer of subsidy.
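For reference, the Sethi sales-advertising dynamics mentioned above are commonly written as follows (symbols as standard in the literature; the paper's exact parameterisation may differ):

```latex
\dot{x}(t) = \rho \, u(t) \, \sqrt{1 - x(t)} \; - \; \delta \, x(t), \qquad x(0) = x_0,
```

where x(t) is the awareness share, u(t) the advertising effort, ρ the advertising effectiveness and δ the decay rate. The square-root term captures diminishing returns of advertising on the untapped share 1 − x(t).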
      • Open Access Article

        10 - Developing an economical model for reliability allocation of an electro-optical system by considering reliability improvement difficulty, criticality, and subsystems dependency
        Maryam Mohamadi Mahdi Karbasian
        The nature of electro-optical equipment in various industries and the pursuit of cost reduction demand high reliability on the part of electro-optical systems. In this respect, reliability improvement can be addressed through a reliability allocation problem. Subsystem reliability must be increased such that the requirements as well as the defined requisite functions are ensured in accordance with the designers’ opinion. This study is an attempt to develop a multi-objective model that maximizes system reliability and minimizes costs, in order to investigate design phase costs as well as production phase costs. To investigate the feasibility of reliability improvement in the design phase, effective feasibility factors in the system are used, and the sigma level index is incorporated in the production phase as the reliability improvement difficulty factor. Thus, subsystem reliability improvement priorities are taken into consideration. The subsystem dependency degree is investigated through the design structure matrix and incorporated into the model’s constraints together with modified criticality. The primary model is converted into a single-objective model through goal programming. This model is implemented on electro-optical systems, and the results are analyzed. In this method, reliability allocation follows two steps. First, based on the allocation weights, a range is determined for the reliability of the subsystems. Afterward, improvement is initiated based upon the costs and priorities of subsystem reliability improvement.
      • Open Access Article

        11 - Analysis of critical drivers affecting implementation of agent technology in a manufacturing system
        Om Ji Shukla Abhijeet Joshi Gunjan Soni Rajesh Kumar
        Technological advancement in manufacturing systems is inevitable in the current scenario due to today’s customer-driven and volatile market. Implementation of agent technology in a manufacturing system increases the flexibility needed to handle the uncertainty generated by advanced technology. Therefore, in this paper, the critical drivers affecting the implementation of agent technology are identified and the relationships among them are analysed for a case study of a manufacturing system in an Indian steering wheel manufacturing company. Interpretive structural modelling (ISM) is used to establish binary relationships among the identified critical drivers (CDs), while the MICMAC approach provides a sensitivity analysis of the driving and dependence behaviour of the CDs. The classification of the drivers affecting agent technology and of their relationships according to the ISM-MICMAC approach lends importance to this study. A structural model is developed to rank the identified critical drivers, and a driving–dependence power diagram is presented for analysing the behaviour of the different critical drivers with respect to the others. The identification of the most influential CDs, which amplify the effect of the other drivers, is the major finding of this study. Finally, the implications of this research for industry are also described.
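The core ISM step, deriving a reachability matrix from the pairwise driver relationships, and the MICMAC driving/dependence powers read off from it, can be sketched as follows (a generic illustration; the adjacency matrix in the test is not the study's data):

```python
import numpy as np

def reachability_matrix(adj):
    """ISM step: derive the reachability matrix from a binary adjacency
    matrix of driver relationships via transitive closure (Warshall)."""
    r = adj.astype(bool) | np.eye(len(adj), dtype=bool)  # reflexive closure
    n = len(r)
    for k in range(n):
        for i in range(n):
            if r[i, k]:
                r[i] |= r[k]  # i reaches everything k reaches
    return r.astype(int)

def driving_dependence(r):
    """MICMAC coordinates: driving power = row sums of the reachability
    matrix, dependence power = column sums."""
    return r.sum(axis=1), r.sum(axis=0)
```

Plotting each driver at its (dependence, driving) coordinates yields the four MICMAC quadrants (autonomous, dependent, linkage, independent) used to classify the CDs.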
      • Open Access Article

        12 - Bi-objective optimization of multi-server intermodal hub-location-allocation problem in congested systems: modeling and solution
        Mahdi Rashidi Kahag Seyed Taghi Akhavan Niaki Mehdi Seifbarghy Sina Zabihi
        A new multi-objective intermodal hub-location-allocation problem is modeled in this paper, in which both the origin and the destination hub facilities are modeled as M/M/m queuing systems. The problem is formulated as a constrained bi-objective optimization model to minimize the total costs as well as the total system time. A small-size problem is solved in the GAMS software to validate the accuracy of the proposed model. As the problem is strictly NP-hard, an MOIWO algorithm with an efficient chromosome structure and a fuzzy dominance method is proposed to solve large-scale problems. Since there is no benchmark available in the literature, an NSGA-II and an NRGA are developed to validate the results obtained. The parameters of all algorithms are tuned using the Taguchi method, and their performances are statistically compared in terms of several multi-objective metrics. Finally, the entropy-TOPSIS method is applied to show that MOIWO is the best in terms of the simultaneous use of all the metrics.
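For context, the expected time in system of an M/M/m queue, the quantity driving the second objective when hubs are modeled as multi-server queues, follows from the standard Erlang-C formula (a textbook queueing result, not the paper's full model):

```python
import math

def mmm_time_in_system(lam, mu, m):
    """Expected time in system W for a stable M/M/m queue (lam < m * mu).

    Computes the Erlang-C probability of waiting, then
    W = Wq + 1/mu, where Wq is the mean wait in queue.
    """
    rho = lam / (m * mu)            # server utilization
    a = lam / mu                    # offered load in Erlangs
    summ = sum(a**k / math.factorial(k) for k in range(m))
    last = a**m / (math.factorial(m) * (1 - rho))
    p_wait = last / (summ + last)   # Erlang-C probability of delay
    wq = p_wait / (m * mu - lam)    # mean waiting time in queue
    return wq + 1.0 / mu            # mean time in system
```

With m = 1 this reduces to the familiar M/M/1 result W = 1/(mu − lam), a convenient sanity check for the implementation.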