Industrial and Management Systems Engineering


Date of this Version

Summer 7-25-2013

Citation

S. Zamiri Marvizadeh, Entropy Applications in Industrial Engineering, Doctoral Dissertation, Department of Industrial and Management Systems Engineering, University of Nebraska-Lincoln, 2013.

Comments

A DISSERTATION Presented to the Faculty of The Graduate College at the University of Nebraska In Partial Fulfillment of Requirements For the Degree of Doctor of Philosophy, Major: Engineering, Under the Supervision of Professor Fred Choobineh: Lincoln, Nebraska: July, 2013

Copyright (c) 2013 Saeed Zamiri Marvizadeh

Abstract

Entropy is a fundamental measure of information content that has been applied in a wide variety of fields. We present three applications of entropy in industrial engineering: dispatching of Automatic Guided Vehicles (AGVs), ranking and selection of simulated systems based on the mean performance measure, and comparison of random variables based on their cumulative probability distributions.

The first application proposes three entropy-based AGV dispatching algorithms. We contribute to the body of knowledge by considering the consequence of each potential AGV move on the load balance of the factory before an AGV is dispatched. The Kullback-Leibler directed divergence is applied to measure the divergence between the load distribution after each potential move and the load distribution of a balanced factory. Simulation experiments are conducted to study the effectiveness of the suggested algorithms.
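The scoring idea can be illustrated with a minimal sketch (this is not the dissertation's algorithms; the workstation loads and the uniform "balanced" target are assumptions made for the example): each candidate move is scored by the Kullback-Leibler directed divergence between the resulting load distribution and a perfectly balanced one.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler directed divergence D(p || q) in nats.
    Assumes p and q are probability vectors of equal length and
    q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def dispatch_score(loads):
    """Score the factory state a candidate AGV move would produce:
    divergence of the workstation load distribution from a uniform
    (balanced) distribution. Lower is better."""
    total = sum(loads)
    p = [load / total for load in loads]
    uniform = [1 / len(loads)] * len(loads)
    return kl_divergence(p, uniform)

# Hypothetical loads at four workstations after two candidate moves
print(dispatch_score([3, 3, 3, 3]))  # perfectly balanced -> 0.0
print(dispatch_score([6, 3, 2, 1]))  # unbalanced -> positive
```

A dispatcher following this logic would favor the candidate move with the smallest divergence, i.e., the one leaving the factory closest to balanced.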

In the second application, we focus on ranking and selection of simulated systems based on the mean performance measure. We apply the maximum entropy and directed divergence principles to develop a two-stage algorithm. The proposed method contributes to the ranking and selection body of knowledge in three ways: it relaxes the normality assumption on the underlying population that restricts frequentist algorithms; it does not assume a prior distribution, as Bayesian approaches do; and it provides a ranking of systems based on their observed performance measures.
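The maximum entropy principle invoked here can be illustrated with a standard textbook construction (a sketch only, not the dissertation's two-stage algorithm): among all distributions on a finite support with a prescribed mean, the entropy maximizer has a Gibbs (exponential-tilt) form, and the tilt parameter can be found by bisection because the tilted mean is monotone in it.

```python
import math

def maxent_given_mean(support, target_mean, tol=1e-10):
    """Maximum-entropy pmf on `support` subject to a mean constraint.
    The solution has the form p_i proportional to exp(t * x_i); the
    tilt t is found by bisection on the (monotone) tilted mean."""
    def mean_at(t):
        w = [math.exp(t * x) for x in support]
        z = sum(w)
        return sum(x * wi for x, wi in zip(support, w)) / z

    lo, hi = -50.0, 50.0  # assumes the optimal tilt lies in this bracket
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_at(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    t = (lo + hi) / 2
    w = [math.exp(t * x) for x in support]
    z = sum(w)
    return [wi / z for wi in w]

p = maxent_given_mean([1, 2, 3, 4], 2.5)
print(p)  # uniform: 2.5 is the midpoint of the symmetric support
```

With no constraint beyond the mean at the support's midpoint, the maximizer is the uniform distribution, which is the familiar "least informative" answer the principle is designed to produce.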

Finally, we present an entropy-based criterion for comparing two alternatives. The comparison is based on the directed divergence between the alternatives' cumulative probability distributions. We compare the new criterion with stochastic dominance criteria such as first-order stochastic dominance (FSD) and second-order stochastic dominance (SSD). Stochastic dominance rules may fail to detect dominance even in situations where most decision makers would prefer one alternative over the other; in such situations, our criterion increases the probability of identifying the best system and reduces the probability of obtaining a nondominated set. We show that if one of two alternatives dominates the other by SSD, it also dominates under our new criterion. In addition, we show that the probability associated with our new criterion is consistent with the probability corresponding to p-almost first-order stochastic dominance (p-AFSD).
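To make the stochastic dominance baseline concrete (an illustrative sketch using empirical CDFs built from samples; the sample data are hypothetical and this is not the dissertation's criterion), alternative A first-order stochastically dominates B when A's CDF never lies above B's and lies strictly below it somewhere:

```python
def fsd_dominates(a, b):
    """Test whether sample set `a` first-order stochastically dominates
    sample set `b`: F_a(x) <= F_b(x) at every x, with strict
    inequality at some x, using empirical CDFs."""
    points = sorted(set(a) | set(b))

    def ecdf(samples, x):
        return sum(v <= x for v in samples) / len(samples)

    diffs = [ecdf(a, x) - ecdf(b, x) for x in points]
    return all(d <= 0 for d in diffs) and any(d < 0 for d in diffs)

print(fsd_dominates([2, 3, 4], [1, 2, 3]))  # True: a's CDF stays below b's
print(fsd_dominates([1, 3], [2, 2]))        # False: the CDFs cross
```

The second call shows the limitation motivating the abstract: when the CDFs cross, FSD declares neither alternative dominant, even though one may be clearly preferable to most decision makers.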
