Entropy applications in industrial engineering
Entropy is a fundamental measure of information content that has been applied in a wide variety of fields. We present three applications of entropy in industrial engineering: dispatching of automated guided vehicles (AGVs), ranking and selection of simulated systems based on the mean performance measure, and comparison between random variables based on their cumulative probability distributions.

The first application proposes three entropy-based AGV dispatching algorithms. We contribute to the body of knowledge by considering the consequence of each potential AGV move on the load balance of the factory before AGVs are dispatched. The Kullback-Leibler directed divergence is applied to measure the divergence between the load distribution after each potential move and the load distribution of a balanced factory. Simulation experiments are conducted to study the effectiveness of the suggested algorithms.

In the second application, we focus on ranking and selection of simulated systems based on the mean performance measure. We apply the maximum entropy and directed divergence principles to present a two-stage algorithm. The proposed method contributes to the ranking and selection body of knowledge because it relaxes the normality assumption on the underlying population that restricts frequentist algorithms, it does not assume any prior distribution as Bayesian approaches do, and it provides a ranking of systems based on their observed performance measures.

Finally, we present an entropy-based criterion for comparing two alternatives. The comparison is based on the directed divergence between the alternatives' cumulative probability distributions. We compare the new criterion with stochastic dominance criteria such as first-order stochastic dominance (FSD) and second-order stochastic dominance (SSD).
Since stochastic dominance rules may be unable to detect dominance even in situations where most decision makers would prefer one alternative over the other, our criterion increases the probability of identifying the best system and reduces the probability of obtaining a nondominance set in such situations. Given two alternatives, we show that if one dominates the other by SSD, the SSD-dominating alternative will also be the dominating alternative under our new criterion. In addition, we show that the probability associated with our new criterion is consistent with the probability corresponding to p-almost stochastic dominance (p-AFSD).
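The load-balancing idea behind the first application can be sketched in a few lines: score each candidate AGV move by the Kullback-Leibler directed divergence between the resulting load distribution and a perfectly balanced (uniform) one, and prefer the move with the smallest divergence. This is a minimal illustration under our own assumptions, not the dissertation's actual dispatching algorithms; the workstation names and queue lengths are invented.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler directed divergence D(p || q) in nats.
    Assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def best_move(potential_loads):
    """Pick the candidate AGV move whose resulting workstation load
    distribution is closest (in KL divergence) to a balanced factory."""
    scored = []
    for move, loads in potential_loads.items():
        total = sum(loads)
        p = [x / total for x in loads]        # load distribution after the move
        q = [1 / len(loads)] * len(loads)     # balanced (uniform) distribution
        scored.append((kl_divergence(p, q), move))
    return min(scored)[1]

# Hypothetical workstation queue lengths after each candidate move
moves = {
    "deliver_to_WS1": [4, 4, 3, 5],   # fairly balanced
    "deliver_to_WS2": [9, 1, 2, 4],   # unbalanced
}
print(best_move(moves))  # -> deliver_to_WS1
```

Because KL divergence to the uniform distribution is zero exactly when the loads are equal, minimizing it steers dispatching decisions toward a balanced factory.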
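The stochastic dominance criteria mentioned above can be illustrated on empirical CDFs: A dominates B by FSD when F_A(x) ≤ F_B(x) for all x (with strict inequality somewhere), and by SSD when the integrated CDF of A never exceeds that of B. The sketch below checks these conditions on sample data; the function names and samples are ours, and this is not the dissertation's divergence-based criterion itself.

```python
def empirical_cdf(sample, grid):
    """Empirical CDF of `sample` evaluated at each point of `grid`."""
    n = len(sample)
    return [sum(1 for x in sample if x <= g) / n for g in grid]

def fsd(a, b):
    """First-order stochastic dominance: A dominates B if
    F_A(x) <= F_B(x) everywhere, strictly somewhere."""
    grid = sorted(set(a) | set(b))
    Fa, Fb = empirical_cdf(a, grid), empirical_cdf(b, grid)
    return (all(x <= y for x, y in zip(Fa, Fb))
            and any(x < y for x, y in zip(Fa, Fb)))

def ssd(a, b):
    """Second-order stochastic dominance: the running integral of
    F_A minus F_B must never become positive."""
    grid = sorted(set(a) | set(b))
    Fa, Fb = empirical_cdf(a, grid), empirical_cdf(b, grid)
    acc = 0.0
    for i in range(len(grid) - 1):
        acc += (Fa[i] - Fb[i]) * (grid[i + 1] - grid[i])
        if acc > 1e-12:
            return False
    return True

a, b = [3, 5, 7], [2, 4, 6]   # a is b shifted up by 1
print(fsd(a, b), fsd(b, a))   # -> True False
print(ssd(a, b))              # -> True
```

When neither alternative FSD- or SSD-dominates the other, these rules place the pair in the nondominance set, which is the gap the abstract's divergence-based criterion is designed to narrow.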
Zamiri Marvizadeh, Saeed, "Entropy applications in industrial engineering" (2013). ETD collection for University of Nebraska - Lincoln. AAI3590332.