INTEGRATED DECISION MAKING FOR PLANNING AND CONTROL OF DISTRIBUTED MANUFACTURING ENTERPRISES USING DYNAMIC-DATA-DRIVEN ADAPTIVE MULTI-SCALE SIMULATIONS (DDDAMS)
Dynamic data driven simulations
Committee Chair: Son, Young-Jun
Publisher: The University of Arizona.
Rights: Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction or presentation (such as public display or performance) of protected items is prohibited except with permission of the author.
Abstract: Discrete-event simulation has become one of the most widely used analysis tools for large-scale, complex, and dynamic systems such as supply chains, as it can account for randomness and accommodate highly detailed models. However, major challenges arise in simulating such systems, especially when they are used to support short-term decisions (e.g., the operational, maintenance, and scheduling decisions considered in this research). First, a detailed simulation requires significant computation time. Second, given the enormous amount of dynamically changing data in the system, information must be updated judiciously in the model to prevent unnecessary use of computing and networking resources. Third, methods that allow dynamic data updates during simulation execution are lacking. Overall, in a simulation-based planning and control framework, timely monitoring, analysis, and control are essential so as not to disrupt a dynamically changing system. To meet this temporal requirement and address the above-mentioned challenges, a Dynamic-Data-Driven Adaptive Multi-Scale Simulation (DDDAMS) paradigm is proposed to adaptively adjust the fidelity of a simulation model against available computational resources by incorporating dynamic data into the executing model, which then steers the measurement process for selective data update. To the best of our knowledge, the proposed DDDAMS methodology is one of the first efforts to present a coherent, integrated decision-making framework for the timely planning and control of distributed manufacturing enterprises.

To this end, a comprehensive system architecture and methodologies are first proposed, whose components include 1) a real-time DDDAM-Simulation, 2) grid computing modules, 3) a Web Service communication server, 4) a database, 5) various sensors, and 6) the real system.
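The core idea of adjusting model fidelity against available computational resources can be illustrated with a minimal sketch. All names here (FIDELITY_COSTS, select_fidelity, the three fidelity tiers and their costs) are illustrative assumptions, not the dissertation's actual interface or algorithm:

```python
# Hypothetical sketch of the DDDAMS fidelity-adaptation idea: pick the
# richest simulation model whose estimated cost fits the current compute
# budget, escalating when an abnormality has been detected.

# Assumed relative cost of one planning cycle at each fidelity tier.
FIDELITY_COSTS = {
    "low": 1.0,      # aggregate flow model
    "medium": 4.0,   # cell-level discrete-event model
    "high": 16.0,    # machine-level model with detailed part routing
}

def select_fidelity(available_budget, abnormality_detected):
    """Return the highest fidelity whose cost fits the budget.

    When an abnormality is detected while only the cheapest model is
    affordable, escalate one tier anyway so the disturbance can be
    analyzed in more detail (a deliberate budget overrun).
    """
    ordered = sorted(FIDELITY_COSTS, key=FIDELITY_COSTS.get)  # low -> high
    affordable = [f for f in ordered if FIDELITY_COSTS[f] <= available_budget]
    choice = affordable[-1] if affordable else ordered[0]
    if abnormality_detected and choice == ordered[0]:
        choice = ordered[1]
    return choice
```

The design choice sketched here (escalate fidelity on abnormality even when over budget) is one plausible policy; the dissertation's fidelity-selection and fidelity-assignment algorithms are more elaborate.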
Four algorithms are then developed and embedded into a real-time simulator to enable its DDDAMS capabilities: abnormality detection, fidelity selection, fidelity assignment, and prediction and task generation. As part of the developed algorithms, improvements are made to the resampling techniques for sequential Bayesian inference, and their performance is benchmarked in terms of resampling quality and computational efficiency. Grid computing and Web Services are used for computational resource management and interoperable communication among distributed software components, respectively. A prototype of the proposed DDDAM-Simulation was successfully implemented for preventive maintenance scheduling and part routing scheduling in a semiconductor manufacturing supply chain, with promising results.
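For context on the resampling step mentioned above, the following is a sketch of systematic resampling, a standard low-variance scheme used in sequential Bayesian inference (particle filtering). It is shown as a generic illustration of the technique being improved upon, not as the dissertation's improved variant:

```python
import random

def systematic_resample(weights):
    """Systematic resampling for a particle filter.

    Draws one random offset, then selects particles at n evenly spaced
    positions along the cumulative weight distribution. Returns the
    indices of the particles kept for the next generation.
    """
    n = len(weights)
    total = sum(weights)
    step = total / n
    u = random.uniform(0.0, step)          # single random draw
    indices, cumulative, i = [], weights[0], 0
    for _ in range(n):
        # Advance to the particle whose cumulative weight covers u.
        while u > cumulative:
            i += 1
            cumulative += weights[i]
        indices.append(i)
        u += step                           # evenly spaced positions
    return indices
```

Because only one random number is drawn per generation, systematic resampling has lower variance and lower cost than naive multinomial resampling, which is why it is a common baseline when benchmarking resampling quality and computational efficiency.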
Degree Program: Systems & Industrial Engineering
Degree Grantor: University of Arizona
Related items (by title, author, creator, and subject):
Simulating the Long House Valley: An evaluation of the role of agent-based computer simulation in archaeology. Reid, J. Jefferson; Littler, Matthew Laws, 1973- (The University of Arizona, 1998). This study presents the results of a detailed analysis of an agent-based computer simulation called Artificial Anasazi. The simulation attempts to replicate the population growth and settlement patterns of the prehistoric Kayenta Anasazi of Long House Valley in northeastern Arizona between A.D. 400-1300. Agent-based simulations model social evolution from the bottom up, using heterogeneous agents that follow simple rules, in contrast to the top-down computer simulations usually used by archaeologists. Artificial Anasazi is tested against the archaeological record of the real Long House Valley through both qualitative and quantitative methods, and an analysis of the relevant ethnographic information is presented. The ultimate goal of this study is to elucidate the potentials and pitfalls of using agent-based computer simulation as a serious research tool in archaeology.
Temporal, Spectral, and Spatial Threat Simulation Using a Towed Airborne Plume Simulator (TAPS). Taylor, Rick; Redmond, Neal; Balding, Jeff; Science Applications International Corporation; Center for Countermeasures (International Foundation for Telemetering, 2009-10). Efforts are underway to develop infrared countermeasure (IRCM) systems to defend aircraft against IR-guided surface-to-air (SAM) and air-to-air (AAM) missiles. One such system is the Large Aircraft Infrared Countermeasure (LAIRCM) system, which employs temporal, spatial, and spectral missile warning techniques. There is, however, no current technique for installed-system flight testing of such countermeasures in a realistic temporal, spatial, and spectral environment. This paper is an introduction to the Towed Airborne Plume Simulator (TAPS), a system designed to address this test shortfall. The TAPS operational concept is described, as well as techniques for simulating missile signatures.
Knowledge based simulation system--an application in controlled environment simulation system. Zeigler, Bernard P.; Zhang, Guoging, 1963- (The University of Arizona, 1988). This thesis systematically identifies the building blocks of a knowledge-based system for simulation and modelling. We present the design and implementation of the Controlled Environment Simulation System (CESS), which bridges a discrete-event simulation system (DEVS-SCHEME) and a continuous simulation system (TRNSYS). The rationale behind the approach is that a discrete or a continuous model can be abstracted to a level at which uniform treatment of these two kinds of models is possible. A top-down approach to model creation (abstraction) is proposed, in contrast to the traditional bottom-up approach. CESS is implemented in an object-oriented programming environment (SCOOPS on TI-SCHEME). A knowledge representation scheme known as the System Entity Structure is employed for model management, recording system structural knowledge, and the utilization of Artificial Intelligence techniques. Some prospective research topics are also discussed.