Auburn University Electronic Theses and Dissertations


Learning Explanatory Models for Robust Decision-Making Under Deep Uncertainty


Metadata (field: value, language where recorded)

dc.contributor.advisor: Yilmaz, Levent
dc.contributor.author: Rodriguez, Brodderick
dc.date.accessioned: 2020-04-06T20:31:59Z
dc.date.available: 2020-04-06T20:31:59Z
dc.date.issued: 2020-04-06
dc.identifier.uri: http://hdl.handle.net/10415/7102
dc.description.abstract (en_US): Decision-makers rely on simulation models to predict and investigate the implications of their decisions. However, the use of monolithic simulation models based on fixed assumptions lacks the adaptivity needed when the real-world system contains significant uncertainty. Exploratory modeling is a methodology that involves iterative and incremental exploration of alternative hypotheses about the underlying assumptions of the real-world system under a broad range of contextual conditions. Through exploration, decision-makers gain an understanding of the breadth of the system and pinpoint robust policies. However, exploratory modeling tools lack mechanisms to generate, evaluate, and learn from the results of simulating an ensemble of alternative, possibly competing models. Additionally, exploratory modeling over a population of models generates a significant amount of data that may obscure fundamental system mechanics and their interaction with the context. This thesis introduces a modeling architecture with (1) a feature-oriented generative modeling mechanism for rapid derivation of alternative causal model structures and (2) a rule-based machine learning strategy, realized as a Learning Classifier System, that produces explanatory models in the form of a population of rules and associated visual heat-maps conveying the robustness and resilience of alternative system designs. The use of both of these mechanisms accelerates the decision-support exercise and yields more intuitive interpretations of system insights when modeling for decision-making under deep uncertainty.
dc.rights (en_US): EMBARGO_GLOBAL
dc.subject (en_US): Computer Science and Software Engineering
dc.title (en_US): Learning Explanatory Models for Robust Decision-Making Under Deep Uncertainty
dc.type (en_US): Master's Thesis
dc.embargo.length (en_US): MONTHS_WITHHELD:6
dc.embargo.status (en_US): EMBARGOED
dc.embargo.enddate (en_US): 2020-10-06
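To make the approach described in the abstract concrete, the sketch below is a minimal, heavily simplified illustration of exploratory modeling over an ensemble of alternative model structures, summarized with LCS-style generalized rules. It is not the architecture implemented in the thesis: the feature names, the toy simulation, and the robustness scoring are all hypothetical stand-ins, and the real Learning Classifier System would additionally involve fitness, credit assignment, and rule discovery.

```python
# Illustrative sketch only: a toy ensemble-of-models exploration loop with an
# LCS-style rule summary.  All names and the simulation are hypothetical
# stand-ins for the thesis's actual architecture.
import itertools
import random
import statistics

# Alternative structural assumptions about the real-world system, encoded as
# binary features.  Each combination yields one candidate causal model variant.
FEATURES = ("delayed_feedback", "saturating_demand", "adaptive_competitor")

def simulate(model_bits, policy, context):
    """Hypothetical simulation: scalar outcome for one model variant under one
    sampled context and one candidate policy."""
    base = sum(b * w for b, w in zip(model_bits, (0.4, -0.3, 0.6)))
    return policy * (1.0 + base) + context  # stand-in dynamics

def make_rule(model_bits):
    """Generalize a model-variant bit string into an LCS-style condition by
    randomly replacing bits with the '#' wildcard."""
    return "".join("#" if random.random() < 0.3 else str(b) for b in model_bits)

def explore(policy, n_contexts=200, seed=0):
    """Simulate the whole ensemble of model variants under sampled contexts,
    then summarize outcomes per generalized rule condition."""
    random.seed(seed)
    ensemble = list(itertools.product((0, 1), repeat=len(FEATURES)))
    rules = {}  # condition string -> outcomes across matching variants/contexts
    for model_bits in ensemble:
        outcomes = [simulate(model_bits, policy, random.gauss(0, 0.2))
                    for _ in range(n_contexts)]
        rules.setdefault(make_rule(model_bits), []).extend(outcomes)
    # A rule is treated as "robust" here if its worst-case outcome stays high;
    # a real analysis would feed these summaries into heat-maps over policies.
    return {cond: (statistics.mean(o), min(o)) for cond, o in rules.items()}

if __name__ == "__main__":
    for condition, (mean_out, worst) in sorted(explore(policy=1.0).items()):
        print(f"rule {condition}: mean={mean_out:+.2f} worst-case={worst:+.2f}")
```

Run as-is, this prints one line per generalized rule, pairing an average outcome with a worst-case outcome; sweeping the `policy` argument over a grid is one simple way to produce the kind of robustness heat-map the abstract mentions.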

