
Learning Explanatory Models for Robust Decision-Making Under Deep Uncertainty

Date

2020-04-06

Author

Rodriguez, Brodderick

Type of Degree

Master's Thesis

Department

Computer Science and Software Engineering

Abstract

Decision-makers rely on simulation models to predict and investigate the implications of their decisions. However, a monolithic simulation model built on fixed assumptions lacks the adaptivity needed when the real-world system contains significant uncertainty. Exploratory modeling is a methodology that involves iterative and incremental exploration of alternative hypotheses about the underlying assumptions of the real-world system under a broad range of contextual conditions. Through exploration, decision-makers gain an understanding of the breadth of the system and pinpoint robust policies. However, exploratory modeling tools lack mechanisms to generate, evaluate, and learn from the results of simulating an ensemble of alternative, possibly competing models. Additionally, exploratory modeling over a population of models generates a significant amount of data that may obscure fundamental system mechanics and their interaction with the context. This thesis introduces a modeling architecture with (1) a feature-oriented generative modeling mechanism for rapid derivation of alternative causal model structures and (2) a rule-based machine learning strategy, in the form of a Learning Classifier System, that produces explanatory models as a population of rules with associated visual heat-maps conveying the robustness and resilience of alternative system designs. Together, these mechanisms accelerate the decision-support exercise and yield more intuitive interpretations of system insights when modeling for decision-making under deep uncertainty.
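
The second contribution centers on a Learning Classifier System (LCS), which evolves a population of condition-action rules. As a rough illustration only, the Python sketch below shows one way such a rule population might classify scenarios drawn from an exploratory-modeling ensemble; the feature names, intervals, fitness values, and policy labels are hypothetical assumptions for illustration and are not drawn from the thesis implementation.

    # Minimal, hypothetical sketch of an LCS-style rule population
    # classifying exploratory-modeling scenarios (not the thesis code).
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Rule:
        # Each condition maps a scenario feature to a (lo, hi) interval;
        # a feature absent from the dict acts as a wildcard ("don't care").
        conditions: dict[str, tuple[float, float]]
        action: str     # policy this rule recommends for matched scenarios
        fitness: float  # accuracy-based fitness learned during simulation

        def matches(self, scenario: dict[str, float]) -> bool:
            # Assumes each scenario supplies every conditioned feature.
            return all(lo <= scenario[f] <= hi
                       for f, (lo, hi) in self.conditions.items())

    def classify(population: list[Rule],
                 scenario: dict[str, float]) -> Optional[str]:
        """Return the action of the fittest matching rule, if any."""
        matched = [r for r in population if r.matches(scenario)]
        return max(matched, key=lambda r: r.fitness).action if matched else None

    # Hypothetical rules and a scenario sampled from an ensemble run.
    population = [
        Rule({"demand_growth": (0.00, 0.03)},
             action="baseline_policy", fitness=0.72),
        Rule({"demand_growth": (0.03, 0.10), "capacity_cost": (0.0, 50.0)},
             action="expand_capacity", fitness=0.88),
    ]
    print(classify(population, {"demand_growth": 0.05, "capacity_cost": 30.0}))
    # -> "expand_capacity"

Aggregating such classifications over the whole ensemble is what would yield the heat-maps the abstract describes: each cell summarizes how often a policy remains preferred across the contextual conditions it covers.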