It is essential to objectively test how well policy models predict real-world behavior. The method used to support this assertion is a review of three SD policy models, emphasizing the degree to which each model fit the historical outcome data and how well model-predicted outcomes matched real-world outcomes as they unfolded. Findings indicate that while agreement with historical data is a favorable indication of model validity, making predictions before the actual data are known, and then comparing those predictions to the actual data, can reveal model weaknesses that might be overlooked when all of the available data are used for model development. Although this finding is based on just three cases, it compellingly demonstrates the value of using prediction to validate models, as recommended by leaders in the field. The implication for decision makers is to be cautious about analyses based on models that have been tested only against historical data. The primary contribution is to clearly demonstrate the oft-prescribed but less often performed validation method of testing model predictions against reality.
Wayne Wakeland is Professor and Systems Science Program Chair at Portland State University. He earned a B.S. and a Master of Engineering at Harvey Mudd College (1973), and a Ph.D. in Systems Science at Portland State University (1977). His current research focuses on health policy related to drug diversion, abuse, and treatment; applied data mining; and environmental/ecological sustainability.
Forecasting -- Simulation methods -- Case studies, Mathematical models -- Calibration, System theory
Statistical Models | Theory and Algorithms
Wakeland, Wayne, "Prediction: The Quintessential Model Validation Test" (2015). Systems Science Friday Noon Seminar Series. 66.