Sponsor
Portland State University. Department of Engineering and Technology Management
First Advisor
Tugrul U. Daim
Date of Publication
Fall 11-7-2016
Document Type
Dissertation
Degree Name
Doctor of Philosophy (Ph.D.) in Technology Management
Department
Engineering and Technology Management
Language
English
Subjects
Multiple criteria decision making -- Mathematical models, Research institutes -- United States -- Management -- Evaluation, National Science Foundation (U.S.). Industry/University Cooperative Research Centers Program
DOI
10.15760/etd.3276
Physical Description
1 online resource (xvii, 232 pages)
Abstract
This research provides performance metrics for cooperative research centers that enhance translational research and are formed through partnerships among government, industry, and academia. Centers are part of complex ecosystems that vary greatly in the type of science conducted, their organizational structures, and their expected outcomes. Their ability to realize their objectives depends on transparent measurement systems that support decision making in research translation.
A generalizable hierarchical decision model that uses both quantitative and qualitative metrics is developed based on program goals. Mission-oriented metrics are then used to compare the effectiveness of the cooperative research centers through case studies.
The US National Science Foundation (NSF) Industry/University Cooperative Research Centers (IUCRC) program is chosen as the domain for studying organizational effectiveness because of its longevity, clear organizational structure, repeated use, and availability of data. Not unlike a franchise business model, the program has been replicated numerous times, gaining recognition as one of the most successful federally funded collaborative research center (CRC) programs. Understanding IUCRCs is important because they are a key US policy lever for enhancing translational research. While the program model is distinctive, this research begins to close the gap in comparing CRCs by introducing a generalizable model and method into the literature.
Through a literature review, program objectives, goals, and outputs are linked together to construct a four-level hierarchical decision model (HDM). A structured model development process shows how experts validate the model's content and construct using these linked concepts.
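To make the four-level structure concrete, here is a minimal Python sketch of how mission, objectives, goals, and output metrics could be nested; all names are hypothetical placeholders, not the criteria actually derived in the dissertation.

# A minimal, hypothetical sketch of the four-level hierarchy
# (mission -> objectives -> goals -> output metrics) as nested data.
# All names are invented placeholders, not the dissertation's criteria.
hdm = {
    "mission": "science and engineering research center performance",
    "objectives": {
        "advance research": {
            "goals": {
                "create knowledge": ["publications", "citations"],
                "transfer technology": ["patents", "licenses"],
            },
        },
        "develop workforce": {
            "goals": {
                "educate students": ["graduates", "member hires"],
            },
        },
    },
}

# Leaf lists hold the output-level metrics that experts would weight.
print(hdm["objectives"]["advance research"]["goals"]["create knowledge"])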
A subjective data collection approach is described, showing how expert pairwise-comparison data are collected, analyzed, and quantified to establish weights for each of the decision criteria. Several methods are discussed for measuring and analyzing inconsistency and disagreement until acceptable levels are reached.
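The abstract does not specify the quantification algorithm, so the following is only an illustrative sketch: it assumes an AHP-style geometric-mean derivation of weights from a single expert's reciprocal pairwise-comparison matrix together with a Saaty-style consistency check. The dissertation's own HDM quantification and inconsistency measures may differ.

import numpy as np

# Hypothetical illustration only: derive criterion weights from one expert's
# reciprocal pairwise-comparison matrix using row geometric means, then
# compute a Saaty-style consistency ratio.

def weights_from_pairwise(A):
    """Normalized row geometric means of a reciprocal comparison matrix."""
    gm = np.prod(A, axis=1) ** (1.0 / A.shape[0])
    return gm / gm.sum()

def consistency_ratio(A, w):
    """Consistency index divided by the tabulated random index."""
    n = A.shape[0]
    lam_max = np.mean((A @ w) / w)       # estimate of the principal eigenvalue
    ci = (lam_max - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # random index values (tabulated)
    return ci / ri

# One expert compares three goal-level criteria (values are invented).
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])
w = weights_from_pairwise(A)
print("weights:", np.round(w, 3))
print("consistency ratio:", round(consistency_ratio(A, w), 3))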
Six case studies are used to compare results, evaluate the impact of expert disagreement, and assess criterion-related validity. Comparative analysis demonstrates the model's ability to efficiently identify the criteria that contribute most to each center's performance score. Applying this information, specific performance improvement recommendations are presented for each center.
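As a hedged illustration of how weighted criteria roll up into a center-level score and point to improvement opportunities, the sketch below uses invented criterion names, weights, and normalized values; it is not the dissertation's actual scoring.

import numpy as np

# Hypothetical scoring of a single center: global criterion weights are
# applied to the center's normalized performance values. Criterion names,
# weights, and values are invented for illustration.
criteria = ["publications", "patents", "student placements", "member retention"]
global_weights = np.array([0.35, 0.15, 0.30, 0.20])  # sum to 1
center_values = np.array([0.80, 0.40, 0.65, 0.90])   # normalized to [0, 1]

score = float(global_weights @ center_values)
print(f"overall performance score: {score:.3f}")

# A high weight paired with low performance marks a candidate target
# for a center-specific improvement recommendation.
gaps = global_weights * (1.0 - center_values)
print("largest improvement opportunity:", criteria[int(np.argmax(gaps))])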
Upon review, experts generally agreed with the results. The criterion-related validity analysis shows how the performance measurement scoring system can be used for comparative analysis among science- and engineering-focused research centers. Dendrograms highlight where experts disagree and provide a basis for further disagreement analysis. Judgment quantification values for different expert clusters are substituted into the model one at a time (OAT), providing a method to analyze how decisions shaped by these disagreements affect the model's output. This research contributes to the field by introducing a generalizable model and measurement system for comparing the performance of NSF-supported science- and engineering-focused research centers.
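One plausible reading of the disagreement analysis, sketched here with invented expert and center data: cluster the experts' weight vectors hierarchically (the tree a dendrogram would draw), then substitute each cluster's mean weights into the scoring one at a time (OAT) to see how center scores shift. SciPy's hierarchical clustering stands in for whatever method the dissertation actually uses.

import numpy as np
from scipy.cluster.hierarchy import dendrogram, fcluster, linkage

# Hypothetical sketch of the disagreement analysis: cluster experts by their
# weight vectors, then substitute each cluster's mean weights one at a time
# (OAT) to see how center scores move. All numbers are invented.
expert_weights = np.array([
    [0.40, 0.30, 0.20, 0.10],
    [0.38, 0.32, 0.18, 0.12],
    [0.15, 0.25, 0.35, 0.25],
    [0.18, 0.22, 0.38, 0.22],
])
Z = linkage(expert_weights, method="ward")
tree = dendrogram(Z, no_plot=True)               # structure behind the dendrogram
labels = fcluster(Z, t=2, criterion="maxclust")  # two disagreement clusters

center_values = np.array([[0.80, 0.40, 0.65, 0.90],   # center A
                          [0.55, 0.75, 0.70, 0.60]])  # center B

for c in sorted(set(labels)):
    cluster_w = expert_weights[labels == c].mean(axis=0)
    scores = center_values @ cluster_w
    print(f"cluster {c}: weights {np.round(cluster_w, 2)} -> scores {np.round(scores, 3)}")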
Rights
In Copyright. URI: http://rightsstatements.org/vocab/InC/1.0/ This Item is protected by copyright and/or related rights. You are free to use this Item in any way that is permitted by the copyright and related rights legislation that applies to your use. For other uses you need to obtain permission from the rights-holder(s).
Persistent Identifier
http://archives.pdx.edu/ds/psu/18786
Recommended Citation
Gibson, Elizabeth Carole, "A Measurement System for Science and Engineering Research Center Performance Evaluation" (2016). Dissertations and Theses. Paper 3285.
https://doi.org/10.15760/etd.3276