Evaluation of the Influence of Wording Changes and Course Type on Motivation Instrument Functioning in Chemistry

Published In

Chemistry Education Research and Practice

Document Type


Publication Date

2018

Subjects

Science -- Study and teaching (Higher), Science education -- Research, Science education -- Methodology, Chemistry -- Study and teaching (Higher)


Increased understanding of the importance of the affective domain in chemistry education research has led to the development and adaptation of instruments to measure chemistry-specific affective traits, including motivation. Many of these instruments are adapted from other fields by using the word ‘chemistry’ in place of other disciplines or more general ‘science’ wording. Psychometric evidence is then provided for the functioning of the newly adapted instrument. When an instrument is adapted from general language to specific (e.g. replacing ‘science’ with ‘chemistry’), an opportunity exists to compare the functioning of the original instrument in the same context as the adapted instrument. This information is important for understanding which types of modifications may have small or large impacts on instrument functioning and in which contexts these modifications may have more or less influence. In this study, data were collected from the online administration of scales from two science motivation instruments in chemistry courses for science majors and for non-science majors. Participants in each course were randomly assigned to view either the science version or the chemistry version of the items. Response patterns indicated that students respond differently to different wordings of the items, with generally more favorable responses to the science wording. Confirmatory factor analysis was used to investigate the internal structure of each instrument; however, acceptable data-model fit was not obtained under any administration condition. Additionally, no discernible pattern could be detected regarding the conditions showing better data-model fit. These results suggest that even seemingly small changes to item wording and administration context can affect instrument functioning, especially if the change in wording affects the construct measured by the instrument.
This research further supports the need to provide psychometric evidence of instrument functioning each time an instrument is used and before any comparisons are made of responses to different versions of the instrument.


© The Royal Society of Chemistry 2018


This is the author’s version of a work that was subsequently published in Chemistry Education Research and Practice. Changes resulting from the publishing process, such as editing, corrections, structural formatting, and other quality control mechanisms, may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Chemistry Education Research and Practice, 2018, 19, 184-198, and can be found online at: https://doi.org/10.1039/C7RP00181A.
