Rasch analysis was used to develop the final FUNMOVES assessment tool, with appropriate modifications made after each iteration to enhance its structural validity. Rasch analysis is a form of probabilistic mathematical modelling that has several advantages over classical approaches to testing outcome measures (such as exploratory factor analysis). It determines whether an outcome measure’s psychometric properties permit raw item scores to be summed into a total outcome score [45]; in the case of FUNMOVES, the activities form the ‘items’ of the evaluation. Moreover, the Rasch approach combines evaluation of a number of psychometric issues such that, if item responses (the scores) fit the Rasch model, the summed ordinal scores can be transformed to interval-level scaling [45]. It also allows evaluation not only of whether all items measure the same overarching construct, but also of (i) whether there are redundant items in the scale (local dependency) and (ii) how changes to activities (e.g. changes to scoring) may affect the validity of the measure. It is therefore useful when a new scale, such as FUNMOVES, is developed from first principles.

Rasch analysis works on the premise that the ability to complete an ‘item’ depends on (i) the difficulty of the item and (ii) the ability of the participant [48]. It uses an item-response model to place participant ability and item difficulty on a shared continuum (the logit scale) [49]: items positioned higher on the logit scale are more difficult, and individuals positioned higher on the scale are more capable. Rasch analysis uses this logit scale to assess the psychometric characteristics of assessment tools [50]. The Rasch analyses in these studies were conducted on each school’s item responses, gathered using the procedure outlined above. Because response formats varied between items, the analyses used the unrestricted partial credit model in RUMM 2030 software [51].
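The relationship between ability, difficulty, and response probability described above can be illustrated with a minimal sketch. The first function is the standard dichotomous Rasch model; the second is a simplified version of the partial credit model used here, in which ordered score categories are separated by Rasch-Andrich thresholds. This is an illustrative implementation only, not the estimation routine used by RUMM 2030; the function names and the example threshold values are our own.

```python
import math

def rasch_probability(ability, difficulty):
    """Dichotomous Rasch model: probability of success given person
    ability and item difficulty, both on the logit scale."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

def partial_credit_probabilities(ability, thresholds):
    """Partial credit model: probability of each ordered score category
    (0..len(thresholds)) for one item, given the thresholds between
    adjacent categories on the logit scale."""
    # Each category's log-numerator is the cumulative sum of
    # (ability - threshold) over the thresholds passed so far.
    numerators = [1.0]  # score 0: empty sum, exp(0) = 1
    cumulative = 0.0
    for tau in thresholds:
        cumulative += ability - tau
        numerators.append(math.exp(cumulative))
    denominator = sum(numerators)
    return [n / denominator for n in numerators]

# A person whose ability equals a threshold is equally likely to score
# in the two categories that threshold separates.
print(partial_credit_probabilities(0.0, [0.0]))  # [0.5, 0.5]
```

A higher ability relative to an item's difficulty yields a higher probability of success, which is why persons and items can be compared on the one logit scale.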
Each Rasch analysis generates summary statistics, including mean ‘person’ and ‘item’ locations and a chi-square test of fit to the Rasch model. A non-significant chi-square value indicates no difference between the scores expected by the model and those observed in testing, suggesting that items measure consistently across different ability levels [52]. Internal consistency is also calculated using the person separation index (PSI). An assessment tool able to differentiate between two or more groups of ability should have a PSI value of ≥0.7 [53].
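The PSI is conceptually the proportion of variance in the person logit estimates that is not measurement error, analogous to a reliability coefficient. A minimal sketch of that calculation follows; the function name is ours, and this is a simplified illustration rather than the exact formula implemented in RUMM 2030.

```python
def person_separation_index(person_estimates, standard_errors):
    """PSI sketch: share of observed variance in person logit estimates
    that is not attributable to measurement error."""
    n = len(person_estimates)
    mean = sum(person_estimates) / n
    # Sample variance of the person location estimates (logits).
    observed_var = sum((x - mean) ** 2 for x in person_estimates) / (n - 1)
    # Mean squared standard error of those estimates.
    error_var = sum(se ** 2 for se in standard_errors) / n
    return (observed_var - error_var) / observed_var

# Well-spread persons measured precisely give a high PSI.
psi = person_separation_index([-2.0, -1.0, 0.0, 1.0, 2.0], [0.5] * 5)
print(round(psi, 2))  # 0.9
```

A PSI of ≥0.7 on this scale indicates enough precision, relative to the spread of abilities in the sample, to distinguish at least two ability groups.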

Analyses for individual items (i.e. each activity within FUNMOVES) included fit to the Rasch model (measured using chi-square statistics and fit residuals), response category thresholds, item response bias (differential item functioning, DIF), and response dependency. Unidimensionality was assessed using principal component analysis, which identified the two most divergent subsets of items within the first factor [54]. Person estimates were calculated for each of the two subsets, and differences between these estimates were assessed using t-tests. For a measure to be classified as unidimensional, no more than 5% of these tests should be significant, or the lower bound of the binomial confidence interval should be below 5% [52]. Rasch analysis is a more accurate and comprehensive assessment of structural validity than factor analysis [55] and has previously been used to validate motor skill measures [56-60]. Where FUNMOVES was not unidimensional or showed response dependency, items were removed; to ameliorate disordered thresholds, two or more adjacent response categories may be combined. To evaluate the external structural validity of FUNMOVES, in each study an ANOVA was conducted on mean logit scores to test for significant differences between school year groups, between genders, and according to whether or not teachers thought each child had motor difficulties prior to testing.
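The unidimensionality decision rule above can be sketched as follows: given the p-values of the per-person t-tests comparing estimates from the two item subsets, count the proportion that are significant and compute the lower bound of a 95% binomial confidence interval on that proportion (here via a normal approximation). The function names are ours and this is an illustration of the decision rule, not the exact procedure in the cited software.

```python
from statistics import NormalDist

def significant_proportion_ci(p_values, alpha=0.05):
    """Proportion of significant t-tests and the lower bound of a 95%
    binomial confidence interval on that proportion (normal approximation)."""
    n = len(p_values)
    prop = sum(1 for p in p_values if p < alpha) / n
    se = (prop * (1 - prop) / n) ** 0.5
    z = NormalDist().inv_cdf(0.975)  # ~1.96
    return prop, prop - z * se

def is_unidimensional(p_values, alpha=0.05, criterion=0.05):
    """Criterion from [52]: at most 5% significant tests, or the CI
    lower bound below 5%."""
    prop, lower = significant_proportion_ci(p_values, alpha)
    return prop <= criterion or lower < criterion

# 3 significant tests out of 100 (3%) passes the 5% criterion.
print(is_unidimensional([0.01] * 3 + [0.50] * 97))  # True
```

With 20 of 100 tests significant, by contrast, both the proportion (20%) and the CI lower bound exceed 5%, so the measure would fail the unidimensionality criterion.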
