Research reproducibility was assessed using the markers of pre-registration; sharing of protocols, data, materials, and analysis scripts; replication; and open access publishing (Table 1). Research transparency was assessed using the markers of funding source and conflicts of interest declarations. Inter-rater reliability of the two researchers' independent coding was quantified with Krippendorff's alpha [33], computed in Python 3.6 (https://github.com/HumanBehaviourChangeProject/Automation-InterRater-Reliability).
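As a minimal sketch of this reliability calculation (illustrative only; the authors' actual script is in the repository linked above), Krippendorff's alpha for two coders can be computed with the third-party `krippendorff` Python package. The codings below are hypothetical nominal values, e.g. 0 = marker absent, 1 = marker present.

```python
import numpy as np
import krippendorff  # pip install krippendorff

# Rows = coders, columns = coded items; np.nan marks a missing coding.
# Hypothetical example data for two independent coders.
reliability_data = np.array([
    [1, 0, 1, 1, 0, np.nan, 1],  # coder 1
    [1, 0, 1, 0, 0, 1,      1],  # coder 2
])

# Nominal level of measurement suits binary present/absent codes.
alpha = krippendorff.alpha(reliability_data=reliability_data,
                           level_of_measurement="nominal")
print(f"Krippendorff's alpha: {alpha:.3f}")
```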