
Our sampling process followed five criteria. First, the amount of funding: we used the work of Viergever and Hendriks [41, 42] to gain an overview of the largest funders in the field of health and life sciences in terms of annual spending. We opted for this particular field of research because funders and researchers in this area are considered rather advanced in their data sharing practices [24, 43, 44]. Second, we added several European funding agencies that are the leading national agencies in their respective countries. This allowed us to investigate potential collaboration among funding agencies and convergence of policies within Europe, as several agencies are connected via funder networks such as Science Europe [45]. Third, we reviewed data sharing policies available on funders’ websites to identify agencies with policies on data sharing, and primarily contacted advanced and innovative agencies with detailed and elaborate policy documents. At a later stage, we also asked early interviewees which other agencies they deemed exemplary or progressive and updated our sample accordingly. Fourth, we expected significant differences between funders specialised in the funding of health and life science research and funders invested in a range of different disciplines; the anticipated challenge of addressing a spectrum of scientific fields with policies led us to include both kinds of agencies. Finally, we included some private funders to check for potential deviations from public funders. Our sampling criteria can be summed up in the following sampling procedure (Fig 1).

Overall, we contacted 33 funding agencies from Europe and North America, 27 of which are public and six of which are private. After approaching their respective Data Sharing, Open Research Data, Open Science or Open Access departments via email (with up to two reminders), our final sample included 16 funding agencies available for interviews (48% positive response rate). In total, we interviewed funders from ten different countries, six from the Anglosphere and ten from continental Europe. Seven of them specialise in health and life sciences, while nine of them fund research from a spectrum of different scientific fields. The vast majority of them are public funding agencies. Of the 17 unsuccessful interview requests, three agencies explicitly declined our request and 14 did not respond at all. To reduce the risk of any potential re-identification of the 16 interviewees and to ensure their anonymity, we assigned three different pseudonyms to each interview, leading to a list of 48 ciphers (Interview 01–48). This approach of using multiple ciphers for each interview is the most suitable way for us to cite passages from the interviews in the results section and maximise the scientific value of the interviews, while at the same time protecting the anonymity of the interviewees. It is particularly important because some interviewees made critical statements about their organisations’ efforts and strategies.
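The sample arithmetic reported above can be double-checked with a short sketch. The counts are taken directly from the text; the variable names are ours and purely illustrative:

```python
# Sanity check of the reported sample figures (values as stated in the text).
contacted = 33                # funding agencies approached
interviewed = 16              # agencies available for interviews
declined = 3                  # explicitly declined
no_response = 14              # did not respond at all
pseudonyms_per_interview = 3  # three ciphers assigned per interview

# Unsuccessful requests should account for the remainder of the sample.
assert interviewed + declined + no_response == contacted

response_rate = round(interviewed / contacted * 100)
print(response_rate)  # 48 (% positive response rate)

ciphers = interviewed * pseudonyms_per_interview
print(ciphers)        # 48 (Interview 01-48)
```

The check confirms internal consistency: 16 of 33 contacted agencies is a 48% positive response rate, and three pseudonyms per interview yield the 48 ciphers cited in the results.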
