Representative exams from each course section were collected from instructors, beginning 1 year before implementation of the transformation project (year 0) and continuing for three additional years (years 1 to 3). In some cases, particularly for the chemistry and physics courses, multiple sections of a single course in a given semester used common exams; these exams were counted only once. Likewise, when instructors reused the same exam verbatim from year to year, which was rare, we coded each unique exam only once. In total, we collected and analyzed 4020 questions from 134 unique exams, fully representing all 185 course sections of the eight courses offered during the 4-year span (Table 1). Identifying information about the instructor(s), course, section, and term was removed from each exam, so that the exams were identifiable only to the researcher who organized the data. Each exam, and each question on each exam, was tagged with a unique identification number; multipart questions were coded together as a single cluster (e.g., a question with parts "a" and "b" was treated as one cluster) (16). We also recorded the number of points associated with each question.
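The tagging scheme above can be sketched as a simple record structure. This is an illustrative sketch only; the field names (`exam_id`, `cluster_id`, `points`) and example values are assumptions for clarity, not the authors' actual data format.

```python
# Hypothetical record scheme for de-identified exam questions.
# Multipart questions share a cluster_id so that parts "a" and "b"
# are coded together as a single cluster.
from dataclasses import dataclass

@dataclass
class Question:
    exam_id: str      # de-identified exam identifier
    question_id: str  # unique ID for this question part
    cluster_id: str   # shared by all parts of a multipart question
    points: float     # points associated with this part

# A two-part question coded as one cluster (example values are invented):
exam_q4 = [
    Question("EX-017", "EX-017-Q4a", "EX-017-Q4", 3.0),
    Question("EX-017", "EX-017-Q4b", "EX-017-Q4", 2.0),
]

# Total points associated with the cluster as a whole:
cluster_points = sum(q.points for q in exam_q4 if q.cluster_id == "EX-017-Q4")
```

Storing the cluster identifier on each part, rather than merging the parts, preserves the per-part point values while still letting the cluster be coded as one unit.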

A detailed summary by course is provided in table S2.

Interrater reliability was established on practice items using a subset of the exams, as described in Laverty et al. (16); all items within each disciplinary area were then coded with the 3D-LAP by team members with expertise in that discipline [we refer readers to the S1 Exemplars Supporting Information in Laverty et al. for a substantial set of example 3DL assessment items, and to Underwood et al. (35) and Laverty and Caballero (36) for comparisons between traditional and 3DL assessment items]. In alignment with the intent of the Framework (9) that the three dimensions be integrated, we report the results in a binary way: an assessment item either met the criteria for all three dimensions or it did not. After all items were coded, we determined the fraction of points on each exam associated with questions coded as 3D. This metric allows us to compare different sections of the same course within a discipline and to compare courses across disciplines.
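The exam-level metric described above can be sketched as follows. The data values and the tuple layout (`exam_id`, `points`, `is_3d`) are invented for illustration; only the computation, the fraction of each exam's points carried by items coded as 3D under the binary criterion, reflects the procedure in the text.

```python
# Sketch of the per-exam 3D fraction: sum the points on items coded as 3D
# (all three dimensions met) and divide by the total points on the exam.
from collections import defaultdict

# (exam_id, points, is_3d) — is_3d is the binary code: True only when an
# item met the criteria for all three dimensions. Example data only.
coded_items = [
    ("EX-017", 5.0, True),
    ("EX-017", 10.0, False),
    ("EX-017", 5.0, True),
    ("EX-042", 8.0, False),
    ("EX-042", 12.0, False),
]

total_points = defaultdict(float)
points_3d = defaultdict(float)
for exam_id, points, is_3d in coded_items:
    total_points[exam_id] += points
    if is_3d:
        points_3d[exam_id] += points

fraction_3d = {exam: points_3d[exam] / total_points[exam] for exam in total_points}
# EX-017 -> 0.5 (10 of 20 points on 3D items); EX-042 -> 0.0
```

Because the fraction is normalized by each exam's total points, exams with different lengths and point scales can be compared directly, which is what enables comparisons across sections and across disciplines.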
