We employed a semantic relatedness judgement task in which participants decided whether the words in each pair were related or unrelated. This basic task was unchanged across conditions, but the nature of the semantic relationship varied across semantically related trials: some word pairs were from the same category, while others were thematically related. We manipulated the opportunity for top-down controlled semantic retrieval by changing the specificity of the instructions. On half of the trials, a specific task instruction, ‘Category?’ or ‘Thematic?’, was presented so that participants knew in advance which type of semantic relationship would be relevant on that trial (Known Goal). On the other trials, the semantic decision about the word pair was preceded by a non-specific instruction, ‘Category or Thematic?’ (Unknown Goal). This allowed us to compare the brain's response to meaningful items during a semantic task when participants either knew in advance the kind of information that would be relevant to the subsequent decision, or lacked this information and simply decided on the basis of the words presented. As a baseline condition, meaningless letter strings were presented and participants decided whether the two strings contained the same number of letters. For these trials, a ‘Letter number?’ task instruction was presented (see Fig. 1).

Fig. 1. Illustration of Known and Unknown Taxonomic, Known and Unknown Thematic, and Letter string trials.

As Fig. 1 indicates, each trial started with a fixation cross presented in the centre of the screen for a jittered interval of 1–3 s. The task instruction slide then appeared for 1 s, followed by a jittered inter-stimulus fixation of 1–3 s. Next, the probe item was presented on the screen for 1 s. After a longer jittered fixation interval lasting 2–4 s, the target was presented for 3 s. This was the response period, during which participants made their judgements (i.e. decisions about the semantic relationship between the words, or about whether the number of letters was the same) as quickly and as accurately as possible. They pressed buttons on a response box with their right index and middle fingers to indicate YES and NO responses. The overall likelihood of a YES or NO response was the same across conditions.
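The trial timings above can be laid out as an event sequence. The sketch below is illustrative only: the function name is hypothetical, and it assumes the jittered intervals were drawn from uniform distributions, which the text does not specify.

```python
import random

def build_trial_timeline(rng=None):
    """Sketch of one trial's event sequence with durations in seconds,
    following the timings described in the text. Uniform sampling of
    the jittered intervals is an assumption, not stated in the article."""
    rng = rng or random.Random()
    return [
        ("fixation", rng.uniform(1, 3)),              # jittered 1-3 s
        ("instruction", 1.0),                         # e.g. 'Category?'
        ("inter-stimulus fixation", rng.uniform(1, 3)),
        ("probe", 1.0),
        ("fixation", rng.uniform(2, 4)),              # jittered 2-4 s
        ("target / response window", 3.0),
    ]

timeline = build_trial_timeline()
total = sum(duration for _, duration in timeline)
# Given these bounds, one trial spans between 9 s and 15 s.
```

With these bounds, total trial duration varies between 9 s and 15 s, which is consistent with a jittered event-related fMRI design.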

Stimuli were presented in five runs, each containing 45 trials: 6 related and 3 unrelated trials in each of the four experimental conditions (36 trials in total), plus 9 letter string trials (6 “same number” and 3 “different number” trials). Each run lasted 9 minutes, and trials were presented in a random order. Runs were separated by a short break and started with a 9-second alerting slide (i.e. ‘Experiment starts soon’).
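The composition of a single run can be checked by assembling the trial list from the counts above. This is a minimal sketch: the condition labels are illustrative, not taken from the authors' stimulus files, and the shuffle stands in for whatever randomisation the authors used.

```python
import random

# Illustrative condition labels; the article names the factors
# (Known/Unknown goal x Taxonomic/Thematic) but not these strings.
CONDITIONS = [
    "Known Taxonomic", "Unknown Taxonomic",
    "Known Thematic", "Unknown Thematic",
]

def build_run(rng=None):
    """Assemble one run's 45 trials: 6 related + 3 unrelated per
    experimental condition, plus 9 letter string trials, in random order."""
    rng = rng or random.Random()
    trials = []
    for cond in CONDITIONS:
        trials += [(cond, "related")] * 6 + [(cond, "unrelated")] * 3
    trials += [("Letter string", "same number")] * 6
    trials += [("Letter string", "different number")] * 3
    rng.shuffle(trials)  # trials were presented in a random order
    return trials

run = build_run()
```

Note that every condition, including the letter string baseline, has the same 6:3 ratio of YES to NO responses, matching the statement that response likelihood was equated across conditions.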

Before entering the scanner, participants received detailed instructions, and the experimenter explained the distinction between taxonomic and thematic relationships. Participants were told that taxonomically-related items are from the same category and share physical features, and were given the example of Kangaroo and Hedgehog. They were told that thematically-related items are found or used together, and were given the example of Leaves and Hedgehog. To ensure participants fully understood this distinction, and to increase their familiarity with the task format, they completed a 15-trial practice block containing all types of judgement. They received feedback about their performance and, if accuracy was below 75%, repeated the practice trials (this additional training was needed for only one participant, who had pressed the wrong response buttons).
