Emotions
Amygdala
Adolphs et al. (1994; 1995) described patient SM, who had bilateral (both sides of the brain) amygdala lesions caused by a disease (Urbach-Wiethe disease)
- Impaired recognition of emotion in facial expressions --> a specific fear-recognition deficit when asked to rate Ekman faces on a 0-5 (not at all to very much) scale
- Also a specific fear deficit when asked to draw faces from memory: for "afraid" she drew a baby --> potentially because when she was a child she encountered a vicious dog
- The deficit is not because:
-she is unfamiliar with the concept of fear: she can describe situations that elicit fear, and has experienced fearful situations (hence the baby drawing)
-she lacks emotional responses in general (the deficit is fear-specific)
= suggests critical role of the amygdala in fear processing -> in line with animal work using fear conditioning that suggests the amygdala has a role in the rapid, automatic detection of fear/threat
BUT, SM can rapidly detect & process fearful faces like controls, despite having no amygdala
- Tsuchiya et al. (2009): SM saw a target stimulus (a fearful or angry face, or a scene showing threat) next to neutral stimuli for 40 ms (unmasked) and had to push a button as rapidly as possible to indicate which face showed more fear/anger or which scene was more threatening (a test of rapid detection of fear- and threat-related stimuli)
- SM’s performance on this task was completely normal for all categories
- they used NimStim faces, as SM might be overtrained on Ekman faces. BUT SM might just be discriminating emotional from neutral faces, not fear specifically
- 2nd expt: faces morphed between neutral and fear; p's asked to categorise each as neutral or afraid. SM needed a higher intensity of fearful morph than controls in order to categorise a face as afraid, yet when given a visual search task requiring rapid detection of fearful compared to neutral faces she performed like controls. Same results for happy-fear and sad-fear morphs (see the sketch below)
= SM implicitly discriminated between fearful and other expressions with the same fear-category effects as normal p's, despite not being able to explicitly recognise fearful faces
- Concluded the amygdala is not essential for nonconscious rapid fear detection, though it is still possible that it participates in such processing
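One way to picture the explicit-recognition shift in the morph task (a minimal sketch with made-up numbers, not Tsuchiya et al.'s data or analysis): fit a logistic psychometric function to the proportion of "afraid" responses at each morph intensity and compare the 50% thresholds. SM's curve sits to the right of the controls': she needs a more intensely fearful morph before she categorises the face as afraid.

```python
import numpy as np
from scipy.optimize import curve_fit

# Logistic psychometric function: P("afraid") as a function of
# morph intensity (0 = fully neutral, 1 = fully fearful).
def psychometric(x, threshold, slope):
    return 1.0 / (1.0 + np.exp(-slope * (x - threshold)))

# Hypothetical proportions of "afraid" responses at 11 morph levels
# (illustrative numbers only).
morph = np.linspace(0.0, 1.0, 11)
controls = np.array([0.02, 0.03, 0.08, 0.20, 0.45, 0.70, 0.88, 0.95, 0.98, 0.99, 1.00])
sm = np.array([0.01, 0.02, 0.03, 0.05, 0.10, 0.25, 0.45, 0.65, 0.85, 0.93, 0.98])

(thr_c, _), _ = curve_fit(psychometric, morph, controls, p0=[0.5, 10.0])
(thr_s, _), _ = curve_fit(psychometric, morph, sm, p0=[0.5, 10.0])

# A higher 50% threshold for SM = a more intensely fearful morph is
# needed before she explicitly labels the face as afraid.
print(f"control threshold: {thr_c:.2f}, SM threshold: {thr_s:.2f}")
```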
- Some neuroimaging studies have found that the amygdala’s response to fearful faces is strongly modulated by conscious detectability, at least when backward masking is used - Jiang (2006)
- Electrophysiological latencies recorded in the amygdala are, by and large, inconsistent with rapid visual processing (Mormann et al. 2008) and there is no direct anatomical evidence to support the rapid visual subcortical route that has been hypothesized (Adolphs 2008)
= these suggest that the amygdala modulates social judgments of fear rather than initial pre-attentive detection
- If people are shown just the eye region of faces they are very good at identifying emotion, yet SM is not
- BUT if she is instructed to look at the eyes in faces, something she fails to do spontaneously, SM becomes normal at recognising fearful faces (Adolphs et al. 2005)
= In the absence of the amygdala, explicit fear recognition may be impaired as a result of an absence of the amygdala's normal modulation of information processing (for example, directing visual attention to the eyes in faces).
The argument that the amygdala gives a rapid unconscious response to fear is also challenged by evidence that fMRI responses to fearful faces depend on awareness that the face is afraid (Pessoa et al. 2006)
- fMRI study of normal p's (could detect/were aware of a fearful face at 67 ms but not 33 ms) and 'overachievers' (could detect/were aware of a fearful face even at 33 ms)
- Found no sig. amygdala activation when p's were unaware (i.e. at 33 ms for normal p's), but some activity for overachievers at 33 ms, as they were aware
= awareness is necessary for amygdala activity to arise
The amygdala also responds to positive stimuli and emotional arousal (e.g. Ball et al., 2009)
= Perhaps the amygdala plays a broad role in emotions, and is important for detecting saliency and relevance of events
Other emotions
Dalgleish et al. (2009) review:
- Right hemisphere originally thought to be specialised for emotional processing based on patterns of responses in patients with unilateral cortical damage
- More recent is the valence hypothesis = left hemisphere for +ve emotions (approach behaviour), right hemisphere for -ve emotions (avoidance behaviour) --> evidence from EEG
Emotion-related brain regions (see table)
- PFC:
-orbitofrontal region of the PFC proposed to be involved in learning the emotional and motivational value of stimuli (Rolls, 1999).
-Specifically, PFC regions work with the amygdala to learn and represent relationships between new stimuli (secondary reinforcers) and primary reinforcers such as food, drink, and sex.
-Importantly, Rolls claims that neurons in the PFC can detect changes/reversals in the reward value of learned stimuli and change their responses accordingly.
-These ideas have been based on 30 years of electrophysiological and brain imaging studies of humans and animals and derive from the pioneering work of Mowrer in the 1950s and 1960s (Mowrer, 1960).
-Role in anger: Phineas Gage's frontal-lobe damage led to changes in behaviour, including increased aggression
- Link to Damasio (1994) somatic marker hypothesis
Insula:
- involved in the generation of a risk prediction error that guides decision-making in risky situations and can facilitate learning about uncertain rewards (Preuschoff et al. 2008)
- supports the role for bodily feedback in decision-making proposed by the somatic marker hypothesis (Damasio, 1994)
Future research:
- behavioural genetics is likely to greatly enhance our understanding of mechanisms involved in the generation/regulation of affect (Hariri et al., 2002) e.g. genetic variations in the serotonin transporter gene have already been linked to activation in the key neural circuitry for affect regulation when processing aversive stimuli.
- The combination of neuroimaging with such genetic measures seems a good avenue for more clearly accounting for individual differences in affective responding.
Multivariate analysis
- Few studies have looked at positive emotions or across arousal/valence; many look at only 2 emotions at a time, or only the 6 basic emotions
- This led to a turn to meta-analyses to compare multiple expts, e.g. Hamann (2012) contrasted 5 basic emotions: showed differences in underlying brain regions, but not as cleanly as if the emotions were completely separate; there was also overlap
-Kragel & LaBar (2016) noted that meta-analyses have demonstrated consistent increased activity in the amygdala when eliciting fear, yet argue it has also been activated by eliciting both positive and negative emotions = attribute to the limited spatiotemporal resolution of fMRI
fMRI activity is measured via voxels, which tend to be in order of 3x3x3mm cube of the brain, and when smoothed for analysis are about 10mm2= contain millions of neurons, which means its unlikely that a single voxel will demonstrate emotion-specific activation (i.e., consistently exhibit increased activation for one emotion but not for other emotions), even if specialized neurons reside within a voxel.
- Turned to new ways of characterizing emotion representations - multivoxel pattern analysis (MVPA)
-involves looking at whether patterns exist across groups of voxels (e.g. a group of 9) rather than individual ones, to see whether consistent differences exist between experimental conditions
-e.g. comparing brain activity for faces and houses across 9 voxels: a univariate analysis might show no difference, but looking across 3 trials you can see that the pattern across the voxels is consistent for faces and differs from the pattern for houses (see the sketch below)
Advantage = high sensitivity, because it incorporates more info than a single summary statistic of the most sig. activated voxel in a brain region.
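A minimal sketch of that logic (toy data, no real fMRI; any linear classifier would do, here scikit-learn's LinearSVC): each condition evokes its own weak but consistent pattern across 9 voxels, and a classifier that pools evidence over the whole group of voxels can separate the conditions even when each voxel alone is too noisy to be convincing.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Toy data: 30 face trials and 30 house trials, 9 voxels each.
# Each condition has its own weak underlying pattern across the
# 9 voxels, buried in trial-to-trial noise.
pattern_faces = rng.normal(0.0, 0.4, 9)
pattern_houses = rng.normal(0.0, 0.4, 9)
faces = pattern_faces + rng.normal(0.0, 1.0, (30, 9))
houses = pattern_houses + rng.normal(0.0, 1.0, (30, 9))

X = np.vstack([faces, houses])
y = np.array([0] * 30 + [1] * 30)  # 0 = face trial, 1 = house trial

# Cross-validated decoding accuracy: above chance (0.5) when the
# multivoxel patterns reliably differ between conditions, even if
# no single voxel shows a strong effect on its own.
scores = cross_val_score(LinearSVC(), X, y, cv=5)
print(f"decoding accuracy: {scores.mean():.2f}")
```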
Has been success when using pattern analyses:
- Wager et al. (2015) suggest emotion categories are distributed across systems rather than single brain regions
-analysed brain activity patterns from 148 studies of emotion categories using hierarchical Bayesian model (could predict with 66% accuracy across categories)
-analyses of the activity patterns in the model showed each emotion category was associated with unique patterns of activity across multiple brain systems including the cortex, thalamus, amygdala, and other structures.
= emotion categories are not contained within any one region or system, but are represented as configurations across multiple brain networks.
-consistent with categorical or dimensional theories
- Chang et al. (2015): 182 p's rated their emotional state after viewing negative and neutral scenes, followed by pattern classification analyses
-Multiple, rather than individual, regions predicted negative affect
-Network included anterior cingulate, insula, and amygdala
-consistent with dimensional theories of emotion
Specific MVPA examples:
- Saarimaki et al. (2015) tested the categorical-model claim that human basic emotions are neurally and physiologically distinct
-used fMRI and MVPA to classify brain activity for the 6 basic emotions in 3 expts where emotion was induced through movies or mental imagery
-found all basic emotions were associated with specific activation patterns within a distributed network of cortical and subcortical areas, regardless of how the emotion was induced --> the most consistent differential patterns were focused on cortical midline structures and sensorimotor regions, but also extended to areas traditionally associated with emotion processing, such as the insula and amygdala
-Similarity of subjective experiences between emotions was associated with similarity of neural patterns for the same emotions = they claim a direct link between activity in these brain regions and the subjective emotional experience
= basic emotions are supported by discrete neural signatures within several brain areas
- BUT, Clark-Polner et al. (2016) dispute these conclusions
-claim Saarimaki et al's findings do not actually meet the criteria for basic emotion theory, but are more consistent with a constructed emotion theory
Classification
- Emotion is not easy to define: a combination of physiological responses (e.g. sweaty palms), changes in behaviour, and subjective feelings
- Emotions = immediate responses to specific objects/situations that allow animals to adapt/react to events of biological/personal significance
- Moods = diffuse and long-lasting emotional states
- Categorical theory: Ekman (1992) concluded from cross-cultural study that there are 6 basic emotions: anger, disgust, fear, happiness, sadness, and surprise. Each acts as a discrete category rather than an individual emotional state, and the basic emotions are innate and universal. This differs from more complex emotions, which are learned/shaped by context
- Dimensional theories: each emotion is a point in a complex space formed of continuous dimensions, usually arousal (intensity of the emotion) and valence (relative pleasantness), e.g. anger lies towards the unpleasant end of valence and can be intense or mild depending on arousal (see the sketch below)
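A small sketch of the dimensional idea (the coordinates are rough illustrative placements on a -1 to 1 scale, not measured values): each emotion is a point in valence x arousal space, so emotions can be compared by distance.

```python
import math

# Illustrative (valence, arousal) coordinates; the placements are
# intuitions for the sake of the example, not data.
emotions = {
    "happy":   ( 0.8,  0.5),
    "content": ( 0.6, -0.4),
    "angry":   (-0.7,  0.8),
    "afraid":  (-0.6,  0.7),
    "sad":     (-0.7, -0.5),
}

def distance(a, b):
    """Euclidean distance between two emotions in valence-arousal space."""
    (v1, a1), (v2, a2) = emotions[a], emotions[b]
    return math.hypot(v1 - v2, a1 - a2)

# Anger and fear sit close together (both unpleasant and intense);
# anger and contentment sit far apart.
print(distance("angry", "afraid"))   # small
print(distance("angry", "content"))  # large
```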
Neural substrates of emotions:
- Origins in the James-Lange body theory of emotion, followed by the Cannon-Bard theory
- Damasio (1991) outlined the somatic marker hypothesis, then Panksepp (1992) coined the term "affective neuroscience" for the study of the neural mechanisms of emotion
- Followed by decade of focus on the role of the amygdala
W/ Cognition
Influence with attention:
- Attentional blink task = rapid serial presentation of stimuli (one every 100 ms) --> so quick that p's can't identify individual stimuli. If told to selectively attend to a few targets, p's are able to process and later identify them. But if a 2nd target is presented a few items after the 1st, it will often be missed: noticing and encoding the 1st target creates a temporary refractory period where attention 'blinks' (see the sketch after this list)
- Anderson (2005) used emotional and neutral words as stimuli. When the 2nd target word was arousing, the attentional blink effect was reduced: arousing words were detected more often than neutral words when presented as the 2nd target. P's with left amygdala damage failed to show this normal attenuation of the attentional blink by emotion
= in situations with limited attentional resources, emotional stimuli are more likely to reach awareness, and the amygdala plays a critical role in this facilitation.
- Anxious individuals show an attentional bias (attending to threat-related stimuli presented at the same time as neutral stimuli) on the emotional Stroop task --> Calvo & Avero (2005) reviewed studies and found an attentional bias associated with high anxiety in 58% of them, with non-significant findings in the remaining 42%
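A schematic of a single attentional-blink trial (hypothetical layout and words, purely illustrative; a real experiment would use presentation software rather than print statements): a rapid serial visual presentation (RSVP) stream with one item every 100 ms, where the 2nd target follows the 1st at a short lag inside the blink window.

```python
import random

SOA_MS = 100  # stimulus onset asynchrony: one item every 100 ms

def make_trial(distractors, t1, t2, t1_pos, lag):
    """Build an RSVP stream in which T2 appears `lag` items after T1."""
    stream = random.sample(distractors, len(distractors))  # shuffled copy
    stream[t1_pos] = t1
    stream[t1_pos + lag] = t2
    return stream

neutral_words = ["chair", "table", "cloud", "spoon", "brick",
                 "plate", "fence", "stair", "tower", "radio",
                 "panel", "wheel", "crate", "ledge", "hinge"]

# Lag 2 = T2 appears 200 ms after T1, inside the typical blink
# window, so a neutral T2 is often missed. Anderson (2005): an
# arousing T2 word (e.g. "danger") escapes the blink more often.
trial = make_trial(neutral_words, t1="TARGET1", t2="danger", t1_pos=4, lag=2)
for i, word in enumerate(trial):
    print(f"{i * SOA_MS:4d} ms: {word}")
```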
Mood congruity and mood state-dependent memory:
- Mood congruity: we more easily remember info whose valence fits our current mood state --> recall +ve material in +ve moods, -ve material in -ve moods. E.g. Direnfeld & Roberts (2006): the effect occurs regardless of whether the mood is naturally occurring or experimentally induced; Christodoulou & Burke (2016) showed the effect in 3 and 4 yr old children
- Mood state-dependent memory: memory is best when the mood at retrieval matches the mood at the time of learning. E.g. Lewis et al. (2005): mood congruence was accompanied by a boost in activation in the brain regions involved in encoding valence-specific material --> the left posterolateral orbitofrontal cortex was more active during encoding for later-remembered negative words, and more active for negative words remembered in a negative mood = matching encoding and retrieval moods activate the same brain area
- Does mood improve memory? Yang et al. (2013): +ve affect improves WM performance by boosting controlled processing, whilst Curci et al. (2013) demonstrated that WM deficits arise in response to negative affect, triggered by increases in rumination