What Is the Function of Noradrenaline in the Brain?

Because central noradrenergic pathways are so diffuse, and the synaptic effects of noradrenaline have a comparatively slow time-course, these neurons could have a wide range of functions, depending on the brain region being targeted and the neurobiological status of the individual. In general terms, however, it is agreed that noradrenergic neurons influence arousal. This encompasses not only the sleep/waking cycle (see Chapter 22) but also more specific activities, such as selective attention and vigilance (Aston-Jones et al. 1994). Indeed, depression and anxiety, both of which are relieved by drugs that modify noradrenergic transmission, can be regarded as arousal disorders. Yet, despite nearly 40 years of research, it is still uncertain whether an increase in noradrenergic transmission contributes to unpleasant emotional responses to environmental stimuli (e.g. fear and anxiety) or whether its main role is to ameliorate the emotional impact of such stimuli (i.e. contributes to 'coping').

Many electrophysiological studies have shown that single-unit activity of noradrenergic neurons in the locus coeruleus is increased by sensory stimuli. Effective stimuli range from those causing physical discomfort (e.g. footshock) and interoceptive cues (e.g. hypoglycaemia) to certain arousing environmental stimuli (e.g. the approach of the experimenter). On the basis of this evidence, it has been suggested that central noradrenergic neurons could form part of an 'alarm system'. This would be consistent with the attenuation of the neuronal response on repeated presentation of the test stimulus, the presumption being that this change underlies behavioural habituation.

The precise features of environmental stimuli that provoke increased noradrenergic transmission are unclear but recent experiments using in vivo microdialysis suggest that neither 'novelty' nor the 'aversiveness' of the stimulus alone is responsible (McQuade and Stanford 2000). Electrophysiological studies suggest that it could be the 'salience' (i.e. its significance or relevance to the individual), or change in salience, of a stimulus that is the key factor and that increased noradrenergic transmission in the brain mediates changes in selective attention.

Even if this turns out to be the case, it is likely that noradrenergic neurons in different brain regions make different contributions to this process. This complication is suggested by the results of a recent microdialysis study in which release of noradrenaline in response to the sound of a buzzer alone was provoked after repeated pairing of this normally neutral stimulus with transfer of the rats to a brightly lit novel arena. This adaptive change occurred in the frontal cortex but not the hypothalamus, suggesting that only noradrenergic neurons innervating the former brain region (i.e. those arising from the locus coeruleus) show adaptive changes in response to a change in the salience of an environmental stimulus (McQuade and Stanford 2000) (Fig. 8.11).

Figure 8.11 Noradrenaline efflux, measured by microdialysis, in the rat frontal cortex and hypothalamus. (a) Repeated exposure to a tone, alone, has no effect on noradrenaline efflux in either brain region. (b) After repeated pairing of the tone with transfer of the rat to a brightly lit (aversive) arena, the sound of the tone alone triggers a significant (*: P<0.05, cf. last basal sample) increase in noradrenaline efflux in the frontal cortex, but not the hypothalamus. (Based on a figure from McQuade and Stanford 2000)

Another concept is that noradrenergic transmission influences the emotional impact of a given stimulus, i.e. individuals' ability to 'cope'. One obvious possibility is that inadequate noradrenergic transmission explains depression, whereas moderate activity provokes attentive interest that is vital for appropriate cognitive function, and excessive noradrenergic activation culminates in anxiety or agitation. Evidence supporting this single axis for central noradrenergic function/dysfunction is discussed in Chapters 19 and 20.

It is equally possible that the role and consequences of central noradrenergic transmission depend on the type or severity of the stimulus and on individual differences in the neurobiological coding of behaviour. This would mean that the optimal behavioural response to a given environmental stimulus requires a specific increase in noradrenergic transmission. The optimal response could be determined genetically, by the individual's previous experience of that stimulus, or both. Deviation of the response in either direction (i.e. under- or overactivity) would then result in a deficit in 'coping' (Fig. 8.12(a)). However, it is also possible to envisage disruption of this neurochemical coding of behaviour in the ways illustrated in Figs 8.12(b) and 8.12(c). If the curve is shifted to either the right or the left, then the noradrenergic response that would be optimal in normal subjects now produces a suboptimal coping response. In the case of a shift to the left, a reduction in noradrenergic transmission would be required to restore optimal coping, whereas for a shift to the right, an increase would be required.

This hypothetical scheme means that there are two possible sources of mismatch that could account for an abnormal behavioural response to a given stimulus and result in an 'inability to cope'. One is that the underlying coding is correct but it is the noradrenergic response evoked by the stimulus that is inappropriate. A second is that the amplitude of the noradrenergic response to arousing stimuli is normal but the underlying coding is not.
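
The scheme in Fig. 8.12 can be illustrated with a minimal numerical sketch (this is not part of the original account: the Gaussian form of the curve, the coping() function and all parameter values below are illustrative assumptions). Coping is modelled as a bell-shaped function of noradrenergic transmission, so the two sources of mismatch correspond either to an inappropriate evoked response or to a shift of the curve itself.

```python
import math

# Hypothetical model of the scheme in Fig. 8.12 (illustrative only): coping
# efficacy is treated as a bell-shaped (Gaussian) function of noradrenergic
# transmission. The function name and all parameter values are assumptions,
# not taken from the original text.
def coping(na_transmission, optimum=1.0, width=0.3):
    """Coping efficacy (0-1) for a given level of noradrenergic transmission."""
    return math.exp(-((na_transmission - optimum) ** 2) / (2 * width ** 2))

evoked = 1.0  # noradrenergic response evoked by the stimulus (arbitrary units)

# Normal subject: the coding (the curve) and the evoked response are matched.
print(f"matched response, normal coding        : {coping(evoked):.2f}")

# Mismatch 1: the coding is normal but the evoked response is excessive.
print(f"excessive response, normal coding      : {coping(1.6):.2f}")

# Mismatch 2: the evoked response is normal but the coding curve has shifted
# to the right, as in Fig. 8.12(c); the same response is now suboptimal.
print(f"matched response, right-shifted curve  : {coping(evoked, optimum=1.6):.2f}")

# Increasing transmission towards the new optimum restores coping (the other
# remedy would be to reverse the shift itself, e.g. by chronic drug treatment).
print(f"increased response, right-shifted curve: {coping(1.6, optimum=1.6):.2f}")
```

As the printed values show, either type of mismatch degrades coping, but the appropriate remedy differs: correct the evoked response in the first case, and either adjust transmission towards the new optimum or reverse the shift itself in the second.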

Several findings support this model. For instance, an early report suggested that there is a positive correlation between the density of (postsynaptic) β-adrenoceptors in rat cortex and behavioural resistance to a mild environmental stress (novelty and frustration) but a negative correlation between these parameters when the stress is intensified (Stanford and Salmon 1992). More recently, it has been proposed that the phasic response of neurons in the locus coeruleus (which governs 'attentiveness') depends on their tonic activity (which determines arousal). Evidence suggests that the relationship between these two parameters is described by a bell-shaped curve and so an optimal phasic response is manifest only at intermediate levels of tonic activity (Rajkowski et al. 1998).

Obviously, it is extremely unlikely that noradrenergic transmission is the sole factor to determine the behavioural response to even simple environmental stimuli. Indeed, a bell-shaped dose-response curve immediately suggests the intervention of one or more additional factors (neurotransmitters?). Such interactions with other neurotransmitters could well define the relationship between noradrenergic transmission and the coding of the coping response.

Figure 8.12 Schematic diagram showing the hypothetical relationship between noradrenergic transmission and an individual's ability to 'cope' with aversive environmental stimuli. (a) Optimal coping is attained when the brain rallies a specific noradrenergic response which is determined genetically and/or by previous experience of the stimulus. Either a reduction or an increase in noradrenergic transmission produces a functional mismatch and diminishes coping. (b) The hatched area depicts the normal relationship between changes in noradrenergic transmission and coping with aversive stimuli (as illustrated in (a)). In these normal subjects, optimal coping is attained when the noradrenergic response to a specific stimulus corresponds to that marked (♦). If there is a leftward shift of the curve that describes the neurochemical coding of coping, then the (predetermined) noradrenergic response that would be optimal in normal individuals now produces suboptimal coping (#). One remedy for such a dysfunction is to reduce noradrenergic transmission so as to restore optimal coping. Similarly, in the case of a rightward shift of the coping curve (c), a predetermined noradrenergic response to a specific stimulus, that would be optimal in normal individuals, will again produce suboptimal coping (#). This time, the remedy is to increase noradrenergic transmission. In both (b) and (c) an alternative way to restore optimal coping would be to reverse the shift in the noradrenergic transmission/coping curve. This could explain the changes in mood that occur after chronic administration of drugs that cause long-latency changes in neurochemical factors that influence noradrenergic transmission (see Chapters 19 and 20)
