GENERAL AROUSAL

A. INTRODUCTION

B. SLEEP, AROUSAL AND ENVIRONMENTAL CHANGE

Brain Mechanisms of Arousal

Sleep and the EEG

Circadian Rhythms

Arousal as Reward

C. DRUGS THAT INCREASE AROUSAL

Strychnine, Picrotoxin and Pentylenetetrazol

The Xanthine Derivatives

Nicotine

Sympathomimetics

Amphetamines

Cocaine

D. DRUGS THAT DECREASE AROUSAL

Benzodiazepines and Barbiturates

Alcohol

Anticholinergic Drugs

E. DRUGS THAT CHANGE PERCEPTION

F. SUMMARY

Principles

Terms


GENERAL AROUSAL

A. INTRODUCTION

The brain/behavior/environment triangle has been discussed at several points to emphasize the mutual interactions of these three components. It may be useful now to analyze the material that has been covered, as well as the material that is about to be covered, in terms of the way they fit into this interaction. Three general themes can be identified:

B. SLEEP, AROUSAL AND ENVIRONMENTAL CHANGE

Brain Mechanisms of Arousal

One of the most important developments in the understanding of arousal was Moruzzi and Magoun's (1949) description of the reticular formation of the midbrain (it is called reticular formation because of the net-like anatomical complexity of the small neurons). Damage to this region caused continuous somnolence in the cat, whereas electrical stimulation of the area caused immediate awakening of a sleeping cat. Although the reticular formation receives inputs from the major sensory systems, it does not appear to be a part of the classical sensory projection systems. Based on their experimental observations, Moruzzi and Magoun proposed that the reticular formation was a sort of general "power supply" to determine the level of activity of the entire brain. The functional and anatomical characteristics have been combined in the descriptive term, ascending reticular activating system (ARAS; see Fig. 8.1).

The idea that a brain area was responsible for arousal set the stage for the complementary notion that a brain area could be directly responsible for putting an animal to sleep, i.e., that sleep may be an active process rather than a passive result of reduced sensory stimulation. The evidence for this emerged from a number of different experiments that involved transection of the brain at various levels. Transection at the level of the spinal cord had little or no effect on arousal, transection at a somewhat higher level resulted in permanent wakefulness (presumably because of separation from the active sleep centers), and transection at a still higher level led to permanent somnolence (presumably because of separation from centers for arousal). The overall regulation of sleep and wakefulness is complicated, but the diagram in Figure 8.2 is a reasonable shorthand version of these systems.

Sleep and the EEG

On a purely statistical basis, sleep is the most important behavior that we engage in. It normally consumes about a third of our days, and rather steadfastly resists any attempts to change significantly either its total amount or its pattern. This has led one sleep researcher (Webb, 1975) to refer to sleep as the gentle tyrant, imposing its demands on our schedules in a most willful manner. But sleep is only one part of a more pervasive tendency of the nervous system to be in oscillation: Whether we are referring to a single cell or to the entire nervous system, there is a cyclic change in activity level, specific biochemical mechanisms to accomplish this change, and an allowance for behavioral modification of the cyclic changes.

It has been known since the early 1800's that electricity was somehow involved in nervous activity (e.g., observations by Helmholtz and by Galvani), but it was not until the late 1920's that the first human electroencephalogram was recorded. Berger (1930) inserted needles just under the scalp (of his son) and was able to record rhythmic electrical waves, the frequency and amplitude of which changed with the level of arousal. This crude demonstration was the beginning of a very active field of study which attempts to use the electrical activity as a sort of mirror of mental events. The results of these studies have not lived up to the hopes (or in many cases, to the interpretations) of the investigators, with one exception: The EEG has been an indispensable tool in the investigation of sleep and related processes of arousal.

The rule of thumb is that the rhythmic fluctuations of the EEG become slower and larger as the level of arousal declines (refer to Fig. 8.3). It is not necessary for the present discussion to go into the details of the EEG, but there are several important categories that deserve mention. When the eyes are closed and the subject relaxes (without visual imagery), the normally aroused EEG slows to about 10 Hz with an increase in amplitude. This is the so-called alpha wave. The alpha state has been touted as a highly desirable state of meditation which can be monitored and fostered by commercial devices costing up to several hundred dollars (alternatively, one can stop looking at the catalogue, close the eyes, and achieve about the same state). If this relaxation continues, the wave slows even more, and the subject enters a rather nebulous state between sleep and wakefulness. This corresponds roughly to the theta wave. The theta state has also been viewed as a desirable state for the creative processes, although the creations are frequently forgotten. Beyond this stage, brief bursts of activity called sleep spindles appear and the EEG continues to become slower and larger until large, sweeping delta waves accompany the behavioral state of deep sleep.
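
For readers who find it helpful to see the rule of thumb stated quantitatively, the following minimal sketch maps a dominant EEG frequency onto the conventional band names mentioned above. The specific cutoff frequencies are standard conventions assumed here for illustration; the text itself specifies only that the alpha rhythm is near 10 Hz.

    # Illustrative only: band boundaries are conventional values, not figures from this chapter.
    def eeg_band(frequency_hz):
        """Return the conventional EEG band name for a dominant frequency (Hz)."""
        if frequency_hz < 4:
            return "delta (deep sleep; large, slow waves)"
        elif frequency_hz < 8:
            return "theta (drowsiness; the nebulous state between sleep and waking)"
        elif frequency_hz < 13:
            return "alpha (relaxed, eyes closed; about 10 Hz)"
        else:
            return "beta (alert, aroused; fast, low-amplitude waves)"

    for f in (2, 6, 10, 20):
        print(f, "Hz ->", eeg_band(f))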

There is one major exception to the relationship between level of arousal and EEG pattern. Kleitman and one of his students observed periods of apparent deep sleep (arousal was difficult to obtain) during which time the EEG pattern was rapid and small. This so-called paradoxical EEG was accompanied by rapid eye movement (REM) and appears to be related to periods of dreaming (cf., Dement & Kleitman, 1957; Kleitman, 1963).

Most will agree that sleep is a pleasant enough pastime, but it is neither a luxury nor an option. Attempts to eliminate or reduce the amount of sleep are accompanied by compelling urges to sleep. If these urges are fought, irritability ensues and performance becomes impaired. Webb (e.g., 1975) has observed that sleep-deprived desert soldiers neglect to keep their canteens filled with water, and nurses fail to make optional rounds to check patients. Eventually, frank psychotic behavior may result, but heroic efforts are usually required to allow sleep deprivation to reach this degree. Webb has remarked that "Sleep is a fixed biological gift and we had better learn to adjust to its requirements rather than try to make it serve our paltry demands." (p. 28 in Goleman, 1982.)

The phenomenon of sleep serves as sort of a caricature for two important observations: (a) cyclic levels of activity appear to be a recurrent theme of brain function, and (b) these differing levels of activity provide different ways of processing environmental information. We turn now to the effects of varying degrees of arousal within the waking state before examining several classes of drugs that have been used to alter, in one way or another, these levels of activity.



The pharmacology of sleep and arousal remains poorly understood. Jouvet (e.g., 1974) has been the champion of the serotonergic theory of sleep, marshaling considerable evidence that serotonin produced in the raphe nucleus is responsible for the induction of sleep. In general, drugs that inhibit serotonin cause insomnia while the administration of serotonin causes sleep. More recently, sleep onset has been related to benzodiazepine receptors (cf., Chapter 4; Mendelson et al, 1983), but this does not preclude the possibility that these receptors are on cells that release serotonin as the neurotransmitter. On the arousal side, there are at least three candidates for neurotransmitters: dopamine, norepinephrine and acetylcholine. Each of these is released at a higher rate during periods of arousal, and drugs that block the effects of these compounds can produce drowsiness. The general features of these systems are summarized in Figure 8.4. Specific examples will be given later in the chapter.

All of these local changes in sleep patterns and levels of arousal occur against a more global backdrop of circadian (24-hr) rhythms. There is a powerful rhythmicity for virtually all organisms and systems within organisms (e.g., Moore-Ede et al, 1982). Although these rhythms are synchronized with and, in many cases, adjusted by the light/dark cycle, most of these rhythms continue in the absence of normal 24-hr cues. In a dramatic demonstration of this, sleep researcher Nathaniel Kleitman and several associates went deep into a cave where they attempted to adapt to an arbitrary 19.5-hr day. None of the individuals could do this; they each showed a so-called "free-running" rhythm that was somewhat greater than 24 hrs, drifting forward with respect to the "real" time set by the sun up at the earth's surface (Kleitman, 1963). The brain area that is responsible for setting many of these circadian events is the suprachiasmatic nucleus, located in the hypothalamus. This nucleus receives information via an accessory optic system (and perhaps through other sensory channels) to adjust and maintain the accuracy of its inherent 24-hr rhythmicity. Injection of radioactive 2-deoxy glucose (2-DG) into brain areas can provide a graphic illustration of the rate of metabolism for a particular area. When this procedure is used to study the suprachiasmatic nucleus, it shows a marked circadian rhythm of activity. This corresponds with fluctuating levels of melatonin that are produced and released by the pineal gland (see Figure 8.5).
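
A short numerical sketch makes the idea of a "free-running" rhythm concrete. The period used below (24.5 hr) is an assumption chosen only for illustration; the text says only that the free-running period was somewhat greater than 24 hours.

    # Toy illustration of a free-running circadian rhythm drifting against solar time.
    # The 24.5-hr period is an assumed value for illustration, not a figure from the text.
    free_running_period_hr = 24.5
    solar_day_hr = 24.0

    drift_per_day = free_running_period_hr - solar_day_hr  # hours of drift accumulated each day
    for day in (1, 7, 14, 28):
        total_drift = day * drift_per_day
        print(f"After {day:2d} days, the internal 'bedtime' has drifted {total_drift:.1f} hr later.")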

Circadian Rhythms

The circadian rhythm is interesting in its own right, but it is especially relevant in the present context because it provides a constantly changing environment in which drugs must act. Changes in hormone levels, body temperature, rate of metabolism, heart rate, blood pressure, gastrointestinal activity, sleep cycles and behavioral activity levels all change markedly on a 24-hr schedule. A drug that interacts with any of these (and how could one not!) will have differing effects for a given dosage, depending upon the time of day. This can be shown dramatically in the case of anesthetic drugs, which are effective in much lower dosages during the rats' normal (daytime) quiet periods (Davis, 1962; see Figure 8.6). Higher doses are required during the active periods, increasing the risk of overdose (Pauly & Scheving, 1964). Conversely, drugs that specifically alter any of these physiological systems have the potential to disrupt the normal rhythmicity and cause secondary problems.

Arousal as Reward

The level of arousal on a more local or moment to moment basis has been linked closely to the phenomena of motivation and reward. One of the earliest and most influential statements about this relationship is the Yerkes-Dodson Law (1908), which states that an organism interacts most efficiently with its environment when stimulation is at some intermediate level; below this optimum the organism misses essential features of the environment, while above it the organism responds in an exaggerated fashion to all elements and performance declines. As shown in Figure 8.7, this relationship between arousal and performance can be characterized as an inverted U-shaped curve.
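
The inverted U is easy to caricature numerically. The sketch below uses a Gaussian shape purely as an illustrative assumption; the law itself specifies only that performance peaks at an intermediate level of arousal, not any particular equation or parameter values.

    import math

    # Toy inverted-U relating arousal to performance (cf. Fig. 8.7).
    # The Gaussian form, the optimum, and the width are assumptions for illustration only.
    def performance(arousal, optimum=5.0, width=2.0):
        """Performance is maximal at the optimum arousal level and falls off on either side."""
        return math.exp(-((arousal - optimum) ** 2) / (2 * width ** 2))

    for a in range(0, 11, 2):
        bar = "#" * int(40 * performance(a))
        print(f"arousal {a:2d}: {bar}")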

The importance of this formulation is not just that behavior changes when the level of arousal changes, but that behavior is also an important way to achieve a change in arousal. Butler (1958), for example, found that monkeys would press a lever in order to open a small window that would allow them to see the laboratory or hear the sounds of the monkey colony. Similarly, Tapp (1969) showed that rats will press a lever to turn on a light for a brief period. These exploratory behaviors, as well as several different types of locomotor activity (e.g., running wheel, tilt cages, jiggle cages, open field, etc.) are all behaviors that cannot be explained by the traditional motivators of hunger, thirst, or reproduction. Rather, these behaviors appear to be reinforced simply by the feedback from the responses. Interestingly, a rat will run considerably more if it has a food pellet in the running wheel, not because it wants to eat the pellet, but because the pellet makes more noise! A marble works just as well.

The idea that a rat or monkey would perform a response out of "curiosity" or an "exploratory drive", with the result being nothing more than a change in arousal level, was a bold proposal. It flew in the face of the very thorough and formal drive theory of Hull (1952) and Spence (1956). But even traditional drive theory assumes an important role for arousal. The drive state is said to have two effects: (a) an energizing effect that increases the general behavior of the organism, and (b) a directing effect that channels the behavior toward relevant goal objects. When reinforcement is obtained, the drive and its resulting energy are reduced. When extinction or nonreinforcement is encountered, the energy is intensified and the behavior is less channeled.

In summary, the brain systems that control arousal are subject to the same controls and interactions that we have seen for other behaviors (cf., Fig. 8.8). The environment can adjust or change the level of arousal, which results in changes in behavior. But changes in behavior also produce changes in the level of arousal, and behaviors may occur for the purpose of changing arousal levels. Furthermore, the behavior can actually change the environment, or at least move the organism to a different part of the environment. Finally, drugs can rather directly enter into this scheme by changing the level of arousal. It should not be surprising, therefore, that these drugs have assumed considerable importance both in the practice of medicine and as a part of human cultures. Because of their action on the very core of behavior, these drugs are often overused or abused, having social consequences that may outweigh their effects on an individual. We turn now to a discussion of some of the pharmacological and behavioral effects of these drugs.

C. DRUGS THAT INCREASE AROUSAL

Strychnine, Picrotoxin and Pentylenetetrazol

The complexities of behavioral arousal (e.g., involvement of several different neurotransmitters, interaction with the reward system, etc.) provide many different points at which drugs may influence the system. The three drugs that are considered in this section are representative of a class of drugs that have rather general excitatory effects through their actions on inhibitory neurons or directly on the properties of the action potential (see Franz, 1980 for a detailed treatment).

Strychnine is an extremely potent drug that causes general excitation of the central nervous system, especially the reflexes of the spinal cord. This bitter tasting substance is derived from an extract of a tree that is native to India. It has a long history of medicinal use, as a general tonic, to increase the appetite, to cure constipation, and as a general stimulant. Commercial forms of the substance were once widely available as so-called "bitters" and can still be obtained in some over-the-counter preparations. There is virtually no evidence that strychnine has any general curative properties, and it is a very dangerous drug to use as a stimulant.

Strychnine has also been used (probably inappropriately) in more traditional medicine as an antidote for the respiratory and cardiovascular depression that occurs with barbiturate or anesthetic poisoning. Again, there is probably no rational basis for this, since the addition of the second drug only complicates the already challenged physiology of the patient. It is usually more advisable to use mechanical means of supporting the respiratory and cardiovascular deficiencies of the patient.

Currently, strychnine is used in some street formulations to add a stimulating effect to a variety of different drugs. It is especially dangerous in this informal context, because of potentially lethal overdoses or interactions with other drugs.

Strychnine has played an important role in research as a tool to help unravel some of the mechanisms of brain and spinal cord circuits. High dosages of strychnine produce an exaggeration of spinal reflexes that can result in tonic seizures with the limbs rigidly extended (an interesting exception to this is the sloth, which shows extreme flexion owing to the reversed organization of its anti-gravity muscles). The most likely explanation is that strychnine produces this excitation by blocking normal inhibitory influences (see Figure 8.9).

The Renshaw cell, an interneuron of the spinal cord, has long been of interest because of its receptors. It was known that the alpha motor neurons release acetylcholine at the nerve-muscle junction, and that recurrent collaterals of these same motor axons synapse upon the Renshaw cell. Because of Dale's law (which may still be in effect in this instance) that any neuron manufactures and releases only one neurotransmitter, the Renshaw cell was the first neuron within the central nervous system that was known to be cholinoceptive. The transmitter substance released by the Renshaw cell was not so easily determined, but electrophysiological studies revealed that it had an inhibitory effect on the alpha motor neuron. More recent investigations indicate that glycine is the transmitter substance, and both glycine and strychnine bind to the same receptor sites on the alpha motor neuron. Thus, it would appear that strychnine stimulates activity in spinal reflexes by blockade of the recurrent inhibition produced by the Renshaw cells. Tetanus toxin, which causes similar convulsive activity, does so by blocking the release of the transmitter, rather than blocking the receptor sites.

Another naturally occurring substance, picrotoxin, is derived from the seeds of the fishberry shrub (so named because the berries were fed to fish so they would float to the surface). This drug also stimulates nervous system activity by blocking inhibition, but apparently through different mechanisms than strychnine. It appears to block the receptors that normally mediate the inhibitory effects of GABA (gamma-aminobutyric acid; see Figure 8.10). In fact, most of the evidence for GABA as a central neurotransmitter is based upon the experimental effects of picrotoxin.

A third compound that has been widely used for research purposes is a synthetic drug called pentylenetetrazol (Metrazol). It appears that this drug does not interfere with any particular transmitter, but rather reduces the recovery time following action potentials. As indicated in Figure 8.11, this is probably accomplished by reducing the permeability to potassium, leaving the cell in a state of partial depolarization. Pentylenetetrazol has been used as a seizure inducing drug in the process of screening drugs that may have anticonvulsant activities.
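
The link between potassium permeability and partial depolarization can be made explicit with the Goldman-Hodgkin-Katz equation, which is standard membrane physiology rather than anything specific to pentylenetetrazol. The ionic concentrations and permeability ratios below are typical textbook values, assumed only for illustration.

    import math

    # Goldman-Hodgkin-Katz equation (K+ and Na+ terms only) with typical mammalian
    # concentrations in mM. All numerical values are standard textbook assumptions,
    # not measurements relating to pentylenetetrazol.
    K_out, K_in = 5.0, 140.0
    Na_out, Na_in = 145.0, 15.0
    RT_over_F = 26.7  # mV at body temperature

    def membrane_potential(p_na_over_p_k):
        """Membrane potential (mV) for a given Na+/K+ permeability ratio."""
        num = K_out + p_na_over_p_k * Na_out
        den = K_in + p_na_over_p_k * Na_in
        return RT_over_F * math.log(num / den)

    # A relative reduction in K+ permeability is expressed here as a higher Na+/K+ ratio.
    print("normal K+ permeability :", round(membrane_potential(0.05)), "mV")
    print("reduced K+ permeability:", round(membrane_potential(0.20)), "mV (partly depolarized)")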

The Xanthine Derivatives

The xanthine derivatives are by far the most widely used stimulants, and appear to be safe in moderate dosages (see Rall, 1980 for a review). The most common and most potent of these is caffeine, with theophylline being somewhat less effective, and theobromine being considerably less effective. All three substances are present to varying degrees in coffee, tea, cocoa and cola (see Table 8.1):

TABLE 8-1

Annual Consumption of Xanthine Derivatives      Relative Caffeine Content

Coffee   10 pounds                              1 cup coffee  =  2 cups tea
Cocoa     3 pounds                              1 cup coffee  =  24 oz cola
Tea       1 pound                               1 cup coffee  =  20 cups cocoa
                                                1 cup coffee  =  5 oz chocolate

(Cocoa contains much more theobromine.)

As in the case of pentylenetetrazol, these compounds have not been linked to a specific transmitter. They appear to cause an increase in calcium permeability, and may increase cyclic AMP production (see Fig. 8.12). These actions stimulate a wide range of physiological systems and produce changes in mood. The major psychological effects include a decrease in fatigue and drowsiness, an increase in speed and efficiency, and a decrease in the number of errors (especially in an over-learned task such as typing). They increase the ability to do muscular work, including an increase in the action of the heart muscle. Peripheral vasodilation increases the perfusion of organs (including the diuretic effect on the kidneys), except for the brain, which shows a decrease in blood flow. This latter effect may account for the fact that coffee is effective in relieving hypertensive headaches for some individuals. The increased gastric acid secretion can be a liability, especially for those who may be prone to the development of gastric ulcers, but this effect appears to be blocked completely by cimetidine (see Chapter 6). In general, these compounds provide safe stimulating effects that do not appear to lead to serious abuse.

Nicotine

Nicotine is a powerful stimulant that is most commonly administered by smoking tobacco. It was in widespread use throughout the Americas when the first European explorers arrived. Tobacco use (chewing, smoking, and snuffing) has had many stormy periods in terms of cultural and legal acceptance (and is in another right now), but for the most part, the use of tobacco has prevailed (cf., Ray, 1978). In 1978, more than a decade after the Surgeon General's cancer warning, about 4,000 cigarettes were sold for every person over 18 in the United States. The health hazards are legion, but for the present purposes we shall consider only those that relate to the direct neuropharmacological effects and ignore the potentially more threatening effects of the associated tars and additives.

Nicotine produces a bewildering array of influences, most of which occur by virtue of its ability to mimic acetylcholine at certain receptor sites (e.g., Taylor, 1980). In fact, the categories of acetylcholine receptors (muscarinic and nicotinic) are defined, in part, by their responsiveness to nicotine (cf., Chapter 1). In the periphery, nicotine acts on the autonomic ganglia. This action is complicated not only because it acts on both sympathetic and parasympathetic ganglia, but also because of the biphasic action of the compound. At low dosages or during the initial stages of higher dosages, the effect is one of stimulation, and expected changes in autonomic effector cells can be observed. But unlike acetylcholine, nicotine is not rapidly inactivated by acetylcholinesterase, and its long-lasting effects on the receptors lead to a prolonged depolarization that blocks further transmission. This paralysis of ganglionic activity occurs not only while the cells are depolarized, but appears to continue for some time after normal polarization has been restored. At moderate dosages, some ganglia may be more affected than others, so heart rate (for example) could be increased either by stimulation of sympathetic ganglia or by paralysis of the parasympathetic inhibitory effects. Contrariwise, heart rate could be decreased by relative stimulation of the parasympathetic or paralysis of the sympathetic ganglia. Some of these increases may be dangerous, because the oxygen demands of the heart muscle may be increased while the oxygen supply remains the same. This could trigger cardiac failure in certain individuals. Figure 8.13 summarizes the synaptic effects that mediate these physiological changes.

Nicotine also acts on skeletal muscle receptors, but the initial stimulation phase is either very short lived or nonexistent. The overall effect is, therefore, a relaxation of these muscles at low or moderate dosages and paralysis at higher dosages.

Nicotine also influences neurons throughout the central nervous system, although the distribution of these remains rather ill-defined. The drug appears to produce its central effects through both a direct action on cholinoceptive cells, and indirectly through stimulation of dopaminergic fibers. The administration of nicotine produces a rapid arousal of the EEG and an increase in the release of norepinephrine and dopamine. Although the implications are not clear, it also increases the release of growth hormone, antidiuretic hormone, and cortisol. Acute nicotinic poisoning can lead to nausea, vomiting, diarrhea, and ultimately respiratory and cardiovascular collapse. In lower dosages, the untoward gastrointestinal symptoms rather quickly disappear through tolerance, but the EEG arousal effects continue. Although the mechanisms are not yet known, it appears that the chronic administration of nicotine can lead to enzyme induction which facilitates not only the metabolism of nicotine, but other apparently unrelated stimulants such as the xanthine derivatives and various drugs that act on the catecholamine systems.

Sympathomimetics

The sympathomimetic compounds mimic or otherwise increase the activity of neurotransmitters associated with the sympathetic nervous system, namely, norepinephrine and dopamine (cf., Weiner, 1980b). The most widely used drugs within this class include the amphetamines and cocaine. Although the mechanisms of action of these drugs differ, they act upon the same neural substrates, and their effects upon mood and other behaviors are remarkably similar. These drugs are powerful stimulants of the central nervous system, and in moderate dosages, they produce EEG and behavioral arousal, decreased fatigue and boredom, increased psychomotor performance, decreased appetite, and elevations in mood that are frequently described as euphoria. At higher dosages they produce a variety of motor symptoms (twitching, restlessness, stereotyped repetition), perceptual symptoms (distortion of time, tactile hallucinations or the so-called "cocaine bugs"), mood distortions (fear, paranoia, psychotic symptoms), and the possibility of convulsions and death.

Amphetamines

The amphetamines have a variety of different effects on neurons that release catecholamines, including the ability to directly mimic the neurotransmitters at the receptor site. Their major action, however, appears to be the indirect release of newly synthesized dopamine (and perhaps norepinephrine); if the enzyme of synthesis, tyrosine hydroxylase, is inhibited, the effect of amphetamine is greatly reduced (see Fig. 8.14).

The long term effects of the amphetamines may include some serious dangers to the motor system. The symptoms include an increase in the startle response, twitching, and related dyskinesias that may be due to a reduction of dopamine in the caudate. The most likely cause of this decrease is a chronic decline in tyrosine hydroxylase activity, which is probably attributable to erroneous feedback from the increased transmitter release.

Cocaine

Cocaine is present in the leaves of a shrub that grows high (so to speak) in the Andean mountains of South America. The natives have chewed or sucked on these leaves for centuries (averaging as much as four or five kilograms of leaves per year) for the elevation of mood that is produced. It also produces a numbing sensation because of its local anesthetic actions, but was not used clinically as a local anesthetic until Sigmund Freud made this suggestion. In the early 1900's synthetic substitutes (e.g., procaine, lidocaine [Xylocaine]) began to be produced, but none is as effective as cocaine in blocking pain, although all appear to have some euphoria producing effects. Cocaine is still widely used and abused as a street drug, and also continues to be used clinically because of its unparalleled strength and duration of local anesthesia for eye, nose and throat surgery.

Cocaine acts on the same neuronal systems as amphetamine, but enhances the effects of catecholamines (primarily dopamine) by blocking reuptake (see Fig. 8.15) rather than stimulating release (Ritz et al, 1987).
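
A toy steady-state model makes the reuptake argument concrete. Treat release as a constant input R and reuptake as a first-order removal process with rate constant k; the numbers below are arbitrary illustrative assumptions, not measured values for any real synapse.

    # Toy model: synaptic transmitter with constant release R and first-order reuptake (rate k).
    # At steady state, d[DA]/dt = R - k*[DA] = 0, so the steady-state level is R / k.
    # All numbers are arbitrary illustrative assumptions.
    R = 1.0           # release rate (arbitrary units per ms)
    k_normal = 0.5    # reuptake rate constant (per ms)
    k_blocked = 0.1   # reuptake partially blocked (e.g., by a reuptake inhibitor)

    for label, k in (("normal reuptake", k_normal), ("reuptake blocked", k_blocked)):
        print(f"{label:17s}: steady-state transmitter level = {R / k:.1f} (arbitrary units)")

The point of the sketch is simply that slowing removal raises the amount of transmitter left in the synapse, which is the sense in which reuptake blockade "enhances" catecholamine effects.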

The local anesthetic properties of cocaine and the synthetic derivatives appear to be the result of direct actions on the cell membrane. These changes block the transient change in sodium permeability that is necessary for the propagation of the action potential. This action continues as long as the drug is in contact with the cell, so most preparations include a solution of epinephrine and norepinephrine to produce vasoconstriction that prevents the dispersion of the drug. One of the reasons that cocaine is so effective is that it serves as its own vasoconstrictor through its action on local sympathetic terminals.

The powerful behavioral effects of cocaine and the amphetamines are probably due to their effects on two populations of brain cells. One effect is to increase the general level of arousal by stimulating the catecholamine containing neurons that are involved in this system. The other is to stimulate the catecholamine containing neurons that are involved in rewarded behavior. Together, these effects are very potent: The organism is more responsive to the environment, and the rewarding effects of that environment are amplified.

D. DRUGS THAT DECREASE AROUSAL

Benzodiazepines and Barbiturates

The hypnotic and sedative drugs were discussed in Chapter 4 in terms of their rather specific abilities to counter fear and anxiety. They have also been used extensively in the clinic for their related abilities to promote sleep and, in high dosages, to produce varying degrees of anesthesia. The general effects of these drugs include a shift toward slower EEG frequencies (indicating lowered arousal), a decline in psychomotor performance, and reduced perceptual abilities. Although they promote sleep, there is frequently a reduction in the important REM stage of sleep.

Both the benzodiazepines and the barbiturates reduce the activity of excitable tissues, but their effects are much more pronounced on the central nervous system. Their action is widespread, but appears to selectively influence polysynaptic pathways that involve small fibers (most notably, cortical functions and the reticular formation). As indicated in Figure 8.16, these drugs apparently lower arousal by virtue of their interaction with neurons that release the inhibitory transmitter, GABA. This neurotransmitter is apparently involved with both presynaptic and postsynaptic inhibition of a variety of different transmitters, but certainly includes those transmitters that are involved with arousal systems. (Note that these effects are directly opposite those produced by picrotoxin, cf., Figure 8.10).

In discussing the effects of the benzodiazepines, it was noted that these compounds bind rather specifically to receptor sites in the brain that do not appear to be involved with any known neurotransmitters. The conclusion was that there may be endogenous benzodiazepines which are involved in counteracting anxiety. Extending this model into the present context, it might be proposed that the GABA releasing neurons have receptors that are specific for the benzodiazepines. Further support for this indirect action is the observation that depletion of GABA prevents the sedative effects of the benzodiazepines.

Figure 8.16 summarizes the current model of the GABA receptor complex. This receptor complex appears to have three separate, but interacting receptor sites: a sedative/hypnotic site, a benzodiazepine site, and a GABA site. The inhibitory effects of the GABA neurotransmitter appear to be mediated by the opening of the chloride (Cl-) channel. This effect is augmented by the presence of either barbiturates or benzodiazepines, and blocked by convulsants.

One of the difficulties with the model shown in Figure 8.16 is that the benzodiazepines and barbiturates have been classified as different types of drugs on the basis of their differing clinical effects. In particular, the benzodiazepines are more effective in the reduction of anxiety, while the barbiturates are much more effective as general anesthetics. One possibility is that these compounds share the ability to enhance the activity of GABA, but that the barbiturates are less specific in this regard and have additional effects as well. In particular, there has been evidence that the barbiturates may block the reuptake of GABA and may be specifically involved in blocking the activity of certain synapses that utilize norepinephrine or acetylcholine. The barbiturates seem to produce a slower recovery time of neurons, which is of little importance in a single synapse, but produces substantial impairment in pathways that involve multiple synapses.

Both the barbiturates and the benzodiazepines have been used to treat sleep disorders, but with mixed results. These drugs make it easier to fall asleep, and may increase the total time spent sleeping, but in many cases there is a reduction in the amount of REM sleep. As a result, the sleep is less effective than normal and the patient becomes more sleep deprived. Dement (1974) and other sleep researchers have cautioned against the use of so-called sleeping pills, because many of them are more likely to cause insomnia than to cure it!

There is some evidence that the benzodiazepines may be useful in preventing the disruptive effects of changing one's circadian rhythms (e.g., with shift work). Seidel and associates (1984) imposed an abrupt 12-hr shift in the sleeping schedule of volunteers, delaying their bedtime from midnight until noon. Over the course of the next three days, untreated subjects experienced a loss of sleep and impaired function during the waking hours. Subjects treated with triazolam, a fast acting benzodiazepine, did not show these disruptive effects.

Alcohol

Alcohol is one of the most ancient of drugs, with references to its use appearing in some of the earliest recorded histories (cf., Ritchie, 1980). Alcohol use and abuse probably even preceded the appearance of the human species, since a variety of animals (e.g., birds, bees, wild pigs, and even elephants) are known to partake of naturally fermenting fruits. The fermentation process can produce concentrations of ethyl alcohol in the range of 12 to 14 percent, at which point the reaction is self limiting because the alcohol kills the yeast that supports the fermentation process. This limits the concentration of naturally fermented wines and beers (beers are mostly about 4% in the United States). However, man was quick to increase the potential of this drug, and the Arabs invented the distillation process some 1200 years ago to extract higher concentrations of alcohol. This alcohol can be produced from a variety of different sources, including fruits, grains, and even potatoes. Almost every known culture has contributed to the science and the business of producing alcohol, and we now have a staggering array of "preparations" of this compound from which to choose.

The mechanism of action of alcohol remained a mystery for many years despite the intense research efforts that were aimed toward a better understanding of this drug. It is certainly a local irritant, which can lead to the inflammation of tissues, especially the membranes. In sufficiently high concentrations, it can even serve to coagulate protoplasm and kill the cells. These effects on cell membranes can reduce the ability of peripheral nerves to conduct impulses (by decreasing the permeability to both sodium and potassium), giving alcohol some local anesthetic properties.

All of the effects noted above occur at concentrations that are many times greater than those ever reached in the bloodstream. As in the case of the benzodiazepines and barbiturates, alcohol appears to selectively influence polysynaptic pathways, in part at least, through the facilitation of GABA. Although the details of the interaction are not yet fully known, alcohol can be very dangerous when taken in combination with a sedative compound such as Librium or Valium. Normally safe dosages of each compound can combine synergistically to produce coma or death.

The behavioral effects of alcohol are comparable in many respects to those produced by the benzodiazepines and barbiturates. As shown in Figure 8.17, there is a selective depression of the reticular activating system and a corresponding increase in EEG slow wave activity. Inhibitory processes decline first and with smaller dosages, resulting in an exaggeration of spinal reflexes and the appearance of behaviors (e.g., talkativeness, boisterousness, aggressiveness) that are normally under the influence of social inhibition. Hence, the mistaken notion that alcohol is a stimulant. These effects precede or are accompanied by a marked decline in perceptual abilities (especially pain) and psychomotor functions (especially previously trained responses). There is virtually no evidence that alcohol can enhance motor or cognitive abilities beyond normal, except in those cases where some aspect of the behavior is inhibited (e.g., it would probably greatly enhance the ability to swim in the nude at one's in-laws). At very high dosages, alcohol has general anesthetic effects, but it is not medically useful in this regard because the anesthetic dosage is very close to the lethal dosage. In this regard, it should be pointed out that alcohol is a dangerous drug strictly on the basis of its therapeutic ratio. Suppose, for example, that one considers two or three drinks to be the effective dose for the "desired" effects of alcohol. A dosage of 12 to 15 drinks represents a dangerous overdose that can lead to coma or death. This yields a therapeutic index (LD/ED) in the range of about 6, which is much too low to be considered safe. It is for this reason that hazing rituals, drinking contests, and so forth so frequently result in tragic death.
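
The therapeutic-ratio arithmetic can be spelled out directly. The drink counts below are the ones quoted in the paragraph above, used here only to show where the figure of roughly 6 comes from.

    # Rough therapeutic index for alcohol using the drink counts quoted in the text.
    effective_dose_drinks = (2, 3)     # the "desired" effect
    lethal_range_drinks = (12, 15)     # a dangerous overdose

    low = lethal_range_drinks[0] / effective_dose_drinks[1]   # 12 / 3 = 4
    high = lethal_range_drinks[1] / effective_dose_drinks[0]  # 15 / 2 = 7.5
    print(f"Therapeutic index (LD/ED) is roughly {low:.0f} to {high:.1f}, i.e., on the order of 6.")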

The behavioral effects of alcohol are accompanied by a variety of physiological changes which, interestingly, fall into a pattern that is very much like that of a general stress response. The local irritating effects on the oral membranes, gastrointestinal tract, and somatic muscles trigger histaminic reactions. There is an increase in the release of ACTH and adrenal hormones. Lactic acid and fatty acids are released into the bloodstream, the heart rate increases, and peripheral vasodilation occurs. There is a decrease in antidiuretic hormone, which increases urine outflow and can result in the depletion of calcium, magnesium and zinc. These responses form an important part of the general abuse syndrome, which will be discussed later in this chapter.

The behavioral and physiological effects of alcohol are closely related to the concentrations of the drug that appear in the bloodstream. This is true for nearly all drugs (cf., Chapter 3), but has assumed greater significance in the case of alcohol because the plasma level has become almost synonymous (legally synonymous in many states) with the level of intoxication. There is a germ of truth in some of the folklore concerning the effects of alcohol. Absorption is slower in the stomach than in the intestine, so anything that helps to retard the progress of alcohol from the stomach into the small intestine will also retard the climb in blood alcohol levels. Fatty foods, milk, and meat all cause reflexive closing of the stomach valves to allow greater digestion at this stage, and indirectly result in slower absorption of the alcohol. Meanwhile, the liver enzymes are breaking down the alcohol as it enters the bloodstream, so the overall effect of a particular dosage of alcohol will be prolonged, but the peak effect will be lower. Carbonated beverages enhance the absorption process, highly concentrated drinks slow down absorption, and emotional changes can either increase or decrease absorption. Finally, the effectiveness of a given blood level of alcohol is greater on the ascending side of the curve than on the descending (especially when the rate of ascent is rapid), presumably because some of the cells of the nervous system become somewhat refractory to the alcoholic environment and resume some of their normal functions before the blood alcohol concentrations begin to decline.
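
The claim that slowing absorption lowers the peak blood level can be illustrated with a very simple sketch: first-order absorption from the gut and roughly constant-rate (zero-order) elimination, which is the usual approximation for alcohol. The dose and rate constants below are arbitrary illustrative assumptions, and the sketch captures only the "lower peak" half of the claim.

    # Toy blood-alcohol sketch: first-order absorption from the gut into the blood and
    # constant-rate (zero-order) elimination. Dose and rate constants are arbitrary
    # illustrative assumptions; the point is only that slower absorption yields a
    # lower peak level for the same total dose.
    def peak_level(absorption_rate_per_min, dose=1.0, elimination_per_min=0.002):
        gut, blood, peak = dose, 0.0, 0.0
        for _ in range(12 * 60):                      # simulate 12 hours in 1-min steps
            absorbed = absorption_rate_per_min * gut  # first-order transfer gut -> blood
            gut -= absorbed
            blood = max(0.0, blood + absorbed - elimination_per_min)
            peak = max(peak, blood)
        return peak

    print("fast absorption (empty stomach):   peak =", round(peak_level(0.05), 2))
    print("slow absorption (food in stomach): peak =", round(peak_level(0.01), 2))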

Regardless of these influences on absorption rates and other aspects of blood alcohol levels, the alcohol is ultimately metabolized to produce energy. The first stage of this reaction, catalyzed by the enzyme alcohol dehydrogenase, converts the ethyl alcohol into acetaldehyde. The acetaldehyde is poisonous, but the presence of the enzyme acetaldehyde dehydrogenase normally results in the quick conversion of this compound into acetic acid. The drug known as Antabuse (disulfiram) interferes with this enzyme and allows the acetaldehyde levels to build up and cause illness following alcohol ingestion. Genetic variations in this enzyme system may contribute to individual differences in the tendency to consume alcohol (cf., Horowitz & Whitney, 1975).

Anticholinergic Drugs

The involvement of acetylcholine in arousal suggests that the anticholinergic compounds might provide a powerful method of lowering arousal levels. The cholinergic blocking drugs such as atropine and scopolamine compete with acetylcholine at muscarinic synapses throughout the nervous system and especially in certain parts of the limbic system (see Fig. 8.18). When these compounds were discussed for their potential antianxiety effects (cf., Chapter 4), it was noted that one of the major drawbacks was that the compounds were too broad in their spectrum of action. Their action within the parasympathetic system produces such undesirable side effects as dry mouth, pupil dilation with blurred vision, rapid heart beat with palpitations, and others. These same disadvantages apply to the potential use of these drugs for their hypnotic or sedative effects. It is interesting, in this regard, that these compounds seem to have only modest potential for abuse, despite their very real effects on the central nervous system.

Although atropine and scopolamine are not used routinely for their sedative effects (except as a presurgical treatment), they have been important as research tools. The effects of cholinergic blockade have been used as an example of those uncommon situations in which the EEG seems to be dissociated from behavior. Rinaldi and Himwich (1955) reported that atropine produces a slow-wave EEG that is typical of sleep, but that the animal remains behaviorally awake. A more accurate portrayal of this paradox might be that the animals show a slow-wave EEG while not being behaviorally asleep--their state of wakefulness is somewhat questionable. High dosages of cholinergic blocking agents greatly reduce the activity of rats in their home cage, and although they appear to be awake (eyes open, upright), there also appears to be a lack of "voluntary" attention to the environment (not unlike the state of college students during lectures). Yet, these rats will show a greatly enhanced response when given the opportunity to explore a new environment or when presented with specific external stimuli such as loud noises.

The effects of these drugs have been linked to the action of the septohippocampal system (cf., related discussion of behavioral inhibition in Chapters 1 & 4). It appears that the septum contains cholinoceptive neurons that project to the hippocampus, producing the characteristic theta pattern in the hippocampal EEG. This hippocampal theta activity is seen in a variety of situations that involve attention to new aspects of the environment or to changes in the rewarding contingencies of the environment (most notably, nonreinforcement or punishment). The blockade of this system with cholinergic blocking agents results in a variety of deficits that can be characterized as disinhibitory or failures of attention (see Gray, 1970).

E. DRUGS THAT CHANGE PERCEPTION

There is probably no other area of behavioral pharmacology that has so rich a mine of interesting stories. There is, for example, the often quoted account of Hofmann's discovery of LSD, or one of the many stories of pagan (and not so pagan) rituals that involve the drinking of urine to obtain the non-metabolized drug that passed through the first user's body, or a description of colonial soldiers gamboling about under the influence of scopolamine. It is tempting to expand these anecdotes, but the bottom line is that a good pharmacological story cannot be related--there is no specific mechanism of action or neurotransmitter interaction that can account for the effects of these drugs. There are, however, some general considerations about the behavioral changes that should be discussed.

The drugs that alter perception have been somewhat loosely classified on the basis of their ability to produce distorted experiences of the environment. Descriptions of these effects allude to dreamlike states, orgasmic feelings, florid visual imagery, a sort of distortion of time and space, synesthesia (hearing visual stimuli, seeing odors, etc.), a feeling of oneness with the universe, a feeling of separation from the universe, and so on. These experiences, which seldom occur with other drugs or in the absence of drugs, have led to the terms hallucinogenic, psychedelic, psychotomimetic, mind altering, or even mind expanding drugs.

A logical case can be made that all centrally active drugs alter perception. For example, the stimulant and depressant drugs that were just discussed can produce changes in mood and level of arousal. Since an individual's interpretation of the environment is heavily dependent on mood and arousal, changes in perception certainly will occur. In fact, one of the major reasons for the voluntary consumption of drugs like alcohol may be to reduce the perception of anxiety provoking stimuli in the environment. (This is not to say which, if either, set of perceptions is veridical; initially false perceptions fall prey to drug effects as easily as those that can be verified.)

In defense of a special category for the hallucinogenic drugs, the altered perceptions produced by most other classes of drugs are much less profound. This does not mean that the drugs, as a class, share a common mechanism. Their actions are diverse: Scopolamine and related compounds block the cholinergic receptors, LSD and related compounds are serotonergic agonists, the amphetamines and cocaine stimulate systems that use norepinephrine and dopamine, the recently infamous phencyclidine (a.k.a. PCP, PeaCe Pill, angel dust, Hog) influences several different transmitter systems, while the specific actions of the much studied marijuana remain unknown. Any attempt to build an all encompassing theory forces one into complex notions such as the balance among transmitter systems and the unsatisfying conclusion (already made) that any drug would be expected to alter perceptions.

The lack of a common biochemical action is not the only problem encountered in the study of this class of drugs. The behavioral effects have been equally difficult to study. Animal models seem rather silly when one is talking about hallucinations, artistic creativity or oneness with the universe. Nonetheless, there have been reports of monkeys grasping for apparently hallucinated objects in empty space (at least it appeared empty to the investigators), of cats stalking unobservable prey, and of spiders spinning unorthodox webs. This is not to say that animals do not experience and perhaps even appreciate the effects of mind altering drugs, but simply that the effects that are being championed by the users of these drugs are too close to the human experience to make an animal model very useful. But the problem is not solved by turning to human studies. Objective measures of performance (even creativity and imagery) can be obtained, but many of the changes must rely on subjective reports, and when the drug is effective, these reports must be obtained from an individual who has an altered interpretation of the environment.

The widespread use of drugs for the purpose of altering perception has led to an unwarranted mystique about the properties of these drugs. A recurrent theme in the description of the drug states is that the experience is unparalleled in the normal condition, except for dream states, hypnotic or meditative trances, and religious rapture. While this is probably true, it should serve to diminish rather than elevate the uniqueness of the drug effects. Hypnotic trances, for example, have been viewed as something of the occult, with both the mind and the body under the direct control of the hypnotist. Careful, objective studies reach less dramatic conclusions, and have shown that the so-called trance includes the full range of normal EEG arousal and that the "feats" of rigidity, induced sensory impairment, and even blister formation are all within the boundaries of phenomena that can be done on command by many non-hypnotized individuals or by individuals pretending to be hypnotized (cf., Dalal & Barber, 1972; Orne, 1979).

The normal non-drugged state may also be overrated in terms of its ability to provide a constant and accurate view of the world. Illusions abound. One need only lie down beside a church and gaze skyward to see the tall spire apparently falling continuously as the clouds sweep by. If one were to nurture this illusion in the same fashion as a drug induced effect, it might well turn into a religious experience.

The misconception about all of these phenomena is that there is something inherent in the drug or hypnotic induction that introduces these experiences. But it is not like bringing in a motion picture reel from some mysterious external source. To borrow and paraphrase an old adage about computer data processing ("crap in; crap out"), the brain has only its own information to work on. Altered states of consciousness induced by hypnosis, meditation, drugs, or sleep can do nothing more than rearrange or reinterpret past experiences in the context of the ongoing environment. In the normal, alert waking state, our nervous systems have a long history of selection that has favored a slightly distorted (rocks really do fall faster than feathers) Newtonian interpretation of the universe. The simplest conclusion from all of this appears to be the unsatisfying one that we described earlier: The disruption of any of the major systems that are involved in arousal (ACh, 5-HT, NE and DA) can alter these perceptions in strange and sometimes rewarding fashion.

F. SUMMARY

Principles

1. The daily, cyclic activity of the brain is accompanied by changes in mood, arousal, and a variety of physiological changes.

2. The EEG continues to be active during the daily periods of sleep. This EEG is characterized by 90-min cycles that coincide with periods of REM and dream reports.

3. The ascending reticular activating system is an important brain structure for the maintenance of arousal.

4. Dopamine, norepinephrine and acetylcholine have all been related to arousal; serotonin has been related to sleep.

5. Drug effects change dramatically as a function of the background level of arousal at the time of administration.

6. Strychnine, picrotoxin and pentylenetetrazol produce CNS stimulation by blocking inhibition or reducing recovery time between action potentials.

7. The xanthine derivatives increase arousal by increasing calcium permeability.

8. Nicotine mimics acetylcholine and produces arousal by acting on CNS neurons as well as stimulation of autonomic ganglia.

9. Cocaine and amphetamines are sympathomimetic and increase arousal and the effects of reward.

10. Cocaine and related local anesthetics block sodium channels and interfere with the propagation of action potentials.

11. The hypnotic and sedative drugs reduce arousal and may induce sleep, but they may also interfere with REM sleep. They probably act by facilitating the action of GABA.

12. Alcohol produces many of its effects by acting on the same polysynaptic systems that are influenced by the sedative and hypnotic drugs, probably through the GABA receptor complex.

13. Anticholinergic drugs reduce arousal levels, but also interfere with behavioral inhibition and attention.

14. Hallucinogenic drugs do not seem to fall into a class in terms of the cell population or neurotransmitters that are affected.

Terms

2-deoxy glucose

Acetaldehyde

Acetaldehyde dehydrogenase

Alcohol

Alpha waves

Amphetamine

Anticholinergic

ARAS

Atropine

Barbiturate

Benzodiazepine

Ca++ channels

Caffeine

Cholinomimetic

Circadian rhythm

Cl- channels

Cocaine

Delta waves

Disulfiram

GABA

GABA receptor complex

Ganglionic stimulation

Glycine

Hippocampal theta

LSD

Melatonin

Na+ channels

Nicotine

Pentylenetetrazol

Phencyclidine

Picrotoxin

Pineal gland

Postsynaptic inhibition

Presynaptic inhibition

Raphe nucleus

REM sleep

Renshaw cell

Reticular formation

Scopolamine

Serotonin

Suprachiasmatic nucleus

Sympathomimetic

Tetanus toxin

Theobromine

Theophylline

Theta waves

Tyrosine hydroxylase

Xanthines

Yerkes-Dodson Law