There are many factors that influence the quality and quantity of pain, the most obvious being
the physical intensity of the stimulus that is producing the pain. But other parameters also can
have powerful influences on the interpretation of pain. The acute pain produced by hitting a
thumb with a hammer may be less bothersome than the milder throbbing of a chronically injured
knee. Mowrer (1956) put forth a model of fear which, among other
things,
postulated that the
aversive quality of an intense stimulus included both pain and fear of more pain. Mowrer's
theory was complicated and difficult to support, but it presaged by a couple of decades the
important distinction between predictable and unpredictable pain.
The involvement of pain in a wide range of human conditions has led to all sorts of remedies
for
the reduction of pain. Some of these are behavioral and almost reflexive in nature, such as
sucking on a thumb that was hit or grasping a barked shin with the hands. This application of
pressure may have some physiological basis for pain reduction because of the stimulation of
alternative pathways that may actively compete with the processing of pain information. Along
the same lines, we may even bite a knuckle to reduce pain in the foot or, in the frontier tradition,
bite a bullet instead of a knuckle. The application of cold or heat is also widely prescribed as a
physical means of reducing pain.
The ancient folk remedies also include the pharmacologic reduction of pain, the most notable
of
which are alcohol and the opiate compounds. As modern pharmacology began the systematic
search for drugs, the anesthetic compounds (both local and general) were developed.
Interestingly, there was some reluctance to use these compounds, partly because of the
unknown actions of these drugs, and partly because of the notion that pain was an important
part of the healing response. These objections were soon cast aside, however, and drugs that
offer pain relief now comprise a major portion of both the prescription and nonprescription
pharmaceutical industry.
The most exciting developments in pain research during recent years have not been in the
discovery of drugs, but rather in the emerging story of how the body reacts to pain. As
indicated above, pain is essential to allow the organism to minimize exposure to adverse
environments. But once this alerting function has been accomplished, there is a diminished need
for a continuation of the painful stimulus. (It is not necessary to continue feeling the full impact
of the hammer on one's thumb to make one more careful in the future!) Toward these ends, it
seems that there are mechanisms for the reduction of continued pain. These findings will form
the foundation for the present chapter.
The investigation of pain (and its counterpart, pain relief) requires the systematic and
quantitative measurement of the phenomena. In humans, this has been approximated by
measuring thresholds of reported pain under carefully controlled conditions. These thresholds
are not discrete, but can be easily influenced by prior experience, instructions, social
expectations, and a variety of other factors. Although this complicates research, it does not
necessarily mean that the measure is faulty, but probably reflects the very real changes in the
perceptions of pain that are engendered by these conditions.
A number of different procedures have been developed to assess the pain threshold of
experimental animals. One of the more interesting of these is the flinch/jump test, which
exposes a rat to several series of shocks, in ascending and descending intensities (Evans,
1961). With increasing intensities, for example, there will be a range of low shock levels to
which the rat does not respond. Then, as the intensity increases, the rat will begin to show a
slight flinch with each brief shock presentation. With further increases in intensity, the rat will
actually jump when the shock is presented, the criterion usually being that at least three of the
rat's feet leave the grid floor. Once these responses have been determined, the sequence is
reversed and descending shock intensities are delivered until the jump response disappears and,
with further decreases, the flinch response disappears. This procedure produces reliable
threshold determinations for both the flinch response, which is interpreted as the lowest shock
level that is detectable as pain, and the jump response, which is interpreted as the lowest level
of shock that produces an emotional response to the pain. The test can discriminate, for
example, between a drug that locally blocks nerve conduction, and a centrally active analgesic
drug that reduces the impact of the pain (see Fig. 5-1). (Some of the
readers may have had
pain thresholds established by a dentist to determine the relative health of two or more teeth,
and will be able to appreciate this difference between simple detection and an emotional
response.) Despite the theoretical advantages of the flinch/jump test, this procedure has not
been used routinely, because it requires an observer to make a subjective evaluation of whether
a particular response is a big flinch or a small jump--not as easy as it might seem. Furthermore,
the test is rather tedious and time consuming to administer.
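For readers who prefer to see the scoring logic spelled out, the following Python sketch shows one way flinch and jump thresholds could be estimated from a single ascending and descending series. The intensity values, the response codes, and the convention of averaging the two crossing points are illustrative assumptions, not the published scoring rules.

# A minimal sketch (not the published protocol) of estimating flinch and
# jump thresholds from one ascending and one descending series of shocks.
# The data and the averaging convention are assumptions for illustration.

from statistics import mean

# Hypothetical data: (shock intensity in mA, observed response).
ascending = [(0.1, "none"), (0.2, "none"), (0.3, "flinch"),
             (0.4, "flinch"), (0.5, "jump"), (0.6, "jump")]
descending = [(0.6, "jump"), (0.5, "jump"), (0.4, "flinch"),
              (0.3, "flinch"), (0.2, "none"), (0.1, "none")]

def first_intensity(series, responses):
    """Return the first intensity in the series whose response is in `responses`."""
    for intensity, response in series:
        if response in responses:
            return intensity
    return None

# Ascending series: lowest intensity producing each response.
flinch_up = first_intensity(ascending, {"flinch", "jump"})
jump_up = first_intensity(ascending, {"jump"})

# Descending series: lowest intensity at which each response is still present.
flinch_down = min(i for i, r in descending if r in {"flinch", "jump"})
jump_down = min(i for i, r in descending if r == "jump")

flinch_threshold = mean([flinch_up, flinch_down])   # detection of pain
jump_threshold = mean([jump_up, jump_down])         # emotional response to pain

print(f"Flinch threshold: {flinch_threshold:.2f} mA")
print(f"Jump threshold:   {jump_threshold:.2f} mA")
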
Two simpler tests, both of which use heat as the pain eliciting stimulus, have been used more
frequently. One of these is the paw lick test, which involves
placing the rat on a specially
constructed metal plate which is maintained at a constant temperature. The temperature is set
low so that the rat can remain on the plate for several seconds before it becomes painful
(something akin to the handle of a skillet, which may seem only warm at first, but nonetheless
adds a briskness to one's step when carrying it across the kitchen). The measure of the pain
response is the latency from the time the rat is placed on the plate until it licks its front paw. An
even simpler measure is the tail flick response, which can be
automated for objective
measurement. In this test, the rat (or mouse) is placed in a small restraining cage and its tail is
pressed lightly into a groove. A source of heat (usually a light bulb) is directed to the underside
of the tail. When the heat reaches the pain threshold, the tail is flicked out of the groove, and
the latency between the onset of the heat and the tail flick is automatically recorded. Both of
these tests produce reliable measurements of thresholds, and have become the standard tests
for determining the analgesic properties of drugs or behavioral treatments (see Fig. 5-2).
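As a rough illustration of how such latency data are summarized, the short sketch below compares hypothetical tail-flick latencies from a control group and a drug-treated group. The latency values, group labels, and the 10-second cutoff are assumptions made for the example; real protocols do impose some cutoff so that an unresponsive (analgesic) animal's tail is not injured.

# A minimal sketch of summarizing automated tail-flick data. All numbers
# below are hypothetical.

from statistics import mean

CUTOFF_S = 10.0  # assumed maximum trial duration (seconds)

def summarize(latencies):
    """Clip each latency at the cutoff and return the group mean."""
    return mean(min(l, CUTOFF_S) for l in latencies)

saline_latencies = [2.8, 3.1, 2.5, 3.4, 2.9]       # hypothetical control data
morphine_latencies = [7.9, 9.2, 10.0, 8.4, 10.0]   # hypothetical analgesic data

baseline = summarize(saline_latencies)
treated = summarize(morphine_latencies)

print(f"Mean latency, control:   {baseline:.1f} s")
print(f"Mean latency, treatment: {treated:.1f} s")
print(f"Analgesia index (treatment / control): {treated / baseline:.1f}x")
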
We turn now to a discussion of the research that has focused upon analgesia, the relief of
pain.
The Discovery of Opiate Receptors
Opium, an extract of the poppy plant, was ensured a place in history many centuries ago
through the writings and the art work of early civilizations. Loosely translated, the term
narcotic means "numbing" and probably refers both to the direct
analgesic properties of this
compound and to the more general depressant or sedative properties of the drug in larger
dosages. Because of its unique powers and potential for abuse, opium and its derivatives have
been the subject of literature, art work, legislation and even wars. Against this backdrop of
human drama, a research story has unfolded, the results of which may have more far reaching
consequences than all of these other aspects.
The production of opium as a drug was mastered long before there existed any formal
knowledge of pharmacology. The harvesting of the poppy pods and the procedures for
concentrating and, to some extent, purifying the opium have been known for centuries. In the
1500's a Swiss physician, Paracelsus, prepared a relatively pure extract, laudanum, which is still
used today. The isolation and chemical identification of the active ingredients did not occur until
the 1800's, when morphine (which comprises about 10% of
dried
opium powder) was
isolated and codeine (which comprises less than 1% of opium
powder) was identified. Each of
these components is an effective analgesic, and each has substantial potential for abuse.
Ironically, when analogues of these compounds were synthesized in the laboratory, one of
them, heroin, was hailed as the "hero drug" that could relieve pain
without causing addiction!
The narcotic drugs produce an excellent blend of direct pain reduction and the attenuation of
the psychological trauma associated with pain. These effects are especially desirable for acute
and severe pain associated with injuries, most notably those that occur on the battlefield. As
information about the various neurotransmitters and their receptor specificity started to unfold
(cf., Chapter 2), researchers began to search for the mechanism of action of the narcotic drugs.
The basic question was "Which transmitter substance is mimicked, blocked, or otherwise
modified by the opioid drugs?"
The answer, curiously enough, was none of the above. Although specific transmitters (e.g.,
histamine and serotonin) had been linked to pain, the narcotic drugs did not appear to interact
directly with these systems. As neuropharmacological techniques became more sophisticated,
it became possible to isolate and identify specific receptors through a procedure that measures
receptor binding. The procedure (shown in Fig. 5-3) is complicated, but it can be
summarized as follows (see also related discussions of receptor binding in chapters 5 and 6):
The compound in question is prepared in radioactive form and injected into an experimental
animal. At some later time, usually calculated to coincide with known times for maximum action
of the drug, the brain is homogenized and treated in various chemical and physical (e.g.,
centrifugation) ways until a relatively pure sample of the radioactive compound and its attached
cellular components has been formed. This substance is, for all practical purposes, the original
drug and the brain's receptors for the drug.
One of the problems with the type of experiment outlined above is that a considerable amount
of nonspecific binding can also occur, rendering the results meaningless. Avram Goldstein's
laboratory (cf., Goldstein et al, 1971) had developed procedures that
involved special
washing of the brain tissue, along with very small amounts of the radioactive drug. Using these
procedures and a specific antagonist of opiates, naloxone, Pert and Snyder (1973) were able
to show specific binding sites in the brain. Comparison of the relative potencies of various
opiate mimickers and blockers showed a close correlation with the ability to bind to these
receptors (Fig. 5-4), confirming the notion that these were the
receptors
that are normally
involved in the action of opiate pain relievers (cf., Jaffe & Martin,
1980; Snyder, 1978).
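The logic of these binding studies can be summarized in two steps: estimate the specific (displaceable) binding, and then ask whether binding strength tracks pharmacological potency across a series of compounds. The sketch below illustrates that logic with invented numbers; the subtraction convention for nonspecific binding and the log-log correlation are common practices assumed here for illustration, not the specific analysis of Pert and Snyder.

# A rough sketch of the reasoning behind the binding experiments, not the
# actual Pert and Snyder protocol. Specific binding is estimated by the
# common convention of subtracting nonspecific binding (the binding that
# remains in the presence of excess unlabeled competitor). All compound
# names, counts, and potencies are invented.

from math import log10
from statistics import mean

total_binding = 1200.0       # radioactive counts bound, hypothetical
nonspecific_binding = 150.0  # counts remaining with excess unlabeled drug
specific_binding = total_binding - nonspecific_binding
print(f"Specific binding: {specific_binding:.0f} counts")

# Hypothetical opiate compounds: (binding affinity, analgesic potency),
# both in arbitrary units spanning several orders of magnitude.
compounds = {
    "compound A": (300.0, 250.0),
    "compound B": (40.0, 35.0),
    "compound C": (6.0, 8.0),
    "compound D": (0.5, 0.4),
}

log_affinity = [log10(a) for a, _ in compounds.values()]
log_potency = [log10(p) for _, p in compounds.values()]

def pearson(xs, ys):
    """Pearson correlation coefficient computed with the standard library."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

print(f"Correlation of log affinity with log potency: {pearson(log_affinity, log_potency):.2f}")
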
But why should the brain have receptors for an extract of the opium poppy? The only logical
answer is that the brain does not have receptors for opium. Rather, the brain must have
receptors for compounds produced by the body (endogenous compounds) which happen to
share a chemical similarity with the narcotic compounds. Given this conclusion, the race was on
to find these chemicals in the body, and to delineate the conditions under which they are
released.
In 1975, two groups of investigators (Hughes and Kosterlitz from Scotland and Simantov and
Snyder from the United States) independently isolated two substances from pig brain and calf
brain that had specific morphine-like properties. Hughes and his associates
(1975) dubbed
these substances enkephalins, from the Greek meaning "in the
head". The substances were
small peptides consisting of five amino acids each:
Tyr-Gly-Gly-Phe-Met, and
Tyr-Gly-Gly-Phe-Leu
A decade earlier, Li and associates (1965) had isolated a large
pituitary hormone which they
called beta-lipotropin (so named because it induces fat
metabolism). When the structure of
this molecule was shown at a convention, one of Hughes' associates, Howard Morris, was in
the audience and made a remarkable observation that can only be likened to recognizing a
familiar face in a crowd--he noticed the sequence Tyr-Gly-Gly-Phe-Met (i.e., met-enkephalin)
embedded in the middle of the long molecule. (See how long it takes you to find it in Figure
5-5 even when you know it is there!) It is suspected that this pituitary hormone may serve
as a
precursor for at least some of the smaller enkephalins that are formed in the brain.
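Morris's feat amounted to spotting a five-residue pattern inside a 91-residue hormone by eye. The toy sketch below does the same thing mechanically; the flanking residues are placeholders ("Xaa") rather than the real beta-lipotropin sequence, which is not reproduced here, and the motif is placed at residues 61-65, the position met-enkephalin occupies in beta-lipotropin.

# A small sketch of the pattern matching that Morris performed by eye:
# locating the five-residue met-enkephalin sequence inside a longer
# peptide. The flanking "Xaa" residues are placeholders, not the real
# beta-lipotropin sequence.

MET_ENKEPHALIN = ["Tyr", "Gly", "Gly", "Phe", "Met"]

# Placeholder 91-residue chain with the enkephalin sequence embedded
# at residues 61-65.
long_peptide = ["Xaa"] * 60 + MET_ENKEPHALIN + ["Xaa"] * 26

def find_subsequence(chain, motif):
    """Return the 1-based position where `motif` begins in `chain`, or None."""
    for start in range(len(chain) - len(motif) + 1):
        if chain[start:start + len(motif)] == motif:
            return start + 1
    return None

position = find_subsequence(long_peptide, MET_ENKEPHALIN)
print(f"Met-enkephalin begins at residue {position}")  # -> residue 61
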
Later studies have shown that the beta-lipotropin molecule not only contains the
met-enkephalin
sequence, but also several other sequences that are significant to stressful conditions. Positions
4-10 form the sequence for ACTH, while positions 61-76, 61-77, and
61-91 contain the
sequences for alpha-, gamma-, and beta-endorphin, respectively. Because all of these
compounds have morphine-like properties, they have been termed endorphins as a contraction
for endogenous morphines (Simantov & Snyder, 1976). As
indicated
in Figure 5-5, the
beta-lipotropin is released primarily from the intermediate lobe of the pituitary.
The receptors for these endogenous compounds are located in logically appropriate places. In
particular, they tend to be highly concentrated in the limbic
system (which is involved with
emotional responses), and in the periaqueductal gray
area
of the brain stem (which is strongly
implicated in pain circuitry).
In summary, there appears to be a system within the brain that can produce opioid compounds
and there are specific receptors located in appropriate regions of the brain. We turn now to
behavioral experiments that demonstrate the action of these systems.
The major sources of hormonal opiates are the pituitary (especially the intermediate lobe) and the adrenal gland (apparently both the medulla and the cortex of the adrenal). As in the case of other hormones, these opioid compounds are released into the bloodstream and can have their effects on widely dispersed target sites.
There are two major sources of neuronal opiates (i.e., opiates that are released at synapses as
neurotransmitters). The arcuate nuclei of the
hypothalamus
contain a population of cells that
are anatomically connected to the limbic system and the periaqueductal gray areas of the brain
stem (areas that have been shown to have large numbers of opiate receptors). The second
source is an opioid link in the descending periaqueductal
gray system. In this case, the opioid
transmitter substance inhibits cells that transmit pain signals to the thalamus.
The periaqueductal gray system has a component of fibers that does not utilize an opioid
transmitter substance, and there apparently are pain inhibiting compounds (of unknown origin)
that are released into the bloodstream as hormones. There is relatively little information about
either of these systems, and they are typically discussed in terms of what they are not (i.e.,
nonopioid) rather than what they are.
The evidence for these four types of analgesia comes from many different experiments, but
only
a few need be described to show the rationale of these conclusions.
Direct electrical stimulation of the periaqueductal gray system through implanted electrodes
produces analgesia. For example, the latency to flick the tail away from a heat source is
significantly increased. But what type of analgesia does this represent? The standard test is to
determine the response to an opiate antagonist, usually naloxone or naltrexone. If a compound
that is known to block the analgesic effects of morphine or opium also blocks the analgesia that
is produced by electrical stimulation, then it seems likely that the stimulation is causing the
release of endogenous opiates. Naloxone blocks the analgesia produced by periaqueductal
gray stimulation. Further evidence for the opioid nature of this effect comes from drug tests that
involve morphine. The analgesia produced by electrical stimulation can be enhanced by the
administration of morphine. Furthermore, animals that have been rendered tolerant to the
effects of morphine show less analgesia with stimulation, and animals that have become tolerant
to the effects of electrical stimulation are less responsive to the effects of morphine. Thus, there
are three converging results that support the notion that periaqueductal gray stimulation
produces analgesia via the release of endogenous opiates:
(a) Blockade of the effect by naloxone,
(b) Synergism with morphine, and
(c) Cross-tolerance with morphine.
It has also been shown that pain itself is a potent stimulus for the production of analgesia.
Watkins and Mayer, for example, have shown that electric shock delivered to the front paws of
a rat will produce analgesia, as measured by the tail flick response. The administration of
naloxone just before the foot shock will abolish this effect. (Naloxone administered after the
foot shock does not diminish the analgesia, suggesting that the effect is triggered by endorphins,
but not necessarily sustained by them.) The foot shock induced analgesia also shows cross
tolerance to morphine, as would be expected from the discussion above.
These naloxone tests confirm the opioid nature of the response, but do not show whether the
opiates are neural or hormonal. This distinction is made on the basis of additional experiments.
The removal of the pituitary and/or the adrenal gland (the major sources of hormonal opiates)
does not diminish the analgesia. Furthermore, transection of the dorsolateral funiculus (the
pathway through which the descending periaqueductal gray fibers pass) abolishes the analgesia.
Finally, it has been shown that the direct application of naloxone to spinal neurons in the sacral
region (i.e., those serving the tail) will prevent the development of analgesia produced by the
shock to the front paws. Thus, all of these data converge to suggest that the front paw shock
produces analgesia via the small "opioid link" shown in Figure 5-6.
The next aspect of the story seems rather bizarre at first, but the results have been consistent
in
the hands of Watkins and Mayer. They have shown repeatedly that rear foot shock also
produces analgesia, but the effect is not blocked by naloxone. Removal of the pituitary or
adrenal is also ineffective, suggesting a neural rather than a hormonal effect. The fibers involved
also appear to travel through the dorsolateral funiculus, but from a different origin within the
brainstem. These data imply that the pain reduction is neural, but non-opiate in nature.
Prolonged foot shock and/or immobilization produce analgesia that is blocked by naloxone
and
significantly reduced by removal of the pituitary or adrenal glands (e.g., Lewis et
al, 1982).
These data support the notion of hormonal opiates that are released by these endocrine glands
into the bloodstream.
Although the details of the system are yet to be described, there is evidence for hormonal
systems of analgesia that do not involve opioid compounds. Some types of environmental
stressors (e.g., cold water swims) can produce an analgesic effect that is not blocked by
naloxone, but which requires the integrity of the pituitary and adrenal glands.
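Taken together, these experiments use a small set of converging tests to place an analgesic effect into one of four categories. The sketch below compresses that reasoning into two yes/no questions; in practice the classification rests on converging evidence (cross-tolerance, pathway transections, and so on) rather than a two-item checklist, so treat this as a schematic summary only.

# A schematic decision rule summarizing how the experiments described
# above are interpreted. It compresses converging evidence (naloxone
# blockade, cross-tolerance with morphine, gland removal, pathway
# transection) into two yes/no questions, a simplification made here
# for illustration.

def classify_analgesia(blocked_by_naloxone: bool,
                       abolished_by_gland_removal: bool) -> str:
    """Classify an analgesic effect into one of the four categories in the text."""
    chemistry = "opioid" if blocked_by_naloxone else "nonopioid"
    route = "hormonal" if abolished_by_gland_removal else "neural"
    return f"{chemistry}, {route}"

# Examples drawn from the experiments described above:
print(classify_analgesia(True, False))    # front paw shock -> opioid, neural
print(classify_analgesia(False, False))   # rear foot shock -> nonopioid, neural
print(classify_analgesia(True, True))     # prolonged shock/immobilization -> opioid, hormonal
print(classify_analgesia(False, True))    # cold water swim -> nonopioid, hormonal
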
These systems of analgesia are peculiar in that different types of pain and different body
locations of pain seem to activate different types of analgesia. But why should the rear feet be
connected to a different system than the front feet? And why should brief shocks produce one
effect, while prolonged shock produces another? These peculiarities may be more apparent
than real: The differing procedures may involve differing levels of shock. An attractively simple
model of this has been proposed by Forman and Kelsey (personal communication) in a schema
that relates the type of analgesia to the impact of the aversive stimulus (some measure of both
the intensity and the duration of the painful stimulus). As shown in Figure
5-7, this model
suggests that increasing the impact of the painful stimulus can determine which of the four types
of analgesia will be produced, with neural opiate analgesia being the easiest to elicit, and
hormonal opiate analgesia being elicited only by prolonged or severe pain. This is consistent
with much of the experimental literature, including the forepaw/hindpaw phenomenon if one
argues that it is easier for the rat to lift the front paws and reduce the impact of the shock
(hence, neural opiate rather than neural non-opiate).
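A minimal sketch of the Forman and Kelsey schema might look like the following, treating impact as the product of intensity and duration. The numerical boundaries and the ordering of the two middle categories are invented for the example; the text specifies only that neural opiate analgesia is the easiest to elicit and that hormonal opiate analgesia requires the most prolonged or severe stimulation.

# A sketch of the impact schema described above. Impact is treated as
# intensity x duration; the band boundaries (and the ordering of the two
# middle categories) are hypothetical.

def stimulus_impact(intensity: float, duration_s: float) -> float:
    """Crude impact score: the product of intensity and duration."""
    return intensity * duration_s

# (upper impact bound, type of analgesia) -- boundaries are hypothetical.
IMPACT_BANDS = [
    (10.0, "neural opiate"),
    (30.0, "neural non-opiate"),
    (60.0, "hormonal non-opiate"),
    (float("inf"), "hormonal opiate"),
]

def predicted_analgesia(intensity: float, duration_s: float) -> str:
    impact = stimulus_impact(intensity, duration_s)
    for upper_bound, analgesia_type in IMPACT_BANDS:
        if impact <= upper_bound:
            return analgesia_type
    return "unknown"

print(predicted_analgesia(1.0, 5.0))    # brief front paw shock -> neural opiate
print(predicted_analgesia(2.0, 120.0))  # prolonged shock or immobilization -> hormonal opiate
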
A final observation is that all analgesic effects that are based on conditioning (e.g., a
reminder of
a previous shock can produce analgesia) seem to be based on opiates and can be blocked by
naloxone. This appears to be the case even when the original painful stimulus resulted in
non-opiate
analgesia.
Behavioral Effects on Pain Reduction
Interpretation of pain
The experiments outlined above clearly show that aversive stimulation can produce changes in
pain perception. This important influence of the environment on something so fundamental as
pain perception came as somewhat of a surprise, but was only the tip of the iceberg in terms of
the behavioral interactions that were to be demonstrated. As in the phenomena discussed in the
previous chapters, the interpretation of the environment has a profound effect on these systems
of analgesia.
The triad design, once again, has been useful in demonstrating the role of behavioral variables
in
producing analgesia. Although a number of different investigators have used this procedure, a
particularly efficient application of the procedure has been developed by Kelsey and his
associates (cf., Forman & Kelsey, personal communication). In this procedure, the rats are
placed in small restraining cages that have a wheel which can be turned by the front paws. The
protruding tail has electrodes attached for delivery of electric shock and can also be positioned
in a groove that has a heat lamp for measuring tail-flick latencies in the same session. The
typical triad design includes the non-shocked control group, a group that can escape the shock,
and a yoked group that receives inescapable shocks. As might be expected from the results
discussed in previous contexts, the rats in the first two conditions show no analgesia, whereas
the rats that have no control over the shock show a significant increase in the tail flick latencies
(see Fig. 5-8).
It should also be noted that the role of the endogenous opiates goes beyond that of mediating
the response to direct, painful stimuli from the external environment. A particularly illuminating
experiment has shown the effects of social interactions on these systems (Miczek, Thompson
& Shuster, 1982). These investigators allowed mice to establish "residence" in their
home
cages over a period of time. (This involves marking the territory with odors, etc.) They then
introduced another mouse into this home territory. Almost invariably, the "intruder" mouse gets
attacked and defeated under these conditions, even if the intruder has a "height and weight"
advantage. These interactions involve some actual biting of the intruder, as well as a great deal
of species specific social postural signals for dominance (on the part of the resident) and
submission (on the part of the intruder). The intruder was placed into the resident cage on 10
trials, and allowed to remain each time until 10 bites had been received. As shown in Fig. 5-9,
the exposure to defeat led to increasing amounts of analgesia as measured by tail flick latencies.
This analgesia declined within an hour after the last session.
Control experiments showed the specificity of this effect by demonstrating that "intruding"
into
an empty cage had no effect, even if the mouse was "bitten" by forceps while in the cage.
Apparently, the social trauma of defeat is an important aspect of this phenomenon.
This effect was shown to be the result of opiate release by the administration of naltrexone,
which blocked the development of analgesia. Furthermore, the opiates appear to be playing a
role in the central nervous system rather than at peripheral effectors, because the administration of
a quaternary form of naltrexone, which does not cross the blood brain barrier, did not block the
analgesic effect.
Finally, these investigators showed cross tolerance to morphine. In one case, they exposed
the
animals to the social defeat for 14 successive days, by which time the tail flick latencies had
returned to normal (i.e., the mice appeared to be tolerant to the effects of defeat). After this
series of defeats, morphine was also ineffective in changing the tail flick response. Turning the
procedure around, they implanted time-release morphine pellets that slowly release the
analgesic drug over a period of one week. At this time, the mice were placed into resident
cages as intruders. The experience of defeat did not change the tail flick latencies. Thus, the
cross-tolerance between morphine administration and exposure to defeat was demonstrated
(Fig. 5-10).
Although the Western world was reluctant to accept the validity of acupuncture, a related
procedure had been used unwittingly in veterinary practice for many years. The procedure
known as twitching involves the firm squeezing of a horse's upper lip in a rope noose. After a
few minutes, the horse appears groggy, and can undergo minor surgical procedures with no
evidence of pain. This had been interpreted as distracting the horse's attention from the pain of
the surgery, but it has recently been shown to increase endorphin levels and the effect can be
blocked by naloxone (Lagerweij et al, 1984). It is also interesting to
note
that long distance
runners can sometimes find relief from painful side aches ("stitches") by pinching the upper lip.
The term placebo means "I will please", and has long been
used to
designate an inactive
compound that is administered as though it is an effective drug. In some cases, this is done
quite voluntarily, while in other cases, both patient and physician may be misled. In any event,
there is now evidence that the endogenous opiates play a role in at least some of the placebo
effects. Dental pain is once again the testing ground. In a second series of experiments,
Mayer and associates (1976) found that some, but not all, patients
showed
a reduction in the
pain that they experienced after they were given medication that was claimed to be a pain
reliever. For those patients who responded with a reduction in pain, naloxone blocked the
effect and caused an increase in pain. For those who showed no effect of the placebo, the
opiate blocker was without effect (Fig. 5-11). Although a lot of
additional work needs to be
done, it is probably not the case that placebos work for some people but not others. More
likely, placebos probably work for virtually everybody, but the conditions under which they are
likely to work may differ from individual to individual.
An Overview of the Pain Response
The evidence is clear that pain bears an uneven relationship to the actual physical intensity of the
stimulus. When an aversive stimulus is first presented, it almost always produces an immediate
behavioral response, and it may even be accurate to view this as a reflexive response. But the
processes that change the perception are triggered almost immediately. The exposure to the
painful stimulus triggers feedback mechanisms that dull the intensity of the pain. The ability or
lack of ability to control the painful stimulus further modulates this perceptual change. In the
next section of this chapter, it will be shown that the experience and interpretation of painful
events not only changes the perception of pain, but also has far reaching ramifications in terms
of the general ability of the organism to respond to stressful challenges.
If the self really is a unique entity that makes up an individual's being, then it is vital to
maintain
that entity. This requires two fundamental abilities: The individual must be able to discriminate
the boundaries between self and nonself, and must be able to defend against forces that would
blur this distinction.
The biological self would appear, at first glance, to be so obvious that the concept would be
useless. Physical boundaries alone seem to make the distinction between the self and nonself
even plainer than the nose on one's face. The biological self became important when sexual
reproduction was "invented" and a unique set of information about one self (or is it one's self?)
was merged with the information about another self to create a third self. This new importance
of individuality was a benchmark in evolutionary history, but the evolutionary advantages also
introduced more stringent requirements to be able to recognize and defend the self against the
nonself. The immune system is the first line of defense in this task.
The field of immunology is currently one of the most complicated and most active areas of
biological research. The increasing knowledge about this system has all but eliminated several
major scourges of mankind, such as measles, smallpox, polio and diphtheria. We can protect
our selves by taking the appropriate "shots". The importance of this area to behavioral
pharmacology is that the environment and the interpretation of the environment have major
effects on the immune system. This has led Ader and others (e.g., Ader,
1981) to coin the
term psychoneuroimmunology. We turn now to a cursory treatment of the mechanics of the
immune system, to be followed by a discussion of the ways in which behavior can place its
mark upon this system to either fortify or break down the defenses of our biological selves.
The normal role of the immune system is to recognize foreign (i.e., nonself) substances and
create a locally hostile environment that will eliminate them from the system. This can occur in
response to the introduction of disease producing bacteria or viruses, tumor cells, parasites,
transplanted organs, inappropriately matched blood transfusions, and a host of other
substances. In this, its normal role, the immune system is somewhat of a silent warrior, and
does not command the attention of the individual. However, for a sizeable proportion of the
population, the immune system becomes very obvious by virtue of its somewhat inappropriate
response to substances that pose no real threat to the individual. These are the individuals who
suffer from allergies to such things as pollen, cat dander, milk, and jewelry, to name a few.
Even more serious than over responding to a harmless foreign substance is a failure to recognize
self as self. This can occur in a variety of autoimmune diseases such as arthritis, Parkinson's
disease, myasthenia gravis, some forms of diabetes, and others (the list is growing).
There are two major ways in which the immune system can respond to a challenge: One of
these is a humoral response which can occur almost
immediately in a sensitized individual.
This involves substances that were created by the immune system and which circulate in the
bloodstream until they come into contact with the specific foreign body. The other is a cellular
response that involves the proliferation of special blood cells that can react to the specific
foreign substance. This reaction is referred to as the delayed response, because it typically
requires about 48 hours to develop. The two categories of immune response are mediated by
specialized leukocytes (white blood cells) that arise from
nonspecialized stem cells formed in
the bone marrow (B-cells) and in the thymus gland (T-cells). Refer to Fig. 5-12 for a
schematic diagram of the reactions that are being outlined.
Humoral responses.
The humoral response is initiated when a B-lymphocyte recognizes the presence of a foreign
substance, the antigen (refer to Fig. 5-13; after Buisseret, 1982). The magnitude of this
response is partially determined by the effects of T-lymphocytes, which are termed helper or
suppressor T-cells, depending on their role in the interaction. If this initial encounter is
interpreted as a challenge from a foreign body, there is a tremendous increase in the metabolic
activity of the B-cell and it begins to manufacture and release antibodies that are chemically
specific to the original antigen. These antibodies are the immunoglobulins, of which there are
five types and several subtypes. One of these, Immunoglobulin-E, has been strongly linked to
populations that are exposed to parasitic worms. This same form appears to be involved in
allergic responses, presumably because the suppressor T-cells in some individuals allow the
initial reaction between antigen and B-cell to continue.
The immunoglobulins that are released by the B-cells interact with mast cells (so-named
because they appeared to the German investigator to be "stuffed"), large cells that are found in
connective tissues, in the membranes of the intestines, eyes, and respiratory tract, in the skin,
and in the lymph glands. These cells contain numerous granules within their cytoplasm, and
their surface membrane has several hundred thousand receptors for antibodies (i.e.,
immunoglobulins). These receptors are nonspecific, in that they will bind any form of
immunoglobulin to the surface of the mast cell. However, when two specific antibodies occupy
adjacent receptors, the pair forms a highly specific receptor for the original antigen. This sort of
"piggy-back" arrangement is a highly efficient way for a single population of cells, the mast cells,
to develop unique sensitivities to any of thousands of potential antigens that might be
encountered. (This type of arrangement has not been described for neurons, but it is such a
clever mechanism that it would be surprising if it were only used in this one system.) The
number of mast cells that have this sensitivity conferred upon them is roughly related to the
amount of immunoglobulin formed by the B-cells in their first encounter with the antigen.
Once the mast cells have been sensitized, the system is ready for an immediate response when
the original antigen is encountered again. When the receptor pair identifies the antigen, calcium
enters the mast cell and the granules move to the periphery and are released in a manner that is
comparable to the release of a neurotransmitter substance. These granules are comprised of a
group of compounds (histamine, serotonin, heparin, blood platelet activators, etc.), which are
collectively termed the preformed chemical
mediators. These chemicals have widespread
effects on blood vessels, respiratory membranes, smooth muscles, and the blood itself,
producing what is commonly called the hay fever reaction. At the same time, there is also an
increase in the production of prostaglandins (which have complicated effects on respiratory
membranes, mucous secretions, blood clotting factors, etc.) and the leukotrienes. The
leukotrienes are many times more potent than histamine, and are responsible for the asthmatic
symptoms of contracted airways, dilated and leaky small blood vessels, and painful, itchy nerve
endings. In severe cases of allergy, this set of reactions can lead to what is called anaphylactic
shock, including the danger of respiratory collapse and death: Nature's way of telling us that
we have encountered a foreign substance.
It should be noted at this point that virtually all drugs (and foods, for that matter) are foreign
substances, and might be expected to induce an allergic sensitization. Although some drugs can
sensitize an individual and lead to an anaphylactic response (penicillin is one of the most
common examples), these reactions are rather uncommon. The reason for this is that the
immune system is designed to respond to large, nonself proteins. The sensitization process (i.e.,
the development of specific receptors) is based upon some small part of the surface of the
antigen molecule, which is termed the epitope. The presentation of this epitope alone is usually
ineffective. Most drug molecules are small relative to proteins, and even if bound to a protein
after administration, the likelihood of sensitization is rather slim.
Cellular responses.
The cellular immune response involves the sensitization and proliferation of T-leukocytes (see
Fig. 5-14), which mature in the thymus gland. When an antigen is
encountered, it is bound to a
special macrophage which presents it to the T-cell. The T-cell receptors represent what has
been termed the major histocompatibility complex (MHC), which in
some sense, can be
viewed as the "self-template" against which potentially nonself substances can be compared.
The recognition of a foreign substance triggers the release of interleukin-2 which stimulates cell
division in the T-cells and produces a substance called gamma-interferon which stimulates
prostaglandin release, causes fever (which also promotes cell division), and stimulates both the
MHC receptors and the presenting by macrophages. This positive feedback system causes a
marked proliferation of sensitized T-cells that respond to the specific antigen. These sensitized
T-cells release a group of compounds called lymphokines
that
produce a characteristic set of
effects: (a) dilation of small blood vessels resulting in local redness and warmth, (b) leaky
blood vessel walls resulting in swelling, and (c) congregation of macrophages, including
phagocytes for removal of foreign substances, damaged tissues, etc. This proliferation of
T-cells and the associated macrophages accounts for the increase in the white blood cell count
that characterizes the response to infections and injuries.
The immune system obviously is not a static system. It can respond to any of a host of
potentially
threatening molecules in an efficient and specific manner. However, the likelihood that a
particular molecule will trigger a reaction is influenced by both genetic and environmental
factors. Buisseret (e.g., 1982) has studied this rather extensively in
the case
of allergies to
milk. The genetic link is strong, though not complete: If both parents are allergic, the offspring
have a 58% chance of being allergic. If one parent has the milk allergy, the offspring has a 38%
chance of being allergic. The rate is only about 12% for those who do not have a family history
of milk allergies.
Early environmental factors also play an important role. Buisseret
(1976) cites an early study
by Grulee (1943) which showed that 36% of breast fed babies
contracted
an infection of
some sort and 0.13% died as a result. By contrast, babies that were bottle fed were almost
twice as likely to contract an infection (63%) and far more likely to die as a result (7.56%)!
This could be attributed, in part, to more sanitary conditions surrounding breast feeding, but the
bulk of the effect is probably due to the transfer of immunities via the mother's milk (cf.,
Appleton & McGregor, 1984). There is also some evidence that
breast milk may help to
prevent proteins from crossing through the intestinal linings where they can sensitize the immune
system. Buisseret (1978) followed up these results with an
investigation of
the interaction
between type of early feeding, genetics, and the likelihood of showing an allergy to milk. As
shown in Figure 5-15, the breast fed offspring of parents who do not
have
milk allergies have
virtually no chance of developing the allergy. Bottle fed offspring of parents who have allergies
show a 60% likelihood of having the allergy. Clearly, the immune system can be influenced by
both genetics and the early environment. Given this degree of flexibility, it should not be too
surprising that later behavioral influences can also be demonstrated. We turn now to these
effects.
Behavioral Effects on the Immune System
This early work on resistance to tumors has been sharpened considerably by studies that have
looked more closely at the behavioral components. Visintainer and
associates
(1982)
implanted a suspension of tumor cells into the flanks of rats, waited 24 hours, then exposed the
rats to a set of electric shocks in the familiar triad design. A control group received no shock, a
second group received 60 escapable shocks delivered on a variable interval schedule, and a
third group received the same schedule of shocks, but the shocks were inescapable. Only 27%
of the rats that received inescapable shocks rejected the tumor, whereas more than half of the
control group (54%) and the rats that received escapable shock (63%) rejected the tumors.
The rats in the two shock groups received exactly the same number and duration of electric
shocks, but the lack of behavioral control over shock termination doubled the likelihood that the
tumor would get out of control and kill the rat!
The ability to reject a tumor presumably requires the ability to launch a cellular immune
response. This ability has been assessed more directly in studies that have used T-cell
proliferation as the measure of the viability of the immune system. Laudenslager and
associates (1983) used the triad design of shock administration during a single 80-minute
session. At the end of the session, they took blood from the animals, extracted the leukocytes,
and treated the leukocytes with T-cell mitogens (either
concanavalin A
or phytohemagglutinin,
abbreviated CON-A and PHA, respectively). These mitogens stimulate cell division, but the
amount of cell proliferation depends upon the tendency of the cells to proliferate before they
were removed from the system. These investigators demonstrated that exposure to inescapable
shock greatly reduced the amount of T-cell proliferation, whereas the exposure to the same
amount of escapable shock had no effect (Fig. 5-16). These results
demonstrate two important
effects: The lack of a coping response suppresses the immune system's ability to respond, and
this suppression is triggered very quickly (recall that the blood samples were taken immediately
at the end of the session), even though its effects might not become manifest for days or weeks
later (e.g., in the case of a challenge by tumor cells).
Ader and Cohen (1982) demonstrated this possibility using a strain of New Zealand mice that are
genetically predisposed to suffer from systemic lupus erythematosus (SLE), an autoimmune disease.
This disease is used as a model for similar disorders in humans, and involves a breakdown of
connective tissue and a variety of secondary symptoms such as kidney failure. These mice have a
relatively short lifespan which can be prolonged by treatment with cyclophosphamide, a drug that
suppresses the immune system. The two groups that were used for comparison were the untreated
control group (25% of the mice die of their affliction by the time they are 10 weeks old) and a
group that received a standard 8-week regimen of chemotherapy (25% mortality is not reached until
about 35 weeks). If the animals receive only 4 injections (during alternate weeks of the 8-week
treatment period), the chemotherapy is considerably less effective, and the 25% mortality figure
is reached at about 20 weeks. Ader and Cohen used this intermediate level of treatment to test for
the possibility of conditioning. Prior to each drug injection, the mice received a distinctively
flavored saccharin solution. On the intervening weeks, they received the saccharin solution
followed by a placebo injection of saline. The question was, would the immune system be suppressed
because of the prior association of the distinctive flavor with the injection of cyclophosphamide?
The answer was yes: The mice that received these conditioning trials did not reach the 25%
mortality figure until 25 weeks, about five weeks longer than the group that simply received the
staggered drug injections.
MacQueen and associates (1989) used a somewhat different
Pavlovian conditioning
approach to demonstrate the learned release of preformed mediators from mast cells. In this
experiment, they paired an audiovisual cue with the injection of egg albumin (an antigen that
promotes mast cell activity) into rats. Later presentation of the audiovisual cue alone stimulated
the release of the preformed mediators to the same degree as re-exposure to the antigen itself.
The studies that demonstrate changes in the immune system as a result of learning have
important implications for medical treatment. Particularly in the case of chemotherapy for
cancer, drugs such as cyclophosphamide are highly toxic, and conditioning procedures
might be able to greatly reduce the amount of drug that needs to be administered. Similarly, a
long list of ailments including colitis, irritable bowel, asthma, and various food allergies may have
a considerable learned component which can be treated more effectively by behavioral means
than by drugs. Another intriguing possibility suggests that learning can confer a degree of
immunity against the changes in the immune system. For example, McGrath and Kelsey
(personal communication) have preliminary data showing that prior exposure to escapable
shock can protect rats from the suppression of immunity that normally results from shock that
cannot be escaped.
Myasthenia gravis has been strongly linked to an autoimmune disorder which attacks the
receptors for acetylcholine. In 1973, Patrick and Lindstrom extracted
nicotinic receptors
from electric eels and injected a suspension of these receptors into rabbits. The immune
systems of the rabbits recognized the nonself nature of these proteins, and developed
antibodies. However, the antibodies were not sufficiently specific, and reacted not only to the
nicotinic receptors of the eel, but also to the rabbits' own nicotinic receptors. As a result, the rabbits
developed experimental allergic myasthenia gravis (see Fig. 5-17).
It is one thing to show that myasthenia gravis can be mimicked by manipulation of the
immune
system, but this does not necessarily mean that this is the normal cause in humans. However,
the evidence was soon to follow. Almon and associates (1974) found
antireceptor
antibodies in 87% of patients suffering from myasthenia gravis. As a result of these
antibodies,
there was a decline in acetylcholine receptor activity of 70-90 percent. For some reason, these
patients have developed antibodies to their own receptor sites. Unfortunately, knowing the
cause has not provided the cure. Corticosteroids inhibit the immune response (cf., Fig. 5-14),
but such treatment has a host of side effects, many of which are dangerous.
The specific chemical relationships between transmitters, receptor sites, and antibodies
provide
a generous number of complicated ways in which the immune system can cause behavioral
dysfunction. Consider, for example, an experiment by Shechter and associates (1984) that
investigated insulin and insulin receptors, but could be equally relevant to a
neurotransmitter system (see Fig. 5-18). They began the series by
immunizing mice with insulin
from cows or pigs. The mice developed specific antibodies (called idiotypes) to this foreign
substance. These idiotypes showed some of the characteristics of insulin, and also triggered the
formation of anti-idiotypes which were, in effect, antibodies against the mice's own insulin
receptors. As a result, the mice developed symptoms of diabetes.
An unexpected link between hormones, behavior and autoimmunity has been put forth by
Geschwind & Behan (1982). Geschwind formed two groups of
subjects who had been
identified for extreme handedness. Those in the left handed group showed a 12-fold increase in
learning disabilities, but also had a disproportionately high number of artists, musicians,
engineers, and mathematicians (skills that have been linked to right hemispheric functions).
Curiously, 11% of these individuals had some sort of autoimmune disease, whereas only 4% of
the right handers suffered from these disorders. The intriguing link to hormones comes from
several different observations. For reasons that are unknown, testosterone inhibits the growth
of the left hemisphere and of the thymus gland (a major organ of the immune system). The
same region of the chromosome that determines the major histocompatibility complex (the
receptor that determines self vs. nonself) may also influence the weight of the testes, serum
testosterone levels, and the sensitivity of the receptors to testosterone.
At the present time, many of these experiments and observations offer little more than hints at
possible effects. But the results are tantalizing, and it is very likely that within a few years it will
become increasingly clear that the immune system is integrally involved in the causes of many
behavioral disorders and in the response to drugs, and that behavior can, in turn, modify these
interactions.
The different responses that occur when coping is possible versus when it is not are
paralleled by the general anatomical and physiological features of the brain regions that mediate
emotional responding (see Figure 5-19). The structures of the limbic
system appear to be
primarily responsible for analyzing the emotional tenor of our environment. The structures of
the limbic system receive information from both the outside world and the body, analyze this
information, and contribute to the body's response to the situation via messages to the
hypothalamus and pituitary. These structures are involved in many aspects of behavior, but it is
instructive, in the present context, to analyze the types of responses that are mediated by the
anterior and posterior portions of these two structures.
The posterior regions of the hypothalamus and pituitary are responsible for sympathetic
arousal,
including the stimulation of the adrenal medulla (to release E, NE, DA, and endorphins) and the
adrenal cortex (to release mineralocorticoids and stimulate inflammatory responses). There is a
tendency to view these responses in a negative light because of the nature of the situations that
lead to these responses, but in fact, these are energizing responses that improve the
organism's ability to interact with its environment. Appropriately, these responses occur
when the environment affords the opportunity for something effective to be done-- for example,
fighting, fleeing, or coping.
The anterior portions are normally involved with a variety of constructive functions:
adjustments
of the vegetative responses of the parasympathetic system, production and regulation of a
variety of hormonal systems, and so forth. But when the organism confronts an aversive
situation for which there is no obvious coping response, these anterior regions overreact and
produce a physiological environment that causes tissue damage or otherwise disrupts bodily
functions. The outflow of the parasympathetic system erodes the ability to mount a physical
response, and the adrenal gland releases glucocorticoids which, in high concentrations, can cause
tissue damage. The release of endorphins dampens the organism's ability to monitor the
environment and may contribute to the suppression of the immune system.
In 1932, Cannon wrote a book entitled The Wisdom of the
Body in which he repeatedly
demonstrated the adaptive and appropriate responses of the body to such things as hunger,
thirst, exercise, danger, and so forth. His arguments were and remain convincing-- the
accuracy and complexity of the body's responses are nothing less than awe-inspiring. It is
difficult, however, to see the wisdom of some of these responses to situations that do not offer
prediction or control. One might argue that the body (like the proverbial customer) is always
right, and that we do not see the situation with sufficient clarity to recognize the benefits.
Alternatively, one might argue that the ability to interact effectively with the environment is so
crucial to the survival of a complex organism that the system fails to function when these
conditions do not prevail. (This latter view will receive further support in the next chapter.)
The diversity of the behavioral and physiological responses has complicated and enriched the
ways in which we view drug effects. The phenothiazines, the benzodiazepines, and even the
beta adrenergic blockers can each decrease the emotional response, but the mechanisms make
more sense when viewed within the appropriate behavioral and neurochemical context. Opium
and related compounds are not just blocking pain; they are interacting with a multifaceted
system that uses the body's own opiates. The important interactions of the immune system with
behavior imply that any drug that changes these types of behavioral responses will also be
likely to have indirect effects on the immune system. The next two chapters extend the
discussion of these issues through an analysis of two different types of mental illness.
2. Pain thresholds have been determined by several methods, including the flinch/jump test,
the paw lick test, and the tail flick test.
3. Opium, an extract of the poppy plant, contains both morphine and codeine; heroin is a
synthetic analogue.
4. Specific receptors for opiate drugs have been found in the brain. These receptors
mediate
the effects of endorphins, a group of endogenous compounds that have morphine-like effects.
5. A large pituitary hormone, beta-lipotropin, contains the amino acid sequences for several
smaller peptides that are involved in the stress response.
6. There are four systems of pain reduction: opioid from neural and hormonal sources, and
nonopioid from neural and hormonal sources.
7. Exposure to inescapable pain or social defeat results in analgesia. In many cases,
depending on the precise environmental conditions, this analgesia can be blocked by opiate
blockers and is cross tolerant with morphine.
8. Placebo effects and acupuncture appear to be mediated by endorphins.
9. The immune system is involved with recognizing and defending the biological self.
10. The immune system has two major modes of responding: a humoral response that
involves
circulating immunoglobulins, and a cellular response that involves the proliferation of T-cells.
11. Relatively mild stressors, if not controllable by the individual, can lead to suppression of
the
immune system.
12. The failure of the immune system in various ways can increase the vulnerability to
diseases,
trigger allergies, or lead to autoimmune disorders.
13. Immune suppression can be assessed indirectly by measuring the susceptibility to tumor cells
or disease, or it can be measured directly by determining the amount of T-cell proliferation that
results from treating a blood sample with mitogens.
14. Both Parkinson's disease and some forms of diabetes seem to involve an autoimmune
response to one's own receptors-- a finding that may have important implications for a variety
of behavioral disorders.
15. Testosterone appears to inhibit growth of the left hemisphere and the thymus gland, a
finding which may account for the abnormally high incidence of autoimmune disorders in left
handed males.
16. The posterior portions of the hypothalamus and pituitary mediate the sympathetic arousal
response; the anterior portions mediate the overreaction of the parasympathetic system, the
release of endogenous opiates, and other responses that accompany the inability to cope.
Terms
ACTH