Wednesday, September 05, 2007
Those who know me as a scientist and are interested in picking a fight with me (especially religious people who might consider scientific knowledge at odds with their belief system) are sorely disappointed when they discover just how accepting I can be of their views. Although I have a great love of science and see it as an extremely useful method for acquiring knowledge, I do not follow it religiously or seek to force a scientific explanation for things which are fundamentally non-scientific.
Scientific truth is not absolute truth. Science concerns itself primarily with the empirical, and what we can test empirically often comes down to what we can observe, which can be limiting. A theory can reach beyond what is directly observable and still be considered scientific, but only because it best explains the observed phenomena.
As an example of something that is scientifically true but was not always thought to be so, consider the electromagnetic waves that lie in spectra beyond what we can see. These waves (like microwaves and radio) existed before technology advanced to the point where they were empirically demonstrable, but they were not considered a scientific reality until they entered the realm of empiricism. That does not mean electromagnetic waves outside the visible spectrum did not exist; they were just not a scientific reality at the time. From past experiences like this we can speculate that there are, today, many elements of our reality which are not scientifically verifiable but may be brought to scientific light in the future. I am also willing to suggest that there are things in our reality which are indeed true but may never be scientifically verifiable (now this is a stance that really gets me into trouble with my scientifically minded friends!).
Science, limited by our powers of observation, is limited in the truths it can divulge. Science, therefore, is not an arbiter of absolute truth. Rather, it is an elegant tool which enables us to make largely accurate predictions and to advance technologically. It is a practical method of inquiry, but it is not always the best method, and I refuse to dismiss all other methods outright.
So what then is truth and how can one determine reality? Truth, in my opinion, is dependent upon the lens through which you are viewing reality. There is no absolute truth, and there does not need to be. Instead, veracity must be examined in the context of the conjecture.
Take, for example, a discussion I had the other night regarding the possible past existence of dragons. From behind the lens of paleontology, dragons did not exist, as there is no adequate fossil evidence. From the view of physiology and developmental biology, the proposed bodily design of many dragons is considered impossible, and therefore they could not have existed. It is not until we consider the existence of dragons from behind the lens of anthropology and myth that they become real, as cultures across the world show depictions of dragons in some form or another in their written history. So did dragons exist? Yes and no. It depends entirely upon which set of philosophical spectacles you don.
Even more weighty questions involving such unearthly things as say, the meaning of life or the soul, are largely not only unexplainable by science, but not at all the concern of science. Enter religion, spirituality, and metaphysics. The fundamentally unobservable and untestable belong to their realm, and here I relinquish the driver's seat.
Thursday, August 30, 2007
In this town, where "organic" means natural and throw-backs from the 1960's hippie movement abound, I hear a lot of complaints about science and technology and how they have created unnecessary health concerns in the wake of progress. One common charge I hear, mixed in with talk about carcinogenic pan coatings and deleterious milk hormones, is that vaccination is both dangerous and ineffectual. In response, I thought it would be pertinent to report an opposing viewpoint, especially since many people seem to fear vaccination simply because they do not know what it is (or maybe they'd just take any excuse to avoid needles). So, here is what I understand thus far about vaccination and immunology in general:
The purpose of vaccination is to help the adaptive branch of the immune system remember specific pathogens so that it can respond to them more quickly and efficiently during subsequent exposures. Many people understand that once they have had a disease like chicken pox, they are unlikely to get it again (and certainly not with the same severity as before). The same idea is at play with vaccination, where an individual is exposed to an innocuous part of the pathogen (like viral coat proteins or bacterial cell wall) to create an immune system memory.
The immune system gains this memory for pathogens through clonal selection. B lymphocytes (a type of white blood cell) with the correct receptor for the antigen (a portion of the pathogen to which you are exposed) proliferate en masse to help fight off the disease. They also form memory B cells, which are left behind after infection to allow for an expeditious response to a repeat exposure.
The initial (primary) response of the adaptive immune branch takes about 14 days, while the secondary response takes only about 6 days and is far more pronounced in vaccinated individuals or in those who have already been exposed to the disease.
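The difference between the two responses can be sketched with a toy model. The only facts taken from the post are the rough lag times (about 14 days primary, about 6 days secondary) and that the secondary response is far stronger; the growth rates and starting levels below are invented purely for illustration.

```python
# Toy model of primary vs. secondary (memory) antibody responses.
# Parameters are illustrative, not measured values.

def antibody_level(day, lag, growth_rate, start_level):
    """Antibody titer: flat until the lag ends, then exponential expansion."""
    if day < lag:
        return start_level
    return start_level * (1 + growth_rate) ** (day - lag)

# Primary response: naive B cells, long lag, modest expansion.
primary = [antibody_level(d, lag=14, growth_rate=0.5, start_level=1)
           for d in range(22)]

# Secondary response: memory B cells respond sooner, expand faster,
# and start from a larger clone left behind by the first exposure.
secondary = [antibody_level(d, lag=6, growth_rate=0.9, start_level=10)
             for d in range(22)]

print(f"day 10: primary={primary[10]:.0f}, secondary={secondary[10]:.0f}")
print(f"day 21: primary={primary[21]:.0f}, secondary={secondary[21]:.0f}")
```

By day 10 the modeled memory response is already well underway while the naive response has not yet begun, which is the whole point of vaccination.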
It is a mistake to suggest that vaccines are a cure-all, though, as individuals who are immunocompromised may still not recover from a disease even if vaccinated. It is also a mistake to suggest that everyone needs immunization against every disease. Some individuals have immune systems so robust that the adaptive branch often does not even need to come into play; their innate immune responses, such as those carried out by phagocytes and natural antimicrobial agents in the tissue, are enough to destroy pathogens within days of infection. The ability to ward off disease quickly can be very important, though, if the disease is particularly virulent or the individual is more susceptible (which is why doctors recommend flu shots to the elderly more often than to young adults). It is under these circumstances, where the disease is highly detrimental and contagious and the population is susceptible, that vaccination becomes essential.
Granted, there are other methods of combating infectious disease, such as using serum to transmit humoral immunity (as many people use against tetanus), but with more rapidly spreading diseases like pertussis, serum stock would run out quickly in a sudden outbreak. Therefore, to prevent epidemics, people are encouraged to be immunized beforehand against diseases which are particularly virulent and fast-spreading. The entire population doesn't have to be immunized, either: just enough people to slow the spread of the disease to a manageable level, which is why so many people can get away with not being immunized and still remain healthy (and then suggest that immunization is useless because they themselves have never been immunized).
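The idea that only part of the population needs immunizing can be made concrete with the standard herd-immunity threshold from basic epidemiology (a textbook formula, not something from this post): if each case infects R0 others in a fully susceptible population, an outbreak stops growing once the immune fraction exceeds 1 - 1/R0. The R0 values below are rough textbook figures, used only for illustration.

```python
# Standard herd-immunity threshold: the outbreak stops growing once the
# immune fraction p satisfies R0 * (1 - p) < 1, i.e. p > 1 - 1/R0.

def herd_immunity_threshold(r0):
    """Minimum immune fraction needed to push the effective R below 1."""
    return 1 - 1 / r0

# Rough illustrative R0 values (not from the post):
for disease, r0 in [("influenza", 2), ("pertussis", 15)]:
    print(f"{disease}: R0={r0}, threshold={herd_immunity_threshold(r0):.0%}")
```

This also shows why highly contagious diseases like pertussis leave so little room for free riders: the more contagious the disease, the closer the required immune fraction gets to the whole population.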
Vaccination has gotten a bad rap in some cases because older vaccines may have contained viruses or bacteria that were not properly attenuated, and some older preparations contained controversial compounds like thimerosal, a mercury-containing preservative that some suspected of being correlated with autism. Today, vaccine preparation has vastly improved: recombinant proteins are often used to initiate the immune response, and thimerosal has been removed from routine childhood vaccines.
Below is an account of the reduction in annual cases of a variety of diseases which can be credited to vaccination (records from 2004):
Variola (smallpox): 48,164/yr reduced to 0/yr, 100% reduction
Diphtheria: 175,885/yr reduced to 0/yr, 100% reduction
Paralytic polio: 16,316/yr reduced to 0/yr, 100% reduction
Rubeola (Measles): 503,282/yr reduced to 37/yr, 99.99% reduction
Rubella (German Measles): 47,745/yr reduced to 12/yr, 99.97% reduction
Epidemic parotitis (Mumps): 152,209/yr reduced to 236/yr, 99.85% reduction
Invasive Haemophilus influenzae: 20,000/yr reduced to 172/yr, 99.14% reduction
Tetanus (Lockjaw): 1,314 (deaths)/yr reduced to 26 (cases)/yr, 98.02% reduction
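The percentages above follow directly from the before/after counts, and a few lines of arithmetic reproduce them (counts copied from the table as given, small rounding differences aside):

```python
# Percent reduction in annual cases, recomputed from the table's counts
# (pre-vaccine cases per year, 2004 cases per year).
cases = {
    "smallpox": (48_164, 0),
    "measles": (503_282, 37),
    "mumps": (152_209, 236),
    "tetanus": (1_314, 26),
}

reductions = {
    disease: 100 * (before - after) / before
    for disease, (before, after) in cases.items()
}

for disease, pct in reductions.items():
    print(f"{disease}: {pct:.2f}% reduction")
```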
Again, it is important to remember that vaccination is not a cure-all and is not practical against some diseases. Diseases which lie dormant or mutate quickly pose a particular challenge to immunization. Claims that immunization has been wholly fruitless, though, are unjustified. Vaccination has eradicated some of the most deadly and crippling diseases on the planet, like smallpox and polio. We certainly owe a lot to this incredible technology.
Kindt, Goldsby, and Osborne (2007). Immunology.
Wednesday, August 29, 2007
Pit bulls (American Staffordshire Terriers) have caught the media's attention recently after the disbandment of a pit bull fighting ring in which a sports star was involved. The media's eye continues to focus on this breed, especially with regard to dog bites and maulings.
With all this attention, it's no surprise that people are beginning to discuss banning the breed in a number of municipalities. Kansas City recently followed suit after Denver and Ontario banned pit bulls, and many other cities are considering banning their ownership as well (Fox News, Balko, Sept 25 2006).
In my opinion, pit bulls are a greatly misunderstood breed. Although a relatively recent breed, the American Staffordshire Terrier was derived from incredibly old domesticated lineages. Unlike Dingos, Basenjis, or German Shepherd dogs, which may have a feral aspect to their genetic lines from breeding with wild varieties, pit bulls have a long history and working relationship with humans. While it is true that American Staffordshire Terriers were bred to be fighting dogs, they were not bred to be attack dogs or in any way hostile toward humans. Rather, they were chosen for the pits because of their compactness and efficiency and were taught to be aggressive toward other dogs, not man. Traditionally, the opposing owner was allowed to wash the competing pit bull before a match (to ensure that no drugs or powders were present on the fur). If that dog was aggressive toward the opponent owner in any way, the game was called off, and if it attacked the opponent owner it was put down (How to Speak Dog, Coren, 2000). Obedience and biddability are vital traits in powerful breeds which must be handled even in the middle of a dog fight. These dogs had to therefore be very docile and loyal toward humans while at the same time champions in the ring (Animal Planet, Dog Breed Profiles, 2007).
Later breedings in the 1940s focused on bringing forth these more loving characteristics as pit bulls became more popular as family pets. Pit bulls were never meant to be aggressive toward man, and even their aggressive nature toward other dogs can be remedied with proper socialization from birth. Dogs that are reared well almost always behave properly, regardless of their lineage (Before and After Getting Your Puppy, Dunbar, 2004).
It is my suspicion that the reasons for pit bull attacks have a lot more to do with human psychology than with genetic factors. Many people look at pit bulls as icons of masculinity, as the classic city "tough guy" dog. For this reason, they might be brought up to be nasty, trained to be aggressive, or simply neglected and treated as the accessory that too many people see them as.
From what I've seen of the American Staffordshire Terrier, both at the local shelter and in my own home (my mutt is predominantly pit bull and heeler), I'd say this breed has great potential, and it would be a tragedy to discontinue its heritage. Breeders who focus on the strengths of the breed abound, and I feel it has a great future as one of America's favorite family breeds.
Here is an interesting clip from CNN.com about the subject (although be patient, you must sit through a commercial first):
Monday, August 27, 2007
Looks like I'm staying up tonight!
Tonight/tomorrow morning there will be a total lunar eclipse around 3:30AM. It should be fully visible to us here in the Western USA.
More info here.
Monday, May 14, 2007
The evolution of dog and man has been reciprocal and intertwined. The first evidence of these canid beasts following and mingling with our hominid ancestors is 500,000 years old (Ostrander, et al., 2006). Dogs gave us an edge over other hominid species like the Neanderthal whose fossils do not show interaction with any canine species. Dogs may have played several important roles in aiding human survival, including providing early warning of intruders and assisting in the tracking of prey. It is speculated that the utilization of the superior scent capabilities of the dog by our ancestors left the olfactory system of early man free of scenting duties. Our nasal passages, no longer needed as much for smelling, could be shortened from muzzles to flat faces with rounded, fat tongues. This particular face shape allowed for greater variation in sound production, improving the speech capabilities of our ancestors. Therefore, our ability to communicate as well as we do verbally may, in part, be thanks to dogs and their contribution as trackers (Coren, 2000). This hypothesis may be a bit tenuous, but it is corroborated by the fact that Neanderthal and other primitive hominids dedicated more of their facial structure to the nasal passages.
140,000-year-old fossil evidence reveals dog skeletons which display the morphological characteristics of breeding for domestication (Ostrander, et al., 2006). This is long before the first crops were sown, before the first domestic horse was ridden, and well before the first cat snuggled in its master's lap. The first breedings were perhaps not fully intentional. The dogs which humans preferred received more food from a kill than the other dogs and were therefore more likely to survive. Dogs which attacked men or were aggressive were likely run off or killed. Friendliness toward man became a trait that better assured the selective survival and reproduction of these animals.
Much of the characteristic look of the domestic dog can be attributed to the selection processes of ancient man. The appearance of domestic dogs, including curled tails, splotchy coat patterns, shortened muzzles, floppy ears, and domed skulls, arose well before dog fancying and appearance-based selection became popular. These looks came to be not so much because early man found them attractive, but because the look was linked to behavior.
Consider, for a moment, a wolf pup:
Its head is domed, its muzzle is short, and its ears are floppy at first. As the pup ages, the cell signaling and gene expression that bring the ears upright and elongate the muzzle and skull also bring about other adult characteristics like aggression and dominance. What better way for early man to control dogs than to select those which maintained puppy-like submission and amiability? This natural phenomenon is called pedomorphosis, the retention of juvenile traits by adults (Trut, 1999). The puppyish look just happens to come with the temperament and behavior.
A famous Russian experiment in fox domestication demonstrates the link between morphology and domestication excellently. The experiment began over forty years ago with a wild colony of 130 foxes. Belyaev and a team of Russian scientists began to select foxes solely upon objective behavioral characteristics. They noted the changes in these foxes from generation to generation; both their morphology and temperament were recorded. After 30-35 generations of selection, the results were fascinating. The pups displayed a delayed development of the fear response as well as morphological characteristics more akin to domestic dogs, such as skeletal differences, coat patterns, curled tails, and, in some cases, floppy ears. Much of this may be linked to changes in plasma levels of corticosteroids, hormones concerned with an animal's adaptation to stress (Trut, 1999; WGBH, 2004).
As human civilization advanced, dogs became useful in many lines of work. Novel breeds and varieties of dog were created to suit the changing needs of man. It is no surprise when a retriever, bred to retrieve shot ducks out of the lake, wants to take a dive in your pool, or when a sheltie, bred to herd sheep, is excitedly circling the children in the yard. It is not just the entertaining quirks of dogs which have been encouraged through breeding, though, but some of the most irritating characteristics as well. Terriers, for example, are small dogs bred to hunt vermin. Some had the job of digging underground to find the rats or prairie dogs they hunted. The dog would often get stuck in the tunnel and would bark vigorously to inform its master of its whereabouts and the location of the prey (Coren, 2000). The master would then dig up the dog and the vermin. Although now considered irritating, the constant digging and excited barking of terriers was at one time considered useful and selected for.
The genetics of dogs are as diverse as the tasks for which they were bred. Single nucleotide polymorphisms (SNPs) are the small genetic changes which account for much of the intraspecies variation we see in nature. We have sequenced nine breeds of dog so far and have found 2.6 million of these SNPs and counting! This incredible genetic diversity though is only seen between breeds. Within breeds, genetic erosion and the exposure of recessive alleles is a huge problem (Ostrander, et al., 2006).
Several factors make purebreds more susceptible to genetic disease. One is the founder effect--a few dogs often supply the genetic stock for entire populations, especially if the breed has been brought overseas. The most influential force acting to decrease the genetic diversity within breeds, though, is undoubtedly man. In an attempt to maintain breed purity and select for desired characteristics, people have often overlooked the genetic health of the animal. A certain desired quality may be linked to a disease allele due to its location on a chromosome, or it may be inseparable from the phenotype, such as the breathing problems of bulldogs, which are due to their characteristic look. As another example, in Dalmatians the incidence of deafness is about 30% (50% in blue-eyed phenotypes). There is a lower incidence of deafness in Dalmatians showing splotchy ear patches, due to the close linkage of the two traits on the dog chromosome, but because this characteristic is not seen as favorable in most kennel clubs, dogs displaying it are not selected for breeding. Even more alarming, all Dalmatians are homozygous recessive for a defective autosomal allele which affects purine metabolism, leading to hyperuricosuria (which causes kidney stones). This gene is tightly linked to the characteristic coat spotting pattern of the Dalmatians and so has not been eliminated.
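The Dalmatian example is a case of genetic hitchhiking: selecting on a visible trait drags along a tightly linked disease allele. A toy haplotype-frequency model (all starting frequencies, recombination rates, and selection strengths invented purely for illustration) shows the mechanism:

```python
# Toy deterministic model of genetic hitchhiking. Haplotypes pair a coat
# allele (S = spotted, s = plain) with a disease allele (D = defective,
# d = normal). All numbers here are invented; only the mechanism matters.

def simulate(r, generations=50, selection=0.5):
    # Start with the spotted allele tightly associated with the disease
    # allele (strong linkage disequilibrium).
    freq = {"SD": 0.4, "Sd": 0.01, "sD": 0.01, "sd": 0.58}
    for _ in range(generations):
        # Recombination at rate r shuffles alleles between haplotypes.
        ld = freq["SD"] * freq["sd"] - freq["Sd"] * freq["sD"]
        freq["SD"] -= r * ld
        freq["sd"] -= r * ld
        freq["Sd"] += r * ld
        freq["sD"] += r * ld
        # Breeders favor spotted coats: haplotypes carrying S gain weight.
        w = {h: (1 + selection if h[0] == "S" else 1.0) for h in freq}
        total = sum(freq[h] * w[h] for h in freq)
        freq = {h: freq[h] * w[h] / total for h in freq}
    return freq["SD"] + freq["sD"]  # final frequency of the disease allele

tight = simulate(r=0.001)  # tight linkage: disease allele hitchhikes along
loose = simulate(r=0.5)    # free recombination: selection can separate them
print(f"disease allele frequency: tight linkage {tight:.2f}, loose {loose:.2f}")
```

Under tight linkage the disease allele rides selection on the coat gene to near fixation, while free recombination lets the two traits decouple--the same logic behind the Dalmatian breeding problem described above.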
Even those who try to decrease the genetic erosion of purebreds face much opposition. In the 1970s, an independent breeder, Dr. Robert Schaible, made efforts to reduce homozygosity of the hyperuricosuria disease allele by breeding a pointer into his Dalmatian gene pool. Even the 5th-generation pups, 97% Dalmatian and 3% pointer, are not accepted as purebreds by many kennel clubs and breeding associations, as the tight linkage of the genes still produces a coat that does not look "quite right."
Here are some other genetic disorders that are common to purebreds:
Vertebral degenerative joint disease
It is not just breeders and kennel clubs that create problems for dogs, though. People often select their new family puppy based upon an appearance that they find appealing. I've been working at the local shelter for a while now, and the most common breeds I see coming in are Labrador, husky, pit bull, and border collie mixes. It is likely that the former owners of these dogs took them home without researching the breed first. They might have purchased a husky imagining the rugged appearance it would give them, only to later realize that the dog requires a rugged lifestyle as well and does not find channel surfing an adequate exercise. Maybe they thought border collies were cute and trim but didn't understand that this energetic, intelligent breed often requires a lot of space and a job; otherwise it may find other ways to occupy its active mind and body--such as tearing the couch to shreds. Others may have purchased what they hoped would be a fearsome guard dog pit bull, only to realize they didn't have time to train it properly.
The roles of dogs have changed, yet our attitude about breeds has not followed. Someday, maybe an unusually clever dog breeder will realize that what we really need is the ultimate couch potato dog, a breed that enjoys TV and snacks and the occasional game of fetch, rather than one bred for hunting, herding, or looks. Until then though, it is likely that dogs will continue to pour into our already crowded shelters. So do a dog a favor and research before you buy. Understand that every dog requires daily exercise, time, and dedication.
Ostrander, E., Giger, U., Lindblad-Toh, K., and contributors (2006). The Dog and Its Genome. Cold Spring Harbor Laboratory Press. xvi, 31-53, 81, 200-212, 439-451.
Trut, L.N. (1999). Early canid domestication: The farm-fox experiment. American Scientist, 87, 169.
WGBH (2004). Dogs and More Dogs: The True Story of Man's Best Friend. DVD. Boston, MA.
Thursday, April 05, 2007
My understanding of words is strictly phonological. I do not visualize words in my head well; rather, the image that pops up is of the object, or sometimes a color I associate with a more abstract word (e.g., I read or hear "car" and immediately visualize the object, not the letters of the word; for the more abstract word "philosophy," I just see yellow). This method is so ingrained that my efforts to visualize words have not come very far. If a word is not spelled phonetically (I do fine with phonetic words), then much of my memory for how to spell it comes from the muscle memory of typing or writing. If the word is short enough and I make the effort to see the word rather than the object, then I can visualize it, though this is not my natural inclination.
I believe that my highly phonetics-based understanding of the written word also accounts for my inability to speed read. I have tried speed reading, but it relies on a visual understanding of words. For me, the visual stimulus (written words) must first be translated into a phonological mental representation (in other words, me talking to myself in my head) before I can access the meaning of the word. This process happens in a split second, so I can still read quickly, but only as fast as I can talk (as I am talking to myself in my mind as I read). I will not understand a word if I only glance at it and do not say it to myself. The meaning is only accessed once the phonetic representation is produced. Of course, it is such an automatic response that I usually cannot force myself to glance at a word without mentally saying it. I do miss some road signs, though, because of this phenomenon. A friend will ask as we drive, "Well, didn't you see that big green sign a few miles back?" and I will recall having seen the sign, but not what was written, as I did not bother to translate the material. Good speed readers, on the other hand, can access the meaning of words visually. The need for translation into a phonetic mental representation does not exist, so the speed of reading is only hindered by the speed of vision and understanding.
This whole talking-in-my-head business has its benefits. For one, e-mails, letters, and books seem more conversational. Each writer gets their own voice as they speak to me in my mind. Plus, this makes me a better writer, as everything I write is being said in my head at the same time. The writing then comes out very smoothly, as if I were just talking (because that is essentially what I am doing in my mind as I type).
I wonder how many other people approach reading in this fashion and which is more common.
Sunday, April 01, 2007
In grade school, I recall that the idea of being "left brained" or "right brained" was quite popular. Teachers would give us tests of our drawing or math ability to determine which half of our brain was "dominant." These days, the right is generally considered the creative and emotive half while the left is analytical and logical. These generalizations are somewhat outdated. As you will read, history has continually tried to assign concrete roles to the versatile, complex hemispheres of the brain. After talking a bit about the beginnings of our knowledge of hemispheric lateralization, I will go into some modern studies and theories about the two halves of the brain. This is a long one, so you might want to set some time aside or break it down into parts if you're reading on the fly. Enjoy!
A brief history of lateralization theories
Since the 1800's neurologists have been fascinated by the division of labor between the right and left hemispheres of the cerebral cortex. The specializations of each hemisphere have been repeatedly rewritten as new case studies and empirical data enter our understanding of cerebral lateralization.
It was in 1861 that the publication of Broca's breakthrough case study of a man with verbal expressive aphasia revealed the left hemisphere's dominance for language. The unfortunate gentleman had lost the ability to speak, although he could still make noises and understand language. Autopsy revealed that the man had profound atrophy of the posterior portion of the second and third frontal gyri of his left hemisphere, now known as Broca's area. This area has been repeatedly confirmed as a key site for the motor functions necessary for speech production.
After Broca's breakthrough discovery, reports pooled in about the various disorders found in patients with left hemisphere lesions, including Wernicke's identification of receptive aphasia in 1874, Exner's identification of agraphia in 1881, and Lissauer's findings on agnosia in 1890 (Cutting, 9-12).
The left hemisphere continued to be viewed as the "verbal hemisphere" into the 1950's. As added support for the theory of left hemispheric verbal dominance, Milner's 1958 examination of epileptic patients found that those undergoing a left temporal lobectomy suffered more significantly in the recall of verbally presented stories than those with removal of the right. The right was eventually deemed the "spatial hemisphere," Milner's study of epileptic patients with lateralized temporal lobectomies being a large contributor to this theory. It seemed that those with the right temporal lobe removed had difficulty identifying the scene in a picture but less difficulty with verbal recall than those who had left temporal lobectomies.
The 1960's brought new ideas regarding the distribution of labor between the hemispheres. Throughout the early 1900's it was common knowledge among physicians that the right hemisphere seemed to have a link with emotion, but the idea was not popularized until the 1960's. Again, temporal lobectomy patients were the main influence for these generalizations. It seemed that those without the left temporal lobe tended to suffer depression, while those lacking the right leaned toward euphoria or apathy. A tendency for patients with left hemisphere damage to suffer depression is still seen today (Cutting 24-26).
In the 1970's the verbal/nonverbal dichotomy lost much of its momentum as research began to show the important role of the right hemisphere in understanding metaphor and emotion in speech as well as its contribution to the fluency and tonality of language production. Instead, a new dichotomy began to form in which the right hemisphere was seen as the "Gestalt" processor while the left was considered detail-oriented. This new theory mainly arose from studies of split-brain patients--people who had had their corpus callosum (the connecting fibers between hemispheres) severed to control epilepsy. These patients were better able to reconstruct whole pictures (such as the circular face of a clock) with their left hand and visual field (controlled by the right hemisphere) and more able to focus on parts (such as the numbers on a clock) when working with their right hand and right visual field (controlled by the left hemisphere) (Cutting 21-23).
Today, the idea of hemispheric lateralization is so popular that terms like "left brained" and "right brained" are used (and often misused) colloquially. Although one cannot be literally left brained or right brained without severe damage to or removal of a hemisphere, the terms are informally interpreted as how creative or analytical a person is. The right hemisphere is considered creative in the popular view, while the left hemisphere is known as the analytic half. There are, of course, faults to all of these extreme generalizations that have been made through the ages. Cutting notes the dilemma with this interpretation in his text, The Right Cerebral Hemisphere and Psychiatric Disorders:
"To call the left hemisphere analytic is heuristically unhelpful. It only provokes the questions--analytic of what? and how does it analyze?" -Cutting, 77
Sound and the hemispheres
To examine the known functions of the left and right cerebral hemispheres to date, let us begin with one of the first and most well known lateralizations: the distribution of verbal skills between the left and right hemispheres. Typically, the left hemisphere plays the chief role in speech production, as it is the site of the Broca's and Wernicke's areas, but in a few people both hemispheres play some role in speech production, and in some left-handed people the right hemisphere is dominant (containing both the Broca's and Wernicke's areas). For about ninety percent of the population, though, the left hemisphere is dominant for speech (Cohen 32-33).
The idea of the left hemisphere as the verbal hemisphere arises from the many speech disorders that can be seen when portions of the left hemisphere are diseased or ablated. Verbal expressive aphasia (also known as Broca's aphasia), as mentioned earlier, is seen in patients with damage to the Broca's area. The disorder results in laborious, grammatically incorrect speech. The symptoms include difficulty with function words, such as "the," "a," "in," or "on," and slow, trying speech production. Patients are still able to produce content words, such as nouns, verbs, and adjectives. Because Broca's area is part of the secondary motor cortex of the frontal lobe, the difficulties that result from Broca's aphasia are thought to arise from an inability to fall back upon the motor memories that the lips, tongue, and mouth rely upon for forming fluent speech.
Wernicke's aphasia is another startling condition that can arise from damage to the left hemisphere. Located in the auditory cortex of the left temporal lobe, the Wernicke's area and surrounding tissues are responsible for speech comprehension and the conversion of thoughts into meaningful speech. It is directly connected to the Broca's area by a series of axons, creating a smooth connection between speech comprehension and production in unaffected people. When the Wernicke's area is damaged, fluent speech can still be produced due to the motor memory of the Broca's area, but content words are often replaced with nonsense (Carlson 482-495).
When the language areas of the left hemisphere are severely damaged, patients are sometimes able to recover lost verbal abilities in the corresponding areas of the right hemisphere. Mainly younger patients, especially those below the age of three, are able to make full recoveries (Habedank B, Haupt WF., Heiss WD., Herholz K., Kessler J., Thiel A, Winhuisen L).
The right hemisphere, too, plays a role in speech comprehension and production. It is essential for recognizing the tone and emotional expression in a speaker's voice (Sander and Scheich). Many case studies show that lesions of the right cerebral hemisphere result in a decreased ability to detect the emotion in another's words. Prosody, the rhythm, emphasis, and melody of speech, also depends heavily on the right hemisphere (Carlson 499-501, Young 53-54). Although speech can still be understood by those lacking the Broca's and Wernicke's areas of the left hemisphere, the right hemisphere cannot produce speech on its own in most people.
Even in the absence of actual language, auditory processing in the hemispheres shows a fundamental lateralization that depends on the type of auditory stimulus. According to fMRI studies, the brain responds to increased temporal variation in sound with greater activity in the left Heschl's gyrus, while increased spectral variation produces more activity in the right superior temporal gyrus and right posterior superior temporal sulcus (Bishop et al.).
Functional MRI studies of subjects listening to music also show lateralization. The right hemisphere seems to be more involved than the left in interpreting music; again, we see the right hemisphere playing the larger role in deciphering tone and rhythm. Professional musicians, on the other hand, show a stronger left-hemispheric lateralization when listening to music. It is theorized that this may be due to their analytic interpretation of the music: they already have an orderly categorical system by which to interpret the sound, whereas novices do not (Wang, Huang, Chen, and Chung 187).
Visual perception in the hemispheres
It is especially difficult to define the roles of the hemispheres in visual-spatial representation because terminology and definitions for the different types of visual-spatial abilities are lacking. Although it is fairly clear that the hemispheres process visual-spatial information differently, exactly where the line is drawn, which hemisphere does what, has proven very difficult to pin down.
Beginning with the basics, there does not seem to be a large difference in the ability of either hemisphere to detect a visual stimulus in its visual field. Several studies in the sixties and seventies, including Filbey and Gazzaniga in 1960, Bryden in 1976, and Jeeves and Dixon in 1970, reported that patients with either left or right hemisphere damage had no difficulty detecting light dots or other visual stimuli in their intact visual fields. A hemispheric difference in visual perception first appears when patients must judge the orientation of a seen object: multiple studies show that the right hemisphere is superior at distinguishing a line's orientation relative to the horizontal or vertical axis (Young 8-12).
Once we move on from presenting patients with basic shapes to entire pictures, new distinctions between the hemispheres can be identified. Visual-object agnosia, caused by lesions of the left parieto-occipital region, results in an inability to identify the category to which an object belongs. Patients are still able to identify isolated elements and whole scenes, indicating that the left hemisphere is essential for the categorization of objects (Cutting 13-14, 76).
Again, much research seemed to point to the left hemisphere identifying parts while the right perceived objects as wholes. The idea that the left hemisphere is necessary for perceiving parts creates a dilemma: how deeply can we divide an object or scene into wholes and parts? I could say the forest is the whole and the trees the parts, or the tree is the whole and the branches the parts, or the branches the whole and the cells the parts. At what point does the left hemisphere become responsible for identifying the pieces and the right for putting the whole together? The theory that the left perceives parts while the right perceives wholes seems too general. A better description of the left hemisphere's role might be that it provides the formal relationships between objects as they fall into categories and subcategories (e.g., that a Dodge Spirit falls into the "car" category), while the right hemisphere considers how these representations reflect the order of the actual world.
An interesting theory of how the left and right hemispheres cooperate in object recognition comes from Warrington and Taylor's 1978 experiments. They proposed that the first stage of object recognition is perceptual categorization by physical nature, in which the right hemisphere collates all possible spatial configurations of an object so that it can be perceived from any visual angle. Next, the left hemisphere adds a semantic category to the object (again supporting the idea that the left is the hemisphere of language). Support for this theory comes from studies in which patients with right hemisphere lesions consistently have difficulty matching identical objects presented from odd angles (Cutting 75-76).
Along these lines, a later theory by Kosslyn in the late 1980s suggested that the left and right hemispheres work independently rather than in the serial fashion proposed by Warrington. The right, he suggested, encodes shapes from multiple perspectives, while the left funnels a range of shapes into a single, symbolic representation that can be easily named (Cutting 77-80).
In addition to its superiority in recognizing three-dimensional objects presented at odd angles, the right hemisphere also seems to be dominant in recognizing facial expressions. This could be considered part of its overall excellence in Gestalt reasoning, but it might also be attributed to the right hemisphere's reputation as the "emotional hemisphere" (Cutting 26).
In patients with extensive damage to either the left or right hemisphere, paralysis of the opposite side of the body may result. Interestingly, a condition known as unilateral neglect is typically seen only in those with lesions of the right cerebral hemisphere and corresponding paralysis of the left side of the body (Cutting 36). Patients with unilateral neglect completely ignore their entire left visual field, as if it did not exist. For example, they may draw a clock with numbers only on the right side, or, when asked to mark the middle of a line, place their mark far to the right of it.
As described in Oliver Sacks's case studies, patients are sometimes able to understand their situation on an intellectual level yet still have difficulty resolving the problem. One of Sacks's patients, Mrs. S, suffered a massive stroke affecting the posterior regions of her right cerebral hemisphere. The result was unilateral neglect of her left visual field. After having her situation explained to her, she was able to work around the obstacle by turning almost full circle to the right to recover the missing half of her meals or papers, although she was still unable to grasp the concept of turning left, as left simply did not exist for her (Sacks 77-79).
Another, more extreme symptom of a lesion to the right hemisphere can be full-blown denial. Patients with denial not only neglect their left visual field but also deny that they are paralyzed. In his case studies, V.S. Ramachandran describes patients who would go out of their way to deny their paralysis. One patient, asked of her paralyzed left hand, "Whose arm is this?", responded that it must be her brother's, not her own, because it appeared to be rather hairy (Ramachandran 127-133).
When damage spreads as far as the ventromedial frontal lobe of the right hemisphere, denial often expands beyond the patient's paralysis and body image into other domains as well. Patients may deny any out-of-the-ordinary behavior or situation, from eating candy to the existence of a diagnosed brain tumor, whereas no such problems were noted in patients with corresponding lesions in the left hemisphere (Ramachandran 142-143).
Tying it all together
A fascinating theory of lateralization that takes much of the research to date into account comes from V.S. Ramachandran's book Phantoms in the Brain. The function of the left hemisphere, Ramachandran writes, is primarily the preservation of stability. This may be why we see denial and neglect in patients with right-hemisphere lesions but not in those with left-hemisphere damage: the intact left hemisphere strives to maintain the status quo. It also explains the euphoric or apathetic demeanor of those with right hemisphere damage; if they cannot perceive that something is wrong, they do not worry about the problem. The right hemisphere, on the other hand, is charged with assimilating novel information. Again, we see that it must take a wholesale approach instead of sorting items into preexisting schemes. Take visual perception again: the right hemisphere must be able to recognize an object from a new angle, taking in the novel information, and the left hemisphere is then able to categorize that information, sorting it into the filing system of the mind. As Ramachandran jokingly suggests,
"The right hemisphere is a left-wing revolutionary generating paradigm shifts, whereas the left hemisphere is a die hard conservative that clings to the status quo" (Ramachandran 136).
In conclusion, not all is yet known about the distribution of labor between the right and left cerebral hemispheres. In the majority of the population, the left hemisphere appears to be a place for systematic categorization, handling the parts of language and music that can be organized and the things in life that can be readily cataloged. The right hemisphere is more elusive, dealing in emotions and abstractions; it helps us take in novel information that we do not yet have a name for.
Bishop, D.V., H.L. Jamison, P.M. Matthews, and K.E. Watkins. "Hemispheric Specialization for Processing Auditory Nonspeech Stimuli." Cerebral Cortex 10 (2005): 1093.
Carlson, Neil R. Physiology of Behavior, Eighth Edition. New York: Pearson Education, Inc. 2004.
Cohen, David. The Secret Language of the Mind. San Francisco: Chronicle Books, 1996.
Cutting, John. The Right Cerebral Hemisphere and Psychiatric Disorders. New York: Oxford Press, 1990.
Habedank, B., W.F. Haupt, W.D. Heiss, K. Herholz, J. Kessler, A. Thiel, and L. Winhuisen. "From the left to the right: How the brain compensates progressive loss of language function." Brain and Language (2006).
Ramachandran, V.S., and Sandra Blakeslee. Phantoms in the Brain. New York: William Morrow and Company, Inc., 1998.
Sacks, Oliver. The Man Who Mistook His Wife for a Hat and Other Clinical Tales. New York: Touchstone, 1998.
Sander, K., and H. Scheich. "Left auditory cortex and amygdala, but right insula dominance for human laughing and crying." Journal of Cognitive Neuroscience 17.10 (2005): 1519.
Wang, Huang, Chen, and Chung. "Monitoring Music Processing of Harmonic Chords Using fMRI: Comparison Between Professional Musicians and Amateurs." International Society for Magnetic Resonance in Medicine (2006): 187.
Young, Andrew W. Functions of the Right Cerebral Hemisphere. London: Academic Press, 1983.