Interesting Science Articles

Location: Duesseldorf, Germany

I'm an American opera singer, and I've been living in Germany for 21 years now. I love visiting my sister and brother in Rhode Island.

Saturday, March 14, 2009

Blog Site Recommended by the NYTimes

March 13, 2009, 7:30 pm

Tumblr Makes Blogging Blissfully Easy
By Sam Grobart and Paul Boutin
Have you always wanted to blog, but never found the time to set up a site? Stop reading me and click through to Tumblr, a free blogging site that makes it effortless not only to type in text, but to share photos, links, music, and videos. There’s even an instant-post button to include quotes from other blogs.

Setting up a Tumblr blog takes about 35 seconds. I timed it. You hit the site, click on “Sign up,” then type in your email address, a password, and a name for your blog. Click once again and you’re up and running.

Besides being easy to use, Tumblr has a very eye-pleasing layout with a minimalist Web-2.0 look. You can customize it, but why bother? When you log in to add content, a row of giant buttons atop the page gives you one-click access to simple tools to insert text, a photo, a quote, a link, a chat session, an audio clip, or a video clip.

Much like Facebook or Twitter, Tumblr handles all the formatting for each type of content automatically. The less you think about it, the better Tumblr works.

Sunday, June 29, 2008

The Itch

Annals of Medicine
The Itch
Its mysterious power may be a clue to a new theory about brains and bodies.
by Atul Gawande
June 30, 2008

Scientists once saw itching as a form of pain. They now believe it to be a different order of sensation. Photograph by Gerald Slota.

It was still shocking to M. how much a few wrong turns could change your life. She had graduated from Boston College with a degree in psychology, married at twenty-five, and had two children, a son and a daughter. She and her family settled in a town on Massachusetts’ southern shore. She worked for thirteen years in health care, becoming the director of a residence program for men who’d suffered severe head injuries. But she and her husband began fighting. There were betrayals. By the time she was thirty-two, her marriage had disintegrated. In the divorce, she lost possession of their home, and, amid her financial and psychological struggles, she saw that she was losing her children, too. Within a few years, she was drinking. She began dating someone, and they drank together. After a while, he brought some drugs home, and she tried them. The drugs got harder. Eventually, they were doing heroin, which turned out to be readily available from a street dealer a block away from her apartment.

One day, she went to see a doctor because she wasn’t feeling well, and learned that she had contracted H.I.V. from a contaminated needle. She had to leave her job. She lost visiting rights with her children. And she developed complications from the H.I.V., including shingles, which caused painful, blistering sores across her scalp and forehead. With treatment, though, her H.I.V. was brought under control. At thirty-six, she entered rehab, dropped the boyfriend, and kicked the drugs. She had two good, quiet years in which she began rebuilding her life. Then she got the itch.

It was right after a shingles episode. The blisters and the pain responded, as they usually did, to acyclovir, an antiviral medication. But this time the area of the scalp that was involved became numb, and the pain was replaced by a constant, relentless itch. She felt it mainly on the right side of her head. It crawled along her scalp, and no matter how much she scratched it would not go away. “I felt like my inner self, like my brain itself, was itching,” she says. And it took over her life just as she was starting to get it back.

Her internist didn’t know what to make of the problem. Itching is an extraordinarily common symptom. All kinds of dermatological conditions can cause it: allergic reactions, bacterial or fungal infections, skin cancer, psoriasis, dandruff, scabies, lice, poison ivy, sun damage, or just dry skin. Creams and makeup can cause itch, too. But M. used ordinary shampoo and soap, no creams. And when the doctor examined M.’s scalp she discovered nothing abnormal—no rash, no redness, no scaling, no thickening, no fungus, no parasites. All she saw was scratch marks.

The internist prescribed a medicated cream, but it didn’t help. The urge to scratch was unceasing and irresistible. “I would try to control it during the day, when I was aware of the itch, but it was really hard,” M. said. “At night, it was the worst. I guess I would scratch when I was asleep, because in the morning there would be blood on my pillowcase.” She began to lose her hair over the itchy area. She returned to her internist again and again. “I just kept haunting her and calling her,” M. said. But nothing the internist tried worked, and she began to suspect that the itch had nothing to do with M.’s skin.


Plenty of non-skin conditions can cause itching. Dr. Jeffrey Bernhard, a dermatologist with the University of Massachusetts Medical School, is among the few doctors to study itching systematically (he published the definitive textbook on the subject), and he told me of cases caused by hyperthyroidism, iron deficiency, liver disease, and cancers like Hodgkin’s lymphoma. Sometimes the syndrome is very specific. Persistent outer-arm itching that worsens in sunlight is known as brachioradial pruritus, and it’s caused by a crimped nerve in the neck. Aquagenic pruritus is recurrent, intense, diffuse itching upon getting out of a bath or shower, and although no one knows the mechanism, it’s a symptom of polycythemia vera, a rare condition in which the body produces too many red blood cells.

But M.’s itch was confined to the right side of her scalp. Her viral count showed that the H.I.V. was quiescent. Additional blood tests and X-rays were normal. So the internist concluded that M.’s problem was probably psychiatric. All sorts of psychiatric conditions can cause itching. Patients with psychosis can have cutaneous delusions—a belief that their skin is infested with, say, parasites, or crawling ants, or laced with tiny bits of fibreglass. Severe stress and other emotional experiences can also give rise to a physical symptom like itching—whether from the body’s release of endorphins (natural opioids, which, like morphine, can cause itching), increased skin temperature, nervous scratching, or increased sweating. In M.’s case, the internist suspected trichotillomania, an obsessive-compulsive disorder in which patients have an irresistible urge to pull out their hair.

M. was willing to consider such possibilities. Her life had been a mess, after all. But the antidepressant medications often prescribed for O.C.D. made no difference. And she didn’t actually feel a compulsion to pull out her hair. She simply felt itchy, on the area of her scalp that was left numb from the shingles. Although she could sometimes distract herself from it—by watching television or talking with a friend—the itch did not fluctuate with her mood or level of stress. The only thing that came close to offering relief was to scratch.

“Scratching is one of the sweetest gratifications of nature, and as ready at hand as any,” Montaigne wrote. “But repentance follows too annoyingly close at its heels.” For M., certainly, it did: the itching was so torturous, and the area so numb, that her scratching began to go through the skin. At a later office visit, her doctor found a silver-dollar-size patch of scalp where skin had been replaced by scab. M. tried bandaging her head, wearing caps to bed. But her fingernails would always find a way to her flesh, especially while she slept.

One morning, after she was awakened by her bedside alarm, she sat up and, she recalled, “this fluid came down my face, this greenish liquid.” She pressed a square of gauze to her head and went to see her doctor again. M. showed the doctor the fluid on the dressing. The doctor looked closely at the wound. She shined a light on it and in M.’s eyes. Then she walked out of the room and called an ambulance. Only in the Emergency Department at Massachusetts General Hospital, after the doctors started swarming, and one told her she needed surgery now, did M. learn what had happened. She had scratched through her skull during the night—and all the way into her brain.

Itching is a most peculiar and diabolical sensation. The definition offered by the German physician Samuel Hafenreffer in 1660 has yet to be improved upon: An unpleasant sensation that provokes the desire to scratch. Itch has been ranked, by scientific and artistic observers alike, among the most distressing physical sensations one can experience. In Dante’s Inferno, falsifiers were punished by “the burning rage / of fierce itching that nothing could relieve”:



The way their nails scraped down upon the scabs
Was like a knife scraping off scales from carp. . . .
“O you there tearing at your mail of scabs
And even turning your fingers into pincers,”
My guide began addressing one of them,

“Tell us are there Italians among the souls
Down in this hole and I’ll pray that your nails
Will last you in this task eternally.”



Though scratching can provide momentary relief, it often makes the itching worse. Dermatologists call this the itch-scratch cycle. Scientists believe that itch, and the accompanying scratch reflex, evolved in order to protect us from insects and clinging plant toxins—from such dangers as malaria, yellow fever, and dengue, transmitted by mosquitoes; from tularemia, river blindness, and sleeping sickness, transmitted by flies; from typhus-bearing lice, plague-bearing fleas, and poisonous spiders. The theory goes a long way toward explaining why itch is so exquisitely tuned. You can spend all day without noticing the feel of your shirt collar on your neck, and yet a single stray thread poking out, or a louse’s fine legs brushing by, can set you scratching furiously.

But how, exactly, itch works has been a puzzle. For most of medical history, scientists thought that itching was merely a weak form of pain. Then, in 1987, the German researcher H. O. Handwerker and his colleagues used mild electric pulses to drive histamine, an itch-producing substance that the body releases during allergic reactions, into the skin of volunteers. As the researchers increased the dose of histamine, they found that they were able to increase the intensity of itch the volunteers reported, from the barely appreciable to the “maximum imaginable.” Yet the volunteers never felt an increase in pain. The scientists concluded that itch and pain are entirely separate sensations, transmitted along different pathways.

Despite centuries spent mapping the body’s nervous circuitry, scientists had never noticed a nerve specific for itch. But now the hunt was on, and a group of Swedish and German researchers embarked upon a series of tricky experiments. They inserted ultra-thin metal electrodes into the skin of paid volunteers, and wiggled them around until they picked up electrical signals from a single nerve fibre. Computers subtracted the noise from other nerve fibres crossing through the region. The researchers would then spend hours—as long as the volunteer could tolerate it—testing different stimuli on the skin in the area (a heated probe, for example, or a fine paintbrush) to see what would get the nerve to fire, and what the person experienced when it did.

They worked their way through fifty-three volunteers. Mostly, they encountered well-known types of nerve fibres that respond to temperature or light touch or mechanical pressure. “That feels warm,” a volunteer might say, or “That feels soft,” or “Ouch! Hey!” Several times, the scientists came across a nerve fibre that didn’t respond to any of these stimuli. When they introduced a tiny dose of histamine into the skin, however, they observed a sharp electrical response in some of these nerve fibres, and the volunteer would experience an itch. They announced their discovery in a 1997 paper: they’d found a type of nerve that was specific for itch.

Unlike, say, the nerve fibres for pain, each of which covers a millimetre-size territory, a single itch fibre can pick up an itchy sensation more than three inches away. The fibres also turned out to have extraordinarily low conduction speeds, which explained why itchiness is so slow to build and so slow to subside.

Other researchers traced these fibres to the spinal cord and all the way to the brain. Examining functional PET-scan studies in healthy human subjects who had been given mosquito-bite-like histamine injections, they found a distinct signature of itch activity. Several specific areas of the brain light up: the part of the cortex that tells you where on your body the sensation occurs; the region that governs your emotional responses, reflecting the disagreeable nature of itch; and the limbic and motor areas that process irresistible urges (such as the urge to use drugs, among the addicted, or to overeat, among the obese), reflecting the ferocious impulse to scratch.

Now various phenomena became clear. Itch, it turns out, is indeed inseparable from the desire to scratch. It can be triggered chemically (by the saliva injected when a mosquito bites, say) or mechanically (from the mosquito’s legs, even before it bites). The itch-scratch reflex activates higher levels of your brain than the spinal-cord-level reflex that makes you pull your hand away from a flame. Brain scans also show that scratching diminishes activity in brain areas associated with unpleasant sensations.

But some basic features of itch remained unexplained—features that make itch a uniquely revealing case study. On the one hand, our bodies are studded with receptors for itch, as they are with receptors for touch, pain, and other sensations; this provides an alarm system for harm and allows us to safely navigate the world. But why does a feather brushed across the skin sometimes itch and at other times tickle? (Tickling has a social component: you can make yourself itch, but only another person can tickle you.) And, even more puzzling, how is it that you can make yourself itchy just by thinking about it?

Contemplating what it’s like to hold your finger in a flame won’t make your finger hurt. But simply writing about a tick crawling up the nape of one’s neck is enough to start my neck itching. Then my scalp. And then this one little spot along my flank where I’m beginning to wonder whether I should check to see if there might be something there. In one study, a German professor of psychosomatics gave a lecture that included, in the first half, a series of what might be called itchy slides, showing fleas, lice, people scratching, and the like, and, in the second half, more benign slides, with pictures of soft down, baby skin, bathers. Video cameras recorded the audience. Sure enough, the frequency of scratching among people in the audience increased markedly during the first half and decreased during the second. Thoughts made them itch.

We now have the nerve map for itching, as we do for other sensations. But a deeper puzzle remains: how much of our sensations and experiences do nerves really explain?

In the operating room, a neurosurgeon washed out and debrided M.’s wound, which had become infected. Later, a plastic surgeon covered it with a graft of skin from her thigh. Though her head was wrapped in layers of gauze and she did all she could to resist the still furious itchiness, she awoke one morning to find that she had rubbed the graft away. The doctors returned her to the operating room for a second skin graft, and this time they wrapped her hands as well. She rubbed it away again anyway.

“They kept telling me I had O.C.D.,” M. said. A psychiatric team was sent in to see her each day, and the resident would ask her, “As a child, when you walked down the street did you count the lines? Did you do anything repetitive? Did you have to count everything you saw?” She kept telling him no, but he seemed skeptical. He tracked down her family and asked them, but they said no, too. Psychological tests likewise ruled out obsessive-compulsive disorder. They showed depression, though, and, of course, there was the history of addiction. So the doctors still thought her scratching was from a psychiatric disorder. They gave her drugs that made her feel logy and sleep a lot. But the itching was as bad as ever, and she still woke up scratching at that terrible wound.

One morning, she found, as she put it, “this very bright and happy-looking woman standing by my bed. She said, ‘I’m Dr. Oaklander,’ ” M. recalled. “I thought, Oh great. Here we go again. But she explained that she was a neurologist, and she said, ‘The first thing I want to say to you is that I don’t think you’re crazy. I don’t think you have O.C.D.’ At that moment, I really saw her grow wings and a halo,” M. told me. “I said, ‘Are you sure?’ And she said, ‘Yes. I have heard of this before.’ ”

Anne Louise Oaklander was about the same age as M. Her mother is a prominent neurologist at Albert Einstein College of Medicine, in New York, and she’d followed her into the field. Oaklander had specialized in disorders of peripheral nerve sensation—disorders like shingles. Although pain is the most common symptom of shingles, Oaklander had noticed during her training that some patients also had itching, occasionally severe, and seeing M. reminded her of one of her shingles patients. “I remember standing in a hallway talking to her, and what she complained about—her major concern—was that she was tormented by this terrible itch over the eye where she had had shingles,” she told me. When Oaklander looked at her, she thought that something wasn’t right. It took a moment to realize why. “The itch was so severe, she had scratched off her eyebrow.”

Oaklander tested the skin near M.’s wound. It was numb to temperature, touch, and pinprick. Nonetheless, it was itchy, and when it was scratched or rubbed M. felt the itchiness temporarily subside. Oaklander injected a few drops of local anesthetic into the skin. To M.’s surprise, the itching stopped—instantly and almost entirely. This was the first real relief she’d had in more than a year.

It was an imperfect treatment, though. The itch came back when the anesthetic wore off, and, although Oaklander tried having M. wear an anesthetic patch over the wound, the effect diminished over time. Oaklander did not have an explanation for any of this. When she took a biopsy of the itchy skin, it showed that ninety-six per cent of the nerve fibres were gone. So why was the itch so intense?

Oaklander came up with two theories. The first was that those few remaining nerve fibres were itch fibres and, with no other fibres around to offer competing signals, they had become constantly active. The second theory was the opposite. The nerves were dead, but perhaps the itch system in M.’s brain had gone haywire, running on a loop all its own.

The second theory seemed less likely. If the nerves to her scalp were dead, how would you explain the relief she got from scratching, or from the local anesthetic? Indeed, how could you explain the itch in the first place? An itch without nerve endings didn’t make sense. The neurosurgeons stuck with the first theory; they offered to cut the main sensory nerve to the front of M.’s scalp and abolish the itching permanently. Oaklander, however, thought that the second theory was the right one—that this was a brain problem, not a nerve problem—and that cutting the nerve would do more harm than good. She argued with the neurosurgeons, and she advised M. not to let them do any cutting.

“But I was desperate,” M. told me. She let them operate on her, slicing the supraorbital nerve above the right eye. When she woke up, a whole section of her forehead was numb—and the itching was gone. A few weeks later, however, it came back, in an even wider expanse than before. The doctors tried pain medications, more psychiatric medications, more local anesthetic. But the only thing that kept M. from tearing her skin and skull open again, the doctors found, was to put a foam football helmet on her head and bind her wrists to the bedrails at night.

She spent the next two years committed to a locked medical ward in a rehabilitation hospital—because, although she was not mentally ill, she was considered a danger to herself. Eventually, the staff worked out a solution that did not require binding her to the bedrails. Along with the football helmet, she had to wear white mitts that were secured around her wrists by surgical tape. “Every bedtime, it looked like they were dressing me up for Halloween—me and the guy next to me,” she told me.

“The guy next to you?” I asked. He had had shingles on his neck, she explained, and also developed a persistent itch. “Every night, they would wrap up his hands and wrap up mine.” She spoke more softly now. “But I heard he ended up dying from it, because he scratched into his carotid artery.”

I met M. seven years after she’d been discharged from the rehabilitation hospital. She is forty-eight now. She lives in a three-room apartment, with a crucifix and a bust of Jesus on the wall and the low yellow light of table lamps strung with beads over their shades. Stacked in a wicker basket next to her coffee table were Rick Warren’s “The Purpose Driven Life,” People, and the latest issue of Neurology Now, a magazine for patients. Together, they summed up her struggles, for she is still fighting the meaninglessness, the isolation, and the physiology of her predicament.

She met me at the door in a wheelchair; the injury to her brain had left her partially paralyzed on the left side of her body. She remains estranged from her children. She has not, however, relapsed into drinking or drugs. Her H.I.V. remains under control. Although the itch on her scalp and forehead persists, she has gradually learned to protect herself. She trims her nails short. She finds ways to distract herself. If she must scratch, she tries to rub gently instead. And, if that isn’t enough, she uses a soft toothbrush or a rolled-up terry cloth. “I don’t use anything sharp,” she said. The two years that she spent bound up in the hospital seemed to have broken the nighttime scratching. At home, she found that she didn’t need to wear the helmet and gloves anymore.

Still, the itching remains a daily torment. “I don’t normally tell people this,” she said, “but I have a fantasy of shaving off my eyebrow and taking a metal-wire grill brush and scratching away.”

Some of her doctors have not been willing to let go of the idea that this has been a nerve problem all along. A local neurosurgeon told her that the original operation to cut the sensory nerve to her scalp must not have gone deep enough. “He wants to go in again,” she told me.

A new scientific understanding of perception has emerged in the past few decades, and it has overturned classical, centuries-long beliefs about how our brains work—though it has apparently not penetrated the medical world yet. The old understanding of perception is what neuroscientists call “the naïve view,” and it is the view that most people, in or out of medicine, still have. We’re inclined to think that people normally perceive things in the world directly. We believe that the hardness of a rock, the coldness of an ice cube, the itchiness of a sweater are picked up by our nerve endings, transmitted through the spinal cord like a message through a wire, and decoded by the brain.

In a 1710 “Treatise Concerning the Principles of Human Knowledge,” the Irish philosopher George Berkeley objected to this view. We do not know the world of objects, he argued; we know only our mental ideas of objects. “Light and colours, heat and cold, extension and figures—in a word, the things we see and feel—what are they but so many sensations, notions, ideas?” Indeed, he concluded, the objects of the world are likely just inventions of the mind, put in there by God. To which Samuel Johnson famously responded by kicking a large stone and declaring, “I refute it thus!”

Still, Berkeley had recognized some serious flaws in the direct-perception theory—in the notion that when we see, hear, or feel we are just taking in the sights, sounds, and textures of the world. For one thing, it cannot explain how we experience things that seem physically real but aren’t: sensations of itching that arise from nothing more than itchy thoughts; dreams that can seem indistinguishable from reality; phantom sensations that amputees have in their missing limbs. And, the more we examine the actual nerve transmissions we receive from the world outside, the more inadequate they seem.

Our assumption had been that the sensory data we receive from our eyes, ears, nose, fingers, and so on contain all the information that we need for perception, and that perception must work something like a radio. It’s hard to conceive that a Boston Symphony Orchestra concert is in a radio wave. But it is. So you might think that it’s the same with the signals we receive—that if you hooked up someone’s nerves to a monitor you could watch what the person is experiencing as if it were a television show.

Yet, as scientists set about analyzing the signals, they found them to be radically impoverished. Suppose someone is viewing a tree in a clearing. Given simply the transmissions along the optic nerve from the light entering the eye, one would not be able to reconstruct the three-dimensionality, or the distance, or the detail of the bark—attributes that we perceive instantly.

Or consider what neuroscientists call “the binding problem.” Tracking a dog as it runs behind a picket fence, all that your eyes receive is separated vertical images of the dog, with large slices missing. Yet somehow you perceive the mutt to be whole, an intact entity travelling through space. Put two dogs together behind the fence and you don’t think they’ve morphed into one. Your mind now configures the slices as two independent creatures.

The images in our mind are extraordinarily rich. We can tell if something is liquid or solid, heavy or light, dead or alive. But the information we work from is poor—a distorted, two-dimensional transmission with entire spots missing. So the mind fills in most of the picture. You can get a sense of this from brain-anatomy studies. If visual sensations were primarily received rather than constructed by the brain, you’d expect that most of the fibres going to the brain’s primary visual cortex would come from the retina. Instead, scientists have found that only twenty per cent do; eighty per cent come downward from regions of the brain governing functions like memory. Richard Gregory, a prominent British neuropsychologist, estimates that visual perception is more than ninety per cent memory and less than ten per cent sensory nerve signals. When Oaklander theorized that M.’s itch was endogenous, rather than generated by peripheral nerve signals, she was onto something important.

The fallacy of reducing perception to reception is especially clear when it comes to phantom limbs. Doctors have often explained such sensations as a matter of inflamed or frayed nerve endings in the stump sending aberrant signals to the brain. But this explanation should long ago have been suspect. Efforts by surgeons to cut back on the nerve typically produce the same results that M. had when they cut the sensory nerve to her forehead: a brief period of relief followed by a return of the sensation.

Moreover, the feelings people experience in their phantom limbs are far too varied and rich to be explained by the random firings of a bruised nerve. People report not just pain but also sensations of sweatiness, heat, texture, and movement in a missing limb. There is no experience people have with real limbs that they do not experience with phantom limbs. They feel their phantom leg swinging, water trickling down a phantom arm, a phantom ring becoming too tight for a phantom digit. Children have used phantom fingers to count and solve arithmetic problems. V. S. Ramachandran, an eminent neuroscientist at the University of California, San Diego, has written up the case of a woman who was born with only stumps at her shoulders, and yet, as far back as she could remember, felt herself to have arms and hands; she even feels herself gesticulating when she speaks. And phantoms do not occur just in limbs. Around half of women who have undergone a mastectomy experience a phantom breast, with the nipple being the most vivid part. You’ve likely had an experience of phantom sensation yourself. When the dentist gives you a local anesthetic, and your lip goes numb, the nerves go dead. Yet you don’t feel your lip disappear. Quite the opposite: it feels larger and plumper than normal, even though you can see in a mirror that the size hasn’t changed.

The account of perception that’s starting to emerge is what we might call the “brain’s best guess” theory of perception: perception is the brain’s best guess about what is happening in the outside world. The mind integrates scattered, weak, rudimentary signals from a variety of sensory channels, information from past experiences, and hard-wired processes, and produces a sensory experience full of brain-provided color, sound, texture, and meaning. We see a friendly yellow Labrador bounding behind a picket fence not because that is the transmission we receive but because this is the perception our weaver-brain assembles as its best hypothesis of what is out there from the slivers of information we get. Perception is inference.
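
To see what "perception is inference" means in practice, here is a toy numerical sketch in Python (not from the article; the model and all numbers are invented for illustration). Treat the brain's best guess as a precision-weighted average of a prior expectation and an incoming sensory signal: the noisier the signal, the more the prior dominates, so a nearly pure-noise signal from a denervated patch of skin leaves the guess pinned to the prior.

# Toy model of "perception is inference" (illustrative only; invented numbers).
# The percept is a precision-weighted blend of a prior expectation and a
# sensory signal; the noisier the signal, the more the prior dominates.

def best_guess(prior_mean, prior_var, signal, signal_var):
    # Posterior mean for two Gaussian cues, each weighted by its
    # precision (1 / variance).
    w_prior = (1 / prior_var) / (1 / prior_var + 1 / signal_var)
    return w_prior * prior_mean + (1 - w_prior) * signal

# A strong prior that "something itchy is there" (variance 0.1) meets an
# almost pure-noise reading from the damaged scalp (value 0, variance 10):
percept = best_guess(prior_mean=1.0, prior_var=0.1, signal=0.0, signal_var=10.0)
print(round(percept, 2))  # ~0.99: the guess barely moves toward the null signal

On this toy reading, only a sharp, low-variance new signal can drag the guess away from a stale prior, which is the logic of the mirror experiments described next.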

The theory—and a theory is all it is right now—has begun to make sense of some bewildering phenomena. Among them is an experiment that Ramachandran performed with volunteers who had phantom pain in an amputated arm. They put their surviving arm through a hole in the side of a box with a mirror inside, so that, peering through the open top, they would see their arm and its mirror image, as if they had two arms. Ramachandran then asked them to move both their intact arm and, in their mind, their phantom arm—to pretend that they were conducting an orchestra, say. The patients had the sense that they had two arms again. Even though they knew it was an illusion, it provided immediate relief. People who for years had been unable to unclench their phantom fist suddenly felt their hand open; phantom arms in painfully contorted positions could relax. With daily use of the mirror box over weeks, patients sensed their phantom limbs actually shrink into their stumps and, in several instances, completely vanish. Researchers at Walter Reed Army Medical Center recently published the results of a randomized trial of mirror therapy for soldiers with phantom-limb pain, showing dramatic success.

A lot about this phenomenon remains murky, but here’s what the new theory suggests is going on: when your arm is amputated, nerve transmissions are shut off, and the brain’s best guess often seems to be that the arm is still there, but paralyzed, or clenched, or beginning to cramp up. Things can stay like this for years. The mirror box, however, provides the brain with new visual input—however illusory—suggesting motion in the absent arm. The brain has to incorporate the new information into its sensory map of what’s happening. Therefore, it guesses again, and the pain goes away.

The new theory may also explain what was going on with M.’s itch. The shingles destroyed most of the nerves in her scalp. And, for whatever reason, her brain surmised from what little input it had that something horribly itchy was going on—that perhaps a whole army of ants were crawling back and forth over just that patch of skin. There wasn’t any such thing, of course. But M.’s brain has received no contrary signals that would shift its assumptions. So she itches.

Not long ago, I met a man who made me wonder whether such phantom sensations are more common than we realize. H. was forty-eight, in good health, an officer at a Boston financial-services company living with his wife in a western suburb, when he made passing mention of an odd pain to his internist. For at least twenty years, he said, he’d had a mild tingling running along his left arm and down the left side of his body, and, if he tilted his neck forward at a particular angle, it became a pronounced, electrical jolt. The internist recognized this as Lhermitte’s sign, a classic symptom that can indicate multiple sclerosis, Vitamin B12 deficiency, or spinal-cord compression from a tumor or a herniated disk. An MRI revealed a cavernous hemangioma, a pea-size mass of dilated blood vessels, pressing into the spinal cord in his neck. A week later, while the doctors were still contemplating what to do, it ruptured.

“I was raking leaves out in the yard and, all of a sudden, there was an explosion of pain and my left arm wasn’t responding to my brain,” H. said when I visited him at home. Once the swelling subsided, a neurosurgeon performed a tricky operation to remove the tumor from the spinal cord. The operation was successful, but afterward H. began experiencing a constellation of strange sensations. His left hand felt cartoonishly large—at least twice its actual size. He developed a constant burning pain along an inch-wide ribbon extending from the left side of his neck all the way down his arm. And an itch crept up and down along the same band, which no amount of scratching would relieve.

H. has not accepted that these sensations are here to stay—the prospect is too depressing—but they’ve persisted for eleven years now. Although the burning is often tolerable during the day, the slightest thing can trigger an excruciating flareup—a cool breeze across the skin, the brush of a shirtsleeve or a bedsheet. “Sometimes I feel that my skin has been flayed and my flesh is exposed, and any touch is just very painful,” he told me. “Sometimes I feel that there’s an ice pick or a wasp sting. Sometimes I feel that I’ve been splattered with hot cooking oil.”

For all that, the itch has been harder to endure. H. has developed calluses from the incessant scratching. “I find I am choosing itch relief over the pain that I am provoking by satisfying the itch,” he said.

He has tried all sorts of treatments—medications, acupuncture, herbal remedies, lidocaine injections, electrical-stimulation therapy. But nothing really worked, and the condition forced him to retire in 2001. He now avoids leaving the house. He gives himself projects. Last year, he built a three-foot stone wall around his yard, slowly placing the stones by hand. But he spends much of his day, after his wife has left for work, alone in the house with their three cats, his shirt off and the heat turned up, trying to prevent a flareup.

His neurologist introduced him to me, with his permission, as an example of someone with severe itching from a central rather than a peripheral cause. So one morning we sat in his living room trying to puzzle out what was going on. The sun streamed in through a big bay window. One of his cats, a scraggly brown tabby, curled up beside me on the couch. H. sat in an armchair in a baggy purple T-shirt he’d put on for my visit. He told me that he thought his problem was basically a “bad switch” in his neck where the tumor had been, a kind of loose wire sending false signals to his brain. But I told him about the increasing evidence that our sensory experiences are not sent to the brain but originate in it. When I got to the example of phantom-limb sensations, he perked up. The experiences of phantom-limb patients sounded familiar to him. When I mentioned that he might want to try the mirror-box treatment, he agreed. “I have a mirror upstairs,” he said.

He brought a cheval glass down to the living room, and I had him stand with his chest against the side of it, so that his troublesome left arm was behind it and his normal right arm was in front. He tipped his head so that when he looked into the mirror the image of his right arm seemed to occupy the same position as his left arm. Then I had him wave his arms, his actual arms, as if he were conducting an orchestra.

The first thing he expressed was disappointment. “It isn’t quite like looking at my left hand,” he said. But then suddenly it was.

“Wow!” he said. “Now, this is odd.”

After a moment or two, I noticed that he had stopped moving his left arm. Yet he reported that he still felt as if it were moving. What’s more, the sensations in it had changed dramatically. For the first time in eleven years, he felt his left hand “snap” back to normal size. He felt the burning pain in his arm diminish. And the itch, too, was dulled.

“This is positively bizarre,” he said.

He still felt the pain and the itch in his neck and shoulder, where the image in the mirror cut off. And, when he came away from the mirror, the aberrant sensations in his left arm returned. He began using the mirror a few times a day, for fifteen minutes or so at a stretch, and I checked in with him periodically.

“What’s most dramatic is the change in the size of my hand,” he says. After a couple of weeks, his hand returned to feeling normal in size all day long.

The mirror also provided the first effective treatment he has had for the flares of itch and pain that sporadically seize him. Where once he could do nothing but sit and wait for the torment to subside—it sometimes took an hour or more—he now just pulls out the mirror. “I’ve never had anything like this before,” he said. “It’s my magic mirror.”

There have been other, isolated successes with mirror treatment. In Bath, England, several patients suffering from what is called complex regional pain syndrome—severe, disabling limb sensations of unknown cause—were reported to have experienced complete resolution after six weeks of mirror therapy. In California, mirror therapy helped stroke patients recover from a condition known as hemineglect, which produces something like the opposite of a phantom limb—these patients have a part of the body they no longer realize is theirs.

Such findings open up a fascinating prospect: perhaps many patients whom doctors treat as having a nerve injury or a disease have, instead, what might be called sensor syndromes. When your car’s dashboard warning light keeps telling you that there is an engine failure, but the mechanics can’t find anything wrong, the sensor itself may be the problem. This is no less true for human beings. Our sensations of pain, itch, nausea, and fatigue are normally protective. Unmoored from physical reality, however, they can become a nightmare: M., with her intractable itching, and H., with his constellation of strange symptoms—but perhaps also the hundreds of thousands of people in the United States alone who suffer from conditions like chronic back pain, fibromyalgia, chronic pelvic pain, tinnitus, temporomandibular joint disorder, or repetitive strain injury, where, typically, no amount of imaging, nerve testing, or surgery manages to uncover an anatomical explanation. Doctors have persisted in treating these conditions as nerve or tissue problems—engine failures, as it were. We get under the hood and remove this, replace that, snip some wires. Yet still the sensor keeps going off.

So we get frustrated. “There’s nothing wrong,” we’ll insist. And, the next thing you know, we’re treating the driver instead of the problem. We prescribe tranquillizers, antidepressants, escalating doses of narcotics. And the drugs often do make it easier for people to ignore the sensors, even if they are wired right into the brain. The mirror treatment, by contrast, targets the deranged sensor system itself. It essentially takes a misfiring sensor—a warning system functioning under an illusion that something is terribly wrong out in the world it monitors—and feeds it an alternate set of signals that calm it down. The new signals may even reset the sensor.

This may help explain, for example, the success of the advice that back specialists now commonly give. Work through the pain, they tell many of their patients, and, surprisingly often, the pain goes away. It had been a mystifying phenomenon. But the picture now seems clearer. Most chronic back pain starts as an acute back pain—say, after a fall. Usually, the pain subsides as the injury heals. But in some cases the pain sensors continue to light up long after the tissue damage is gone. In such instances, working through the pain may offer the brain contradictory feedback—a signal that ordinary activity does not, in fact, cause physical harm. And so the sensor resets.

This understanding of sensation points to an entire new array of potential treatments—based not on drugs or surgery but, instead, on the careful manipulation of our perceptions. Researchers at the University of Manchester, in England, have gone a step beyond mirrors and fashioned an immersive virtual-reality system for treating patients with phantom-limb pain. Detectors transpose movement of real limbs into a virtual world where patients feel they are actually moving, stretching, even playing a ballgame. So far, five patients have tried the system, and they have all experienced a reduction in pain. Whether those results will last has yet to be established. But the approach raises the possibility of designing similar systems to help patients with other sensor syndromes. How, one wonders, would someone with chronic back pain fare in a virtual world? The Manchester study suggests that there may be many ways to fight our phantoms.

I called Ramachandran to ask him about M.’s terrible itch. The sensation may be a phantom, but it’s on her scalp, not in a limb, so it seemed unlikely that his mirror approach could do anything for her. He told me about an experiment in which he put ice-cold water in people’s ears. This confuses the brain’s position sensors, tricking subjects into thinking that their heads are moving, and in certain phantom-limb and stroke patients the illusion corrected their misperceptions, at least temporarily. Maybe this would help M., he said. He had another idea. If you take two mirrors and put them at right angles to each other, you will get a non-reversed mirror image. Looking in, the right half of your face appears on the left and the left half appears on the right. But unless you move, he said, your brain may not realize that the image is flipped.

“Now, suppose she looks in this mirror and scratches the left side of her head. No, wait—I’m thinking out loud here—suppose she looks and you have someone else touch the left side of her head. It’ll look—maybe it’ll feel—like you’re touching the right side of her head.” He let out an impish giggle. “Maybe this would make her itchy right scalp feel more normal.” Maybe it would encourage her brain to make a different perceptual inference; maybe it would press reset. “Who knows?” he said.

It seemed worth a try. ♦


Dick Cavett on Depression

June 27, 2008, 7:43 pm
Smiling Through
Who decided that it’s variety that’s the spice of life?
I submit that, rather, it is contrast that is life’s piquant condiment.
Last week, I attended two events in my home state of Nebraska that supplied both variety and contrast on successive days. A bit like the Mafioso some years ago who got married one day and began a 10-year jail sentence the next (a cynic might consider them both “sentences”).
On the one hand, I addressed a group of noble citizens whose job is aiding and counseling poor devils suffering from depression. “Cavett Returns Home to Discuss ‘The Worst Agony Devised For Man’ ” read the next day’s headline in the Lincoln paper. Despite the subject matter, I got quite a lot of laughs. My credentials? Having been there myself.
The year before I had talked to a similar group of care-givers in Omaha in front of an audience that included what you’d think would be an entertainer’s nightmare: a hundred or more people in the throes of the disease. I expected no laughs.
I had just gotten started telling the grim faces that I knew what they were going through when a large man — in pajamas, as I recall — stood up and slowly made his way toward me.
“Paranoid schizophrenic,” someone stage-whispered to me. There was general tension in the room as the man continued to approach. When he stopped two feet in front of me, and stared at me, I heard myself say, “Come here often?” Loud general laughter broke the tension. He returned peacefully to his seat — probably without having heard me or the laughter.
Miraculously, I kept them laughing for perhaps an hour. Clearly the fact that I knew about their plight from my own experience had a lot — or maybe everything — to do with it.
I was able to say to them, I know that everyone here knows that feeling when people say to you, “Hey, shape up! Stop thinking only about your troubles. What’s to be depressed about? Go swimming or play tennis and you’ll feel a lot better. Pull up your socks!” And how you, hearing this, would like nothing more than to remove one of those socks and choke them to death with it. (Laughter mixed with some minor cheering.)
The reward from this was unique in my experience. Afterwards, those in charge seemed amazed and delighted. One said, “See Clara over there? She hasn’t moved a muscle in her face for six months and you had her laughing out loud.”
(Such inane advice of the “socks up” variety, by the way, can only be excused by the fact that if you’ve never had it you can never begin to imagine the depth of the ailment’s black despair. Another tip: Do not ask the victim what he has “to be depressed about.” The malady doesn’t care if you’re broke and alone or successful and surrounded by a loving family. It does its democratic dirty work to your brain chemistry regardless of your “position.”)
My time with them in Omaha a year ago was not recorded but I would rather have a tape of that day with that audience than just about anything I’ve done. Of the things I said to them I can recall only this story:
Personal item: Once I said to a doctor during a “session” that I wished he could get inside my head for just a minute because there’s no way of imagining what this feels like. “Oh, I know,” he said, “I got pretty sad when my father died.”
Defying standard protocol on the couch, I arose on one elbow, turned to him — he was seated behind me — and said, “Do you think grief is even close to this?” To his credit he replied, “I’m sorry. I shouldn’t have said that.”
(The anger you feel at such a moment pumps a shot of adrenaline that can make you feel symptom-free . . . all too briefly.)
The fact that these afflicted people in Omaha knew me to be a “celebrity” had a good deal to do with the unexpected success of the whole thing. Some had even seen me talk about the nasty illness on television in the early ’80s, or in People magazine. While not wishing to become the poster boy for depression, I still found the rewards undeniably pleasant, gratifying and touching.
As in: Dear Mr. Cavett, You don’t know it but you saved my dad’s/wife’s/daughter’s life. Followed by various forms of, My dad’s seeing that Dick Cavett could have it made him feel he wasn’t a freak, and he finally went for treatment. We are so grateful.
Apparently one thing I said on “Larry King” back then hit home hard. It was that when you’re downed by this affliction, if there were a curative magic wand on the table eight feet away, it would be too much trouble to go over and pick it up.
There’s also the conviction that it may have worked for others but it wouldn’t work for you. Your brain is busted and nothing’s going to help.
The most extreme problem that depression presents is suicide. It’s the reason you don’t dare delay treatment. Don’t mess with it. Run for help — whether it’s talk therapy, drug therapy or the miraculous results of ECT (electroconvulsive therapy, erroneously labeled “shock therapy”). The shock involved is closer to insulin shock than electric shock. It’s a toss-up whether more people have been scared off it by “One Flew Over the Cuckoo’s Nest” than have been scared off medication by Tom Cruise’s idiotic braying on the subject on “The Today Show.” (Matt Lauer should have hit him with a wet turbot.)
I guarantee that one result of this week’s Supreme Court decision on guns will be the deaths of people who have a gun at home for the first time while in depression. In the depths of the malady, getting a stamp on a letter is a day’s work. Going out to somehow arrange for a gun would be way beyond your capability while stricken. But having one near at hand is another matter. There were times when I longed for my ancient .22 single-shot squirrel-hunting rifle. Luckily it had been given away years earlier.
Suicide rarely happens when you are all the way down in the uttermost depths. Again, it’s too much trouble. Perhaps the saddest irony of depression is that suicide happens when the patient gets a little better and can again function sufficiently. “She seemed to be improving,” is the sad cry of the mourners.
Two prime victims of the disease are your libido and your ability to read. Five times through a paragraph and unable to say what it’s about. But, oddly, you can read a book or article about depression with full comprehension. The two best books I know of are William Styron’s monumental account of his own case, “Darkness Visible,” and Kay Redfield Jamison’s “An Unquiet Mind.”
Damned if I had meant to rattle on so long on this subject, depriving you of my contrasting event, the Johnny Carson Comedy Festival in his hometown of Norfolk, Neb. (I’ll get to that.)
And pardon me for teasing you last time about a promised tale of espionage and murder. The case is more complex than I imagined and will take some time.
And is anyone still wondering about the error by the test-makers on that exam that American students performed so dismally on?


Your Brain Lies to You

Op-Ed Contributor
Your Brain Lies to You
By SAM WANG and SANDRA AAMODT
FALSE beliefs are everywhere. Eighteen percent of Americans think the sun revolves around the earth, one poll has found. Thus it seems slightly less egregious that, according to another poll, 10 percent of us think that Senator Barack Obama, a Christian, is instead a Muslim. The Obama campaign has created a Web site to dispel misinformation. But this effort may be more difficult than it seems, thanks to the quirky way in which our brains store memories — and mislead us along the way.

The brain does not simply gather and stockpile information as a computer’s hard drive does. Facts are stored first in the hippocampus, a structure deep in the brain about the size and shape of a fat man’s curled pinkie finger. But the information does not rest there. Every time we recall it, our brain writes it down again, and during this re-storage, it is also reprocessed. In time, the fact is gradually transferred to the cerebral cortex and is separated from the context in which it was originally learned. For example, you know that the capital of California is Sacramento, but you probably don’t remember how you learned it.

This phenomenon, known as source amnesia, can also lead people to forget whether a statement is true. Even when a lie is presented with a disclaimer, people often later remember it as true.

With time, this misremembering only gets worse. A false statement from a noncredible source that is at first not believed can gain credibility during the months it takes to reprocess memories from short-term hippocampal storage to longer-term cortical storage. As the source is forgotten, the message and its implications gain strength. This could explain why, during the 2004 presidential campaign, it took some weeks for the Swift Boat Veterans for Truth campaign against Senator John Kerry to have an effect on his standing in the polls.
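
As a toy illustration of that dynamic (a sketch in the same spirit, not the authors' model; the boost and decay rates are invented), imagine that each recollection re-stores the claim and strengthens familiarity with it, while the memory of its source fades:

# Toy sketch of source amnesia (illustrative only; invented rates).
# Each recall re-stores the claim and reinforces it, while the link to
# the original, noncredible source decays.

familiarity, source_link = 0.2, 1.0    # a fresh claim from a junk source
for month in range(12):
    familiarity = min(1.0, familiarity + 0.1)  # re-storage strengthens the claim
    source_link *= 0.7                         # the "where I heard it" tag fades

print(f"familiarity={familiarity:.2f}, source link={source_link:.3f}")
# familiarity=1.00, source link=0.014: the message has outlived its source

After a year of occasional recalls, the claim feels thoroughly familiar while its disreputable origin is all but erased, which is roughly the months-long lag the authors describe in the Swift Boat example.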

Even if they do not understand the neuroscience behind source amnesia, campaign strategists can exploit it to spread misinformation. They know that if their message is initially memorable, its impression will persist long after it is debunked. In repeating a falsehood, someone may back it up with an opening line like “I think I read somewhere” or even with a reference to a specific source.

In one study, a group of Stanford students was exposed repeatedly to an unsubstantiated claim taken from a Web site that Coca-Cola is an effective paint thinner. Students who read the statement five times were nearly one-third more likely than those who read it only twice to attribute it to Consumer Reports (rather than The National Enquirer, their other choice), giving it a gloss of credibility.

Adding to this innate tendency to mold information we recall is the way our brains fit facts into established mental frameworks. We tend to remember news that accords with our worldview, and discount statements that contradict it.

In another Stanford study, 48 students, half of whom said they favored capital punishment and half of whom said they opposed it, were presented with two pieces of evidence, one supporting and one contradicting the claim that capital punishment deters crime. Both groups were more convinced by the evidence that supported their initial position.

Psychologists have suggested that legends propagate by striking an emotional chord. In the same way, ideas can spread by emotional selection, rather than by their factual merits, encouraging the persistence of falsehoods about Coke — or about a presidential candidate.

Journalists and campaign workers may think they are acting to counter misinformation by pointing out that it is not true. But by repeating a false rumor, they may inadvertently make it stronger. In its concerted effort to “stop the smears,” the Obama campaign may want to keep this in mind. Rather than emphasize that Mr. Obama is not a Muslim, for instance, it may be more effective to stress that he embraced Christianity as a young man.

Consumers of news, for their part, are prone to selectively accept and remember statements that reinforce beliefs they already hold. In a replication of the study of students’ impressions of evidence about the death penalty, researchers found that even when subjects were given a specific instruction to be objective, they were still inclined to reject evidence that disagreed with their beliefs.

In the same study, however, when subjects were asked to imagine their reaction if the evidence had pointed to the opposite conclusion, they were more open-minded to information that contradicted their beliefs. Apparently, it pays for consumers of controversial news to take a moment and consider that the opposite interpretation may be true.

In 1919, Justice Oliver Wendell Holmes of the Supreme Court wrote that “the best test of truth is the power of the thought to get itself accepted in the competition of the market.” Holmes erroneously assumed that ideas are more likely to spread if they are honest. Our brains do not naturally obey this admirable dictum, but by better understanding the mechanisms of memory perhaps we can move closer to Holmes’s ideal.

Sam Wang, an associate professor of molecular biology and neuroscience at Princeton, and Sandra Aamodt, a former editor in chief of Nature Neuroscience, are the authors of “Welcome to Your Brain: Why You Lose Your Car Keys but Never Forget How to Drive and Other Puzzles of Everyday Life.”



Friday, June 27, 2008

Incredible Edible Lawn

Thursday, Jun. 26, 2008
The Incredible, Edible Front Lawn
By M.J. Stephey

Clarence Ridgley is the most popular guy on his block, and it's all thanks to his lawn. In April, Ridgley transformed his neatly trimmed yard into a garden of tomatoes, blueberries, strawberries, lettuce, beets and herbs. And because the plot sits in front of his home in Baltimore, the bountiful harvest is visible — and available — to anyone who wanders by. "People will come to my yard and pick up an onion sprout and start eating it on the spot," he says. "I've met more people in the past two months than I have the past 22 years of living here."

Ridgley is one of five homeowners in the U.S. to participate in the project known as "Edible Estates," in which homeowners trade their mowed and ornamental lawns for artistic arrangements of organic produce. Los Angeles-based architect Fritz Haeg launched the campaign in July 2005, after pundits and politicians had divided the country into Red and Blue states for the presidential election. Haeg says he was drawn to the lawn — that "iconic American space" — because it cut across social, political and economic boundaries. "The lawn really struck me as one of the few places that we all share," he says. "It represents what we're all supposedly working so hard for — the American dream."

The problem, as Haeg sees it, is that the "hyper-manicured lawn" is looking increasingly out of date. In the 1950s, when suburbia first began to sprawl, a perfectly trimmed front yard embodied the post-war prosperity Americans aspired to. Today, amid rising fuel costs, food safety scares and growing environmental awareness, a chemically treated and verdant but nutritionally barren lawn seems wasteful, he says.

The concept of tilling one's front yard is not a new one. In 1942, as the U.S. emerged from the Great Depression and mobilized for World War II, Agriculture Secretary Claude R. Wickard encouraged Americans to plant "Victory Gardens" to boost civic morale and relieve the war's pressure on food supplies — an idea first introduced during the Great War and picked up by Canada, the U.S. and Great Britain. The slogan: "Have Your Garden, and Eat It Too." Soon gardens began popping up everywhere, and not just on American lawns: Chicago's Cook County Jail, a downtown parking lot in New Orleans, a zoo in Portland, Ore. In 1943, Americans planted 20.5 million Victory Gardens, and the harvest accounted for nearly one-third of all the vegetables consumed in the country that year.

Though Haeg's approach to home-grown produce is unique, his enthusiasm for gardening is not. Twenty-five million U.S. households planted vegetable and fruit gardens in 2007, according to Bruce Butterfield of the National Gardening Association, and that number is expected to increase by several million this year. The waiting list for the USDA's Master Gardener Program, which involves nearly 90,000 volunteers in all 50 states who educate and assist the public with horticulture projects, is getting longer every year, says Bill Hoffman, National Program Leader for Agriculture Homeland Security. Even urban dwellers are returning to the land; in Austin, Texas, for example, the wait for community gardens is three years.

"It comes as no surprise to me," Butterfield says. "Gas prices, food prices, salmonella — the world has gone absolutely crazy. And for a lot of people, that brings up this need to take control over what happens in their own yard. If all goes to hell, you can just lock the gate and stay at home."

In fact, the average American garden has proven to be a surprisingly accurate social and economic barometer. The upsurge in fuel prices in 1975 spawned a similar gardening boom, with nearly 49% of the population growing some sort of produce. Then, as the prosperity of the '90s trickled down to American yards, the pendulum swung back toward aesthetics over sustenance.

"Back in the 1990s, when things were booming, the gardening movement was all about Martha Stewart — spending lots of money hiring people to make these beautiful, ornamental spaces," says Charlie Nardozzi, senior horticulturist at the National Gardening Association. Nowadays, "growing your own food can be a political statement that you have a personal connection with your food and where it's coming from, versus going to a grocery store and grabbing whatever is on the shelf."

But while some gardeners might be trying to save a few bucks or avoid commercially farmed produce, many horticulturists believe the gardening boom is more about lifestyle than economics. And unlike the concept of government-sponsored, "top-down" Victory Gardens, Edible Estates is a grassroots effort. Ridgley, for one, says his garden is as much about community and beauty as it is about food. "This is an art exhibit that just happens to be in my front yard," he says.

Haeg, meanwhile, hopes his project will prompt more Americans to rethink their yards, and where they plant their gardens. He hopes to plant two more Edible Estates next year. "This is a wonderful opportunity to reconsider how we're living, which I don't think is so great anyway." And with 80% of Americans living in homes with access to a yard, the potential for growth is enormous. As Haeg says, "the front lawns are there waiting."

Find this article at:
http://www.time.com/time/nation/article/0,8599,1816764,00.html

Thursday, May 29, 2008

Sahara Sands over Germany

Press release from May 6th, 2008

Arable land can have a negative impact on air quality
Farmland dust cloud from the Ukraine detected in Germany for the first time

Leipzig. Fallow agricultural land and steppe-formation processes are evidently capable of having a much greater effect on global air quality than was previously assumed. This is the conclusion drawn by researchers after examining a dust cloud that formed over parched fields in southern Ukraine and led to extremely high concentrations of particulate matter in Central Europe. On 24 March 2007 the dust cloud spread across Slovakia, Poland and the Czech Republic to Germany. Peak concentrations of between 200 and 1400 micrograms of PM10 particulates per cubic metre were measured. By way of comparison: the EU daily average limit is 50 micrograms per cubic metre. Even though such meteorological conditions appear to occur relatively infrequently, the unexpected scale of the phenomenon showed the need for a better understanding of the processes that lead to the formation and transport of such large quantities of dust.

Writing in the journal Atmospheric Chemistry and Physics, the researchers from the Leibniz Institute for Tropospheric Research (IfT), Freie Universität Berlin, the Helmholtz Centre for Environmental Research (UFZ) and Saxony’s regional office of the environment and geology (LfUG) explain that this is particularly relevant in the context of human-induced desertification and climate change. Previously, the Sahara had been seen as the main source of dust carried over long distances to Central Europe. The paper published by the research team from Leipzig, Berlin and Dresden is the first documentation of dust transport from the Ukraine.
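To put those peak values in perspective, here is a rough back-of-the-envelope comparison (my own illustration, not part of the press release) of the measured concentrations against the EU limit:

```python
# Rough illustration (not from the press release): how far the measured
# PM10 peaks exceeded the EU daily-average limit.
EU_DAILY_LIMIT_UG_M3 = 50.0  # EU daily-average PM10 limit, micrograms per cubic metre

# Range of peak concentrations reported in the press release
measured_peaks_ug_m3 = [200.0, 1400.0]

for peak in measured_peaks_ug_m3:
    factor = peak / EU_DAILY_LIMIT_UG_M3
    print(f"{peak:.0f} ug/m3 is {factor:.0f} times the EU daily-average limit")
# Output: 200 ug/m3 is 4 times the limit; 1400 ug/m3 is 28 times the limit
```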


Dust source activation on 23 March 11:00 UTC over the southern Ukraine (arrow).
Source: EUMETSAT



The MODIS-Aqua composite image of southern Ukraine on 23 March 2007, 10:50 UTC, reveals the large-scale emission of agricultural dust. The Kakhovskaya reservoir on the Dnieper River can be seen in dark blue, ice clouds in light blue.
Source: MODIS-Aqua



Ukrainian dust deposited on Labe Meadow in the Krkonose Mountains (a wide mountain plateau south of Sokolik Mt., near the Czech-Polish border). Ski tracks expose the white snow from the previous two days.
Photo: Vaclav Sir, (c) Bulletin of Geosciences

Blown by the wind
It was a warm, sunny day in spring, and a strong wind was blowing over the parched fields near the Kakhovka dam on the lower reaches of the Dnieper river. There had been no rain for weeks. The black soil here in the south of the Ukraine is one of the most fertile soils in the world, but it is also very fine and therefore particularly sensitive to erosion. On this day, 23 March 2007, gusts of wind with speeds of up to 90 kilometres per hour were whipping up huge quantities of dust in the steppe. A dust cloud formed that was so big it was later clearly visible on the weather satellites' infrared pictures. At this point, no one living 1500 kilometres to the west suspected what was in store for people in Germany and their eastern neighbours. Thanks to an area of high pressure over Scandinavia and an area of low pressure moving from the Black Sea to Italy, the air mass quickly drifted to Central Europe. Just one day later the air, with its cargo of fine dust from the Ukrainian fields, had arrived in Germany.

Tracking down a mystery
In Germany, people in the Erz Mountains were more than a little surprised when the sky took on a slightly yellow sheen. The controllers from the state air monitoring network also realised very quickly that something unusual had happened. Their filters were a much browner colour than usual, which meant that there must have been a lot more dust in the air. In the offices of the particulate experts at the Leibniz Institute for Tropospheric Research the phones were ringing. Dr Wolfram Birmili has for years been investigating the long-distance transport of particulates in the troposphere, i.e. the lower few kilometres of the atmosphere. He and his colleagues quickly ruled out the classic sources of air pollution, such as coal-fired power stations and forest fires, because their measuring instruments showed unusually high levels of coarse-grained particles (larger than 0.001 mm) in the dust plume. The relatively low carbon monoxide and carbon dioxide concentrations also seemed to rule out industrial sources and biomass combustion. Dust from the Sahara Desert had caused a stir in Germany several times in the past. But the Sahara was not a possibility either, because the weather conditions were wrong – the wind was coming from the east. So where was the dust coming from? The detective work began.

Satellite pictures solve the riddle
The directive on particulate matter has been in force in the European Union since 2005. As a consequence, there is a comprehensive air monitoring network in the member states designed to monitor compliance with the limit values. With quick, unbureaucratic assistance from a total of 15 state environmental agencies it was possible to analyse particulate data from 360 stations in five countries. It quickly became apparent that the particulate matter was coming from the east, because the concentration increased noticeably towards Slovakia. But where exactly had the cloud come from? What the situation was like on the other side of the EU’s eastern border was a matter of speculation, owing to a lack of air monitoring data.

So the researchers combed satellite pictures and came across what they were looking for on a EUMETSAT image. It showed a noticeable red patch over southern Ukraine on 23 March which expanded rapidly. The researchers estimated the total mass of the dust cloud to be at least 60,000 tons. That is equivalent to more than 600 wagonloads of sand. The actual mass was probably much greater still, since the measuring devices register only those particles that are smaller than 10 microns (0.01 mm). Czech geologists put the total dust load at about 3 million tons, because the Ukrainian plume also contained larger particles of up to 0.5 mm in size.

The last remaining doubts about the origin of the dust cloud were cleared up by a team led by Jindrich Hladil at the Institute of Geology of the Czech Academy of Sciences in Prague. They compared the dust samples from the air with dust samples taken directly from the Ukrainian soil. The lead isotope ratio showed that the dust had indeed come from the Black Sea region. "The mineralogical-petrological fingerprinting of the solid particles over 10 micrometre size can say a lot about the geologically specific source areas," says Jindrich Hladil. Moreover, the researchers in Prague, who were investigating the mystery completely independently of their counterparts in Leipzig, discovered in their samples pollen grains typical of the Ukraine. These included relatively large quantities of ragweed pollen, which is regarded as extremely allergenic. This also ruled out a dust storm from the Mediterranean coast of Libya, which had caught the scientists’ attention in the meantime.

The rapid transport of the air and the presence of a temperature inversion, which acted like a lid, ensured that the dust cloud was unable to escape upwards or sideways – this was shown by data from a LIDAR remote sensing system in Leipzig and from weather balloons. The dust was swept to Germany as if through a pipe at speeds of up to 70 kilometres per hour, and was even detected in Britain. "At the end of the day, it was the combination of dry, vulnerable soil, strong gusts of wind and fast transport within a dry, stable boundary layer that made this a freak dust event in Central Europe," explains Wolfram Birmili.
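The mass comparisons above are easy to reproduce. A minimal sketch (my own arithmetic, not the researchers'; the ~100-ton wagon payload is an assumption implied by the press release's "60,000 tons = more than 600 wagonloads"):

```python
# Sketch of the mass comparisons in the text (my arithmetic, not the
# researchers'). The ~100-ton payload per freight wagon is an assumption
# implied by "60,000 tons ... more than 600 wagonloads".
TONS_PER_WAGON = 100.0  # assumed payload of a single freight wagon

pm10_estimate_tons = 60_000.0      # minimum satellite-based estimate (particles < 10 microns)
total_estimate_tons = 3_000_000.0  # Czech estimate including particles up to 0.5 mm

print(f"PM10 fraction alone: {pm10_estimate_tons / TONS_PER_WAGON:,.0f} wagonloads")
print(f"Total dust load:     {total_estimate_tons / TONS_PER_WAGON:,.0f} wagonloads")
# Output: 600 vs. 30,000 wagonloads - the coarse particles dominate the total mass
```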

A foretaste of the consequences of climate change
Over two-thirds of the land area in the Ukraine consists of fields and meadows. The soil on 220,000 square kilometres is regarded as being under threat. Since the 1930s, wind erosion in what was then the Soviet Union has increased considerably as a result of the collectivisation of agriculture and the resulting large, open fields. In particular, this has affected the regions north of the Caucasus, the lower reaches of the Don river and eastern and southern Ukraine. It is possible that the process is also being accelerated by climate change; in particular, previously unaffected semi-arid regions are continuing to dry out. A normal dust storm can whirl up 70 tons of the light black soil per hectare per hour. "According to Russian studies, in the past 40 years there have been three to five such dust storms per year on average in the Ukrainian steppe," Birmili reports. "Our institute has been constantly monitoring airborne particles and their components at the Melpitz research station near Leipzig for around 15 years. We analysed all these measurements again retrospectively but were unable to detect any comparable dust cloud from the Black Sea area. This makes the dust cloud of 24 March 2007 unusual. But who can say that such weather conditions may not occur more frequently in future as a result of climate change?"

Small particles with a big effect
The research findings put a new complexion on the attempts being made by many local authorities to comply with the particulate limits using a wide range of measures. In the greater Berlin area, for instance, scientists estimate that half of the particulate volume comes from regional and remote sources rather than from local ones. Besides salt particles, soil particles form the largest mass of particles in the atmosphere. Scientists estimate that, in annual totals, between 1,000 and 2,000 million tons of dust circulate around the world in the lower layers of the atmosphere. This would be equivalent to the cargo of a goods train long enough to encircle the globe five times. This particulate matter comes primarily from arid areas and deserts, i.e. from the Sahara, the Arabian peninsula, the Gobi and Taklimakan Deserts in Asia, and the deserts in Australia and South America. One-fifth is believed to be caused by human activity, such as field cultivation. There are also consequences for the climate: dust particles in the air block incoming solar radiation as well as heat radiation escaping into space, and are therefore the largest unknown factor in climate models. Moreover, not much is known about the possible health impacts of such long-distance transport of dust aerosols.
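The goods-train image can be checked with rough numbers. A back-of-the-envelope sketch (my assumptions, not figures from the press release: ~100 tons of payload and ~15 metres of track per wagon, Earth's circumference ~40,000 km):

```python
# Back-of-the-envelope check of the "goods train encircling the globe five
# times" comparison. All wagon figures below are my assumptions, not
# numbers from the press release.
EARTH_CIRCUMFERENCE_KM = 40_000.0  # approximate
TONS_PER_WAGON = 100.0             # assumed payload of one freight wagon
WAGON_LENGTH_KM = 0.015            # assumed ~15 m of track per wagon

annual_dust_tons = 1_500_000_000.0  # midpoint of the 1,000-2,000 million ton estimate

wagons = annual_dust_tons / TONS_PER_WAGON
train_length_km = wagons * WAGON_LENGTH_KM
laps = train_length_km / EARTH_CIRCUMFERENCE_KM
print(f"{wagons:,.0f} wagons -> train of {train_length_km:,.0f} km -> {laps:.1f} laps")
# Output: 15,000,000 wagons -> train of 225,000 km -> 5.6 laps, consistent
# with the "five times around the globe" image.
```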
Tilo Arnhold

More information:
Dr. Wolfram Birmili
Leibniz Institute for Tropospheric Research
Phone +49 341-235-3437, -3210
http://www.tropos.de/physik/aerosol/physik_aero.html

or via

Helmholtz Centre for Environmental Research - UFZ
Press office
Tilo Arnhold / Doris Böhme
Phone +49 (0)341 235 2278
presse@ufz.de

Sunday, May 25, 2008

Adelgunde Video at Breakfast in Jungbusch

  Testing -- an interesting experiment.

Thursday, May 22, 2008

Sleep Deprivation Causes Brief Power Outages

Sleep deprivation causes brief power outages in the brain
ddp - Tuesday, 20 May, 15:06. Washington (ddp). Scientists have discovered what sleep deprivation does to the brain: the control centres for vision and attention are repeatedly hit by a kind of power outage, during which their activity is drastically scaled back. This comes chiefly at the expense of the ability to make sense of what is seen and to place it in context. But because the impairment is not permanent, and periods of normal activity keep recurring in between, those affected often do not notice at all that they do not have their full mental capacity at their disposal. They are lulled into a false sense of security, which could be very dangerous for truck drivers, for example, write Michael Chee of the Duke-National University of Singapore and his colleagues in the journal "Journal of Neuroscience" (advance online publication, DOI: 10.1523/JNEUROSCI.0733-08.2008).

Even in well-rested people, the researchers explain, attention occasionally drops off briefly. For such cases, however, the brain keeps a kind of emergency generator on standby: as soon as a lapse is registered, higher-order control regions in the frontal and temporal areas of the brain kick in and make up for the missing capacity. After a night without sleep, however, the efficiency of this compensation mechanism measurably declines, as the results from Chee and his team now show. The researchers had 17 subjects perform perception tests, once after a normal period of sleep and once after 24 hours of sleep deprivation, while monitoring their brain activity.

Besides the reduced attentional compensation, abnormalities were found above all in brain regions responsible for sensory input and perception. Periods of normal performance and activity did still occur there after sleep deprivation, the scientists report. But these were repeatedly interrupted by phases in which the intensity of perception, relay and processing, of visual stimuli in particular, was markedly reduced.

In the researchers' view, these two states reflect the internal battle that rages in the brain after sleep deprivation: consciousness issues the command to stay awake and alert, while other parts of the brain have already switched over to the urgently needed sleep. In the wakeful phases with normal brain activity, consciousness accordingly has the upper hand, whereas in the periods when attention and visual processing suddenly drop, the brain apparently slips unnoticed into a sleep-like state. Next, the team wants to investigate whether this drifting off can be prevented, for instance through stimulation, Chee explains, so that those affected not only subjectively feel fully alert but actually are.
