Archive for the ‘medically-oriented background info’ Category

DSM-comments and critiques: part 2

Sunday, August 19th, 2012

I'm still traveling, but back in the swing of writing posts on the numerous changes in DSM-5, the Diagnostic and Statistical Manual of Mental Disorders, which is supposed to be published (finally) in May of 2013. I've been reading background material, papers with advance critiques and older criteria for some of the diagnoses.

I still can't add photos as I'll be using my iPad until I get home four days from now, so I'll give you a URL or two and some titles you can Google.

I'll start by mentioning a fascinating long, multi-part paper that is appearing in 'Philosophy, Ethics and Humanities in Medicine.' It's called "The six most essential questions in psychiatric diagnosis: a pluralogue." The second author, Allen Frances, a Duke School of Medicine Emeritus Professor of Psychiatry, was the leader of the group that put together DSM-IV. I'll come back to these articles in a later post, but used the first one to find short commentaries on DSM-5.

The lead author, Dr. James Phillips, is an Associate Clinical Professor of Psychiatry at Yale. In November, 2011 he wrote an article for 'Psychiatric Times,' titled "The Great DSM-5 Personality Bazaar," and in March of 2012, in the same publication, a piece titled "DSM-5 In the Homestretch--1. Integrating the Coding Systems."

That first piece told of diagnoses, e.g., narcissistic personality disorder, that were excluded in drafts of the new manual and later re-included. The total number of Personality Disorders ended up at six, down from ten, at the time Dr. Phillips wrote his commentary.

One category that was removed was Paranoid Personality Disorder. It seemed as worthy of inclusion to Phillips as another that was kept: Personality Disorder NOS (not otherwise specified), which was apparently changed to Personality Disorder Trait Specified (PDTS), to me at least a potentially confusing acronym. There is PTSD, after all, a term most of us have some familiarity with.

Dr. Mark Zimmerman et al. published a study of 2,150 psychiatric outpatients (you can find it at www.ncbi.nlm.nih.gov/pubmed/21903031) which said that DSM-IV's method, using three trait categories (absent, subthreshold or present), was just as effective as the proposed diagnostic approach of DSM-5.

Dr. Phillips, in his later article, mentioned coding, how to relate the International Classification of Diseases (ICD) with the DSM, often for billing purposes. Our country has a treaty obligation to use the ICD and at present is using a "clinical modification" of the 1978 ICD-9 version. Some diagnoses used by US mental health (MH) therapists aren't in the ICD at all and the new American version of the international coding system, ICD-10CM, originally supposed to be available in October 2013, has been delayed. Of course the rest of the world already uses ICD-10.

Dr. Frances, in an April 25, 2012 piece in the New York Times, "Diagnosing the DSM-5: Shrink Revolt," was described as opposing the first draft of the new version as being too promiscuous with its diagnostic labels. He cited the proposed Binge Eating Disorder, which may be present in 6% of the total US population (using the proposed definition).

He then commented, "And that is before the drug companies start marketing something for it."

He had similar reservations about three other tentative new DSM labels: one could be applied to kids with "typical temper problems;" another to anyone who has lost a spouse and is grief-stricken for two weeks. The third, "Psychosis Risk Syndrome," in his opinion, could misidentify many youngsters and treat others with anti-psychotic meds without any evidence that such early treatment is helpful.

By the May, 2012 meeting of the American Psychiatric Association (APA), two of those four problematic categories had been discarded. Dr. Frances was still not content and published a NYT Op-Ed piece on May 11th titled "Diagnosing the D.S.M.," which said it's time to open the DSM's revision to all the MH disciplines as well as to the primary care doctors who prescribe most of the drugs used for MH disorders.

DSM-V comments & critiques: the Rosenhan studies rehashed

Friday, August 17th, 2012

It won't be published in its final official form until May of 2013, but the new version of the Diagnostic and Statistical Manual of Mental Disorders, AKA DSM-V, has already spawned lots of critiques and courses.

My wife, still doing some pro bono therapy, gets at least one offer a week to attend a seminar on DSM-V. I glance at those, but spend much more time on the background comments, including those from the lead editor of DSM-IV.

But let's start with the Rosenhan experiments. In 1973 an academic who was professor of psychology and law at Stanford had eight sane participants (himself among them) present to 12 hospital admission offices in five states with a chief complaint of "hearing voices." They said the messages conveyed were often unclear, but contained the words "empty," "hollow," and "thud."

In each case the voice was unfamiliar and of the same gender as the complainant (the group included one younger psychology graduate student, three professional psychologists, one psychiatrist, a pediatrician, a painter and a housewife). They gave false names, vocations and employment history, but all other details of their lives were true.

All were admitted to psychiatric wards, whereupon they acted completely sane and behaved as they normally would.

None of the staff recognized they were normal and 10 of them were given the diagnosis of schizophrenia. They remained in the hospital for a week to 52 days (average of 19 days), where a number of the other patients suspected they were sane (35 of 118 did so, with many vocalizing that the so-called patients were journalists or professors).

When the results of the study were initially made known, the staff of a well-known teaching/research hospital said they wouldn't make such mistakes.

At that point, Dr. Rosenhan set up a second experiment, telling the staff of the renowned center that he would send them one or more spurious patients over the next three months.

In reality he sent nobody, yet the hospital staff suspected a number of the 193 patients who were admitted during that time frame; the physicians, psychologists, nurses and techs alleged that 41 were fakes and, of those, 23 were suspected by one or more psychiatrists.

During their admissions all of the first group publicly took copious notes and the typical comment in the nursing notes was "Patient engaged in writing behavior."

They were only discharged, with a diagnosis of schizophrenia in remission, after admitting they were crazy, and all were given medications (which they did not swallow; they noted many patients did the same).

They seldom saw physicians except for fleeting encounters; in only 6% of these did staff doctors stop and chat or talk with them.

One comment about this famous study is, "It's hard to be sane in an insane setting."

Sorry, I can't give you links or photos, but I'm on an unplanned trip and using my iPad instead of my laptop.

Hospital-induced delirium: part one: the basics

Friday, July 13th, 2012

When they return from surgery, will some be delirious?

About two months ago I visited a friend in the hospital. He's a little over 80 years old, has several significant chronic medical problems and had recently undergone surgery. When I arrived in his room, he was in bed, didn't recognize me and then sat up and started rowing. Obviously he was delirious and hallucinating.

I've seen him at home since and he's back to baseline, but the topic of post-surgery delirium surfaced in the July 4, 2012 issue of JAMA, so I started reading on the subject.

I found an article in a 2004 issue of the American Journal of Psychiatry (AJP) that was a good start, but was clearly aimed at medical folk, especially those who would be prescribing medication for the most severely affected patients with delirium. The AJP piece said the first step is determining the cause...if possible. It mentioned that the word itself comes from the Latin word delirare, loosely translated as "to be out of one's furrow." My online dictionary defines delirium as an acute (as opposed to chronic) disturbed state of mind that occurs in fever, intoxication, and other disorders and is characterized by restlessness, illusions, and incoherence of thought and speech.

The most recent mental health Diagnostic and Statistical Manual of Mental Disorders, DSM-IV-TR, a much-used but somewhat controversial tome (I'll write about the DSM in a later post), says delirium is a syndrome (a collection of symptoms and physical signs) of many different causes and that its major features are confusion and loss of short-term memory. It mentions one classic sign, not seen in all cases by any means, is carphologia, a term I'd never heard before, but a behavior I've seen many times; it means picking at the bed sheets in a purposeless, repetitive fashion. The patient may be agitated, have delusions and hallucinations, and may try to remove their IV lines or climb out of bed.

On the other hand, some people have a lethargic, hypoactive form of the syndrome; those cases may be even tougher to diagnose.

A Mayo Clinic website mentions one hallmark of delirium is a sudden or relatively sudden onset with symptoms that tend to wax and wane. Input from family members as to the patient's pre-illness/surgery mental status may be very helpful in sorting out those who had pre-existing dementia from those who didn't, as the two conditions not infrequently co-exist.

too much alcohol can lead to delirium

It's not just surgical patients, of course; when I was in practice the term internists used was "ICU-itis," and medical patients, especially the elderly who were in Intensive Care for a prolonged period, were the ones we had to deal with most commonly. So a better term might be hospital-induced delirium. But some delirious patients have ingested substances causing the condition (PCP would be one example and alcohol withdrawal another), have heavy metal poisoning, medication-caused delirium, infections involving the central nervous system or metabolic disorders.

It's common, but much more so in older patients and a 2010 meta-analysis of forty-two high-quality studies concluded that delirium in this group is associated with poor outcomes, regardless of age, gender, preceding dementia, and other illnesses.

I'll come back to this frequent and often ominous issue in my next post. As our population ages, we'll likely see more of this condition. Planning in advance for hospital stays may help prevent some episodes of delirium.

Biological Warfare and Bioterrorism: part 1 History

Thursday, July 5th, 2012

This modern archer's arrow tips aren't as deadly

I wanted to write a post about the 1979 and 2001 anthrax attacks, but, as usual, decided that I should read about and write about the background concepts first. I found a paper titled "A History of Biological Warfare from 300 B.C.E. to the Present," written by an associate professor of respiratory care at Long Island University, and decided I'd start there.

Our earliest reference to biological warfare concerns the Scythians, a tribe of nomads who migrated from Central Asia to Southern Russia in the 8th and 7th century BCE, founding an empire that lasted several centuries. Their prowess in war and their horsemanship were admired, but their archers were also a formidable force, using arrows dipped in the decomposed bodies of venomous snakes as well as in a mixture of decayed human blood and feces. This concoction would have had both snake venom and several deadly forms of bacteria.

A later episode of bio-warfare came in the 14th century when the Mongol army besieged the Crimean city of Caffa. The Tatars, having suffered from an epidemic of plague themselves, hurled cadavers into Caffa. In doing so they not only got rid of their dead, but transmitted the disease to the inhabitants of Caffa. A memoir by the Genoese writer Gabriele de' Mussi which covers the war has been reviewed by Mark Wheelis, an emeritus professor of microbiology at UC Davis. He concludes that the Black Plague pandemic of that time period likely spread from the Crimea to Europe. It's still questionable whether the Genoese traders in the city were the ones who brought the plague back.

Hannibal is famous for his use of elephants in the war between Carthage and Rome. He also used poisonous snakes, concealed in earthen jars and hurled onto opposing ships during a naval battle in 184 BCE against the leader of Pergamon.

Smallpox was used as a weapon in the French and Indian Wars (1754-1763). The outbreak of the disease at Fort Pitt led to a plan conceived by Colonel Henry Bouquet and General Jeffery Amherst. The British gave their enemies blankets and a handkerchief, all contaminated with pus from the sores of infected British troops, at a peace conference. Many other Indian tribes caught smallpox eventually, but this episode was a deliberate attempt to use the virus to gain advantage in a wartime situation.

And this prison is safer than Unit 731

In World War II, the Japanese Army established the infamous Unit 731. The country had refused to sign the 1925 Geneva Convention ban on biological weapons and in 1932, after Japan invaded Manchuria, an army officer and physician named Shiro Ishii was put in charge of the "Anti-epidemic Water Supply and Purification Bureau," a euphemism for a prison camp for bio-warfare research.

He and his staff spent years experimenting with many of the world's most deadly diseases, using Chinese prisoners as guinea pigs for cholera, plague and dysentery. Over 10,000 of the victims died. Field trials included spraying bacterial cultures from airplanes over eleven Chinese cities.

Ishii was eventually replaced as head of the unit by Masaji Kitano, who later became Lieutenant Surgeon General; after the war the United States granted both immunity from war crimes prosecution in exchange for their bio-warfare knowledge. Ishii had faked his own death and gone into hiding, but was captured and interrogated. Neither man's information proved to be of much value in the long run. Ironically, Kitano later worked for the Japanese pharmaceutical company Green Cross, becoming the chief director of the company in 1959.

I've skipped many vignettes, but it's time to get to the modern era, returning to anthrax as a major subject to discuss.


Tuberculosis Part two: Treatment options, Old, New or None

Monday, June 25th, 2012

Robert Koch discovered the bacillus that causes TB

In the era before antibiotics, especially in the late 19th century, tuberculosis, often called the "white plague," was a leading cause of death in the United States. Eastern patients were urged by their physicians to travel to Colorado where the fresh mountain air was supposedly curative. Hundreds of those stricken with TB went to Denver, which was sometimes called "the world's sanatorium." Hospital beds were scarce and a Jewish woman, Frances Wisebart Jacobs, recognized as the founder of what is today's United Way, spearheaded a movement to fund a TB hospital which eventually became National Jewish Hospital and served patients of any creed with free care for the indigent.

Prior to the advent of antibiotics, TB continued to be treated in sanatoriums, where rest, fresh air (and sometimes surgical collapse of a lung or removal of a cavity) were all the therapy available. Extracts from one such patient's diary, begun in April of 1944 and continued through early June of 1946, are available online, having been published in the Journal of the Royal Society of Medicine in 2004.

Latent TB is usually treated for nine months with a single antibiotic, usually INH like I was given in the late 1970s. Active TB is quite a different matter and, for those with drug-susceptible infection, usually requires a four-drug combination given for six months. This may be given as directly observed therapy, with a healthcare worker meeting the patient at their home or workplace.

If the patient stops taking the antibiotics too early, if the particular TB bacteria involved are drug resistant (this occurs more frequently in people with HIV/AIDS and those who have had previous TB therapy), or if the medications used are of poor quality, multi-drug resistant TB can develop.

A 2002 paper titled "Tuberculosis therapy: past, present and future," while detailing the history of TB treatment, mentions that George Bernard Shaw, in his 1911 play, The Doctor's Dilemma, talked about the early attempts to cure the disease as a "huge commercial system of quackery and poisons." Elsewhere in the play, Shaw mentions "stimulating the phagocytes," referring to white blood cells that can engulf invading bacteria. The modern take on this 1911 quote would be a TB vaccine.

The only one I'm aware of is BCG, an attenuated live bacterial vaccine produced from the bovine strain of tuberculosis. It's been around since the 1930s and has been given to at least 5 million people globally, but its major benefit has been to prevent spread of TB to sites other than the lungs. At this stage we really need an effective immunizing agent, as our pharmacologic attempts to keep TB in check are becoming less and less effective.

We all need to support efforts to stamp it out.

Workers at the CDC published a 2007 paper titled "Worldwide Emergence of Extensively Drug-Resistant Tuberculosis" (XDR-TB), mentioning that nearly 90 countries have had multi-drug resistant TB (MDR-TB), requiring usage of so-called second-line drugs for at least two years. Now we've gone beyond that, as nearly 10% of the 3,520 samples from an international MDR-TB pool were also resistant to a majority of those drugs.

So we're up to 2012 and The Wall Street Journal's article, "India in Race to Contain Untreatable Tuberculosis." TB already kills more people than any other infectious disease except for HIV. But since the first cases of XDR-TB were reported in Italy in 2007, Iran and now India have also been struck by the strain.

There's an urgent need for an effective vaccine or a totally new approach to TB treatment.

Tuberculosis Part one: How long has it plagued us?

Saturday, June 23rd, 2012

I was reading an article in The Wall Street Journal this morning about "Untreatable tuberculosis in India" and decided to explore the background data before writing about what we're facing now.

I have a personal acquaintance with TB; when I returned to Air Force Active Duty status in 1977, I got a TB skin test. Much to my surprise it was positive.

I'm glad my chest x-ray didn't look like this

My chest x-ray was normal; I had none of the symptoms of active TB: chronic cough with blood-tinged sputum, night sweats, fever and weight loss. So I didn't have active disease and could be treated with only one drug; the infectious disease specialist told me I would take a medicine called isoniazid (INH) for a year.

I found out that about a third of the entire world population has been infected with the human variant of TB, Mycobacterium tuberculosis. In the US, 5-10% of the population will have a positive skin test; in other parts of the world, especially in some Asian and African countries, up to 80% will test positive.

Around the world new TB infections are estimated to occur at the rate of one per second, nine million cases a year, with 95% of those in developing countries. The vast majority of those infected remain asymptomatic. Of those who have a normal immune system, roughly 5-10% will ever develop active disease. But if you have HIV you have at least a 30% chance of moving on to symptomatic disease & x-ray-positive TB; other studies place the risk even higher, at 10% per year.

Now that milk is pasteurized, most of us in the US don't have to worry about the bovine strain of TB. But that isn't true everywhere, so beware of drinking unpasteurized milk when you travel abroad.

A detailed online history of TB from the New Jersey Medical School commented that 2-3 million people die of the infection every year; the vast majority of those live in developing countries. The ancient Greeks called the disease phthisis. It's been with us for millennia; ~4,500-year-old spinal column bits and pieces from mummies in Egypt were the earliest evidence of human infection that I had been familiar with, but I found an article that doubled that estimate. Bones from an ancient site off the coast of Israel, estimated to be 9,000 years old, not only had the characteristic signs of TB, but also had DNA and bacterial cell wall lipids that could be analyzed by modern techniques.

One of his ancestors had evidence of the earliest TB we're aware of

Researchers from England commented that the tuberculosis we see today came from a human strain of the bacteria, not from a bovine origin. They also said that the DNA was subtly different from that of TB organisms today and felt this meant there has been a very long linkage between this infection and people. But the very earliest animal to have clear-cut evidence of TB was a long-horned 17,000-year-old bison with skeletal remains showing the disease.

TB outbreaks still occur in the US. The June 20, 2012 edition of JAMA has a CDC report of cases which occurred in a homeless shelter in Illinois. The majority of the 28 patients involved (82%) had a history of excess alcohol use, and many had longer stays in the men's section of the shelter and socialized in two bars in the area.

The risk factors seen in developing countries, lower socio-economic status and overcrowding, seem to me to have played a role in this US series of patients. Alcohol over-usage has been implicated as a risk factor for TB, perhaps from repeated prolonged close contacts in bars and perhaps from effects on the immune system.

I'll get back to the current issues with TB in my next post.

Heart attacks Part 2: Prevention: risk factors & our kids

Wednesday, May 23rd, 2012

Here's a risk factor you can eliminate

This post pings off the April 17, 2012 article in The Wall Street Journal, "The Guide to Beating a Heart Attack." I initially wrote about surviving a heart attack (myocardial infarction {MI} is the medical term). Next I wanted to turn toward the prevention side.

I first found the Interheart study's article from 2004, "Nine modifiable risk factors predict 90% of acute MI." The study followed 29,000 people from 262 sites in 52 countries and concluded that the common belief that half of heart attacks can be predicted was clearly an underestimate.

The research group found the same impact of the nine variables everywhere in the world: abnormal blood lipids (fats, like cholesterol) and smoking were at the top of their list. Then came diabetes, high blood pressure, abdominal obesity, stress & depression, exercise, diet and alcohol intake.

I was used to measuring cholesterol and its HDL (so-called good cholesterol) and LDL (bad cholesterol) components. This study actually used a more sophisticated lipid approach.

They measured the ratios of the proteins that bind to and carry fats, apolipoproteins A and B. APOA is associated with HDL lipids while APOB is said to unlock the door to cells and in doing so acts as an unwelcome delivery van for cholesterol. When present in high levels, APOB can lead to plaque formation in blood vessels and an increased risk of coronary heart disease (CHD).

They also found some good news: as expected, eating fruits and vegetables daily, exercising and perhaps moderate alcohol intake were associated with lower risks of CHD. Again this was true everywhere in the world.

The WSJ article mentioned that hospital admissions for heart attacks had actually decreased among the elderly; these nine factors were better predictors in younger groups. What can be done to stop the looming specter of CHD among our younger population?

The CDC examined the parameters in a recent online article titled "A Growing Problem." One issue was "screen time." Our kids eight to eighteen average four and a half hours a day watching TV and three more on cell phones, movies, computers and video games. I even read an article about a two-year-old who, his parents think, learns a lot from their iPad. Maybe so, but how much exercise does that kid (and his older compatriots) get?

The CDC feels there is a dearth of quality physical activity in our schools; as of 2009 only a third of them provided daily PE for our kids. And after they leave school or when they're on vacation, many don't have safe access to biking, hiking, running, playing areas and trails.

Somerville chose healthier food in their schools

One Massachusetts community, Somerville, has gotten attention for its anti-obesity integrated program, "Shape Up Somerville" (you can watch the thirteen-minute PBS special on their community-wide progress). The Robert Wood Johnson Foundation is attempting to help similar programs get started across the country, especially focusing on childhood obesity.

Recently I heard an NPR comment that caught my attention. If we don't do something to stop the epidemic of childhood obesity, we'll soon be seeing CHD rates soar in people in their 20s and 30s and maybe even younger.

A French researcher said, "Mankind is doing a good job of killing himself."

We need to try new approaches to help our kids. The Somerville plan sounds like a good place to start.

Surviving, or better still, preventing heart attacks: Part 1: After it happens

Friday, May 18th, 2012

Heart attacks frequently cause sudden cardiac arrest

The April 17, 2012 edition of The Wall Street Journal had an article titled "The Guide to Beating a Heart Attack." It had both good news and bad: since the 1970s the annual number of American deaths from heart attacks (the "med-speak" term is myocardial infarction or MI) has diminished by three fourths; on the other hand, nearly a million of us will have an MI this year and many of those will die. The National Vital Statistics Reports estimate for 2010 was 595,000 deaths from heart disease (of all kinds) and the Seattle-King County 2012 estimate is 480,000 adults dying from an MI or its complications.

A quarter million die from sudden cardiac arrest (SCA) and the majority of those happen in a non-hospital location. Only 7.6% of people who have an SCA outside a hospital survive to be discharged to home. This figure varies markedly according to where you live. If you happen to reside in Rochester, NY, your odds are much better. Bystander-witnessed cardiac arrest victims there who have the typical heart rhythm disorder that leads to sudden cardiac arrest (it's usually due to a chaotic quivering called ventricular fibrillation {VF}) have a 50% chance of survival to discharge from the hospital.

My mother, as I've mentioned before, was one of the fortunate ones. She didn't live in Rochester or in the Seattle area, which also has a superb track record. But she had a bystander-witnessed event, got prompt CPR and a rapid response from a trained Advanced Cardiac Life Support (ACLS) team, and lived another 16 years.

The Seattle-King County concept is termed "Community Responder CPR-AED." They knew that most people who die from SCA have VF and the only "cure" was to use a defibrillator. Most non-medical people wouldn't be able to operate the complex gadgets used in hospitals. The answer was the AED, an automated external defibrillator developed nearly twenty years ago.

The American Heart Association's Science Advisory commentary on AED use by non-medical people has a four-point program for out-of-hospital SCA: early recognition followed by a 911 call; early bystander-performed CPR; early AED use; and then early ACLS.

look for this sign

They included several extra points I hadn't thought about, having always performed CPR-defibrillation & ACLS in hospital settings. Early CPR increases the possibility that defibrillation will stop VF and the heart will then resume its normal rhythm; it does so while providing blood flow to the brain as well as to the heart. And all the AED does is stop the VF abnormal heart rhythm, enabling the heart to restart normal beating; but the heart rate may be slow to begin with, so CPR may be necessary for several more minutes.

Early CPR also increases overall survival rates; if it's not being provided, every minute between the patient's collapse and defibrillation lowers that rate by 4-6%.

Given all that, one of the first things the state of Washington did was to pass a law granting immunity from civil liability for any person (or entity) who acquires a defibrillator. Then they started wide-spread CPR and AED training (learning to use an AED is easier than learning CPR) and markedly increased their paramedic numbers.

The life-saving results have been very impressive. My question now is whether to buy an AED for our home.

The 1918 flu virus and its descendants: Part 2 Rediscovering the culprit

Sunday, May 13th, 2012

many other major pandemics were associated with rodents, but not the 1918 flu

I re-read my last post a day after writing it and amended the first line, since I found it misleading. It was the worst flu pandemic ever, but I knew that smallpox, the Black Plague, AIDS, malaria and perhaps even typhus each have caused nearly as many or even more deaths over a period of years. I eventually found a rather strange, non-medical website with the "7 Worst Killer Plagues in history," and confirmed my belief that no other bacterium or virus had wreaked as much havoc in as brief a span of time as the 1918-1919 H1N1 influenza virus.

I wanted to find out what happened to that highly pathogenic organism and, after searching the web, realized the PBS article on the "Spanish flu" was a good place to start. It mentions that the influenza virus was not identified until 1933 and that the actual genetic identity of the particular strain involved in that pandemic (as opposed to the basic type...H1N1) was not identified for many years. The influenza virus responsible for the 1918-1919 pandemic has had many descendants, none as deadly as their ancestor.

In 1950, Johan V. Hultin, a graduate student starting his doctoral studies in microbiology, got a clue from a visiting professor who suggested hunting for the virus in bodies buried 32 years prior in the permafrost of the Arctic. Hultin and his faculty advisor traveled to Alaska, where flu among the Inuits had been especially deadly, with 50 to 100% death rates in five villages.

early days in the Far North

Gold miners, under contract with the Territorial government, had served as grave diggers in 1918-1919 and tissue samples were recovered from four bodies exhumed in 1951. Pathology slides fit with viral lung damage and, in some cases, secondary bacterial pneumonia. But tissue cultures from the samples did not cause disease in ferrets and no influenza virus was recovered.

It wasn't until 1995 that science had advanced enough for researchers to start the work necessary to identify the virus's unique features. Jeffrey Taubenberger, a molecular pathologist then working at the Armed Forces Institute of Pathology (AFIP), began a ten-plus-year-long project starting with autopsy tissues from the time of the pandemic that had been preserved in the National Tissue Repository. His project was stimulated by a paper published in the journal Science in February, 1995, in which preserved tissue samples from the famous British scientist John Dalton (often called the father of modern atomic theory) were examined. Dalton was color-blind and had donated his eyes at his death in 1844 to determine the cause of the defect; his DNA was studied 150 years later and the resultant publication gave Taubenberger the impetus to do the same with the flu virus.

Hultin read the first paper from Taubenberger's group, wrote to him and eventually went back to Alaska to exhume more flu victims. One was an obese woman whose lungs had the findings of acute viral infection. Samples of this permafrost-preserved tissue had RNA incredibly similar to that obtained from the AFIP National Tissue Repository.

And so began an amazing chapter in the history of virology.

The 1918 flu and its descendants: part 1

Friday, May 11th, 2012

In some years this sign should be in red

The worst flu pandemic of all time began near the end of World War I, in the fall of 1918. It killed, in the next year, somewhere between 20 and 50 million people across the globe. The comparison to WW I deaths, eight and a half million from all countries involved, is striking.

There had been major influenza pandemics before and since, some severe and some relatively mild. The term itself conventionally refers to a worldwide outbreak of an infectious disease with some adults in every continent (except Antarctica) involved, but doesn't imply how lethal the illness is. For example, the H1N1 "swine flu" pandemic in 2009-2010 involved 74 countries, but the death rate was relatively low.

Stanford University has a superb description of the so-called Spanish flu online. Usually flu kills the very young and the very old more than young adults; this time was different, with far more deaths between the ages of 20 and 40 (some say 20-50 and others 15 to 34) than in the typical flu season. The influenza-related death rate, normally about 0.1%, has been estimated at 2.5 to 3% and may have been even higher. A fifth to a third of everyone alive at the time caught the virus, so there may have been half a billion victims.

For Americans, including soldiers, the end of the war was near, but over 40,000 servicemen and nearly two-thirds of a million back home would die of this modern plague.

The precise origin of the disease is unclear; swine were affected in a nearly simultaneous fashion, but have not been blamed for the human ailment. The war itself and its resultant transportation of large numbers of troops, could have facilitated its spread globally. A first wave of the infection struck American army encampments in the United States, but was comparatively mild, at least when contrasted to the second and third outbreaks later in 1918 and then in 1919.

He was at risk as well

Public health measures were widely instituted, but the actual effectiveness of quarantine, gauze face masks, limited school closures and banning of public events is unknown.

In the midst of what for many was a typical flu infection, some developed a highly virulent form of the disease, with a strikingly abrupt onset, fever, exhaustion and rapid progression to pulmonary complications and death.

Many cases developed secondary bacterial infections and one species of bacteria was initially blamed for the disease. Then two French scientists reported a filter-passing virus in the British Medical Journal in November 1918. They used filtration to remove bacteria from the sputum coughed up by a flu patient and then injected the remaining fluid into the eyes and noses of two monkeys. After their primate subjects were noted to have fevers, a human volunteer was given a subcutaneous injection of the same filtrate. He was the only person in their laboratory to develop the flu.

The extraordinary mortality rate of the 1918 influenza is shown on a graph plotting deaths in America from a variety of common infectious diseases over the years from 1900 to 1970. Another way to gauge the impact of the pandemic is to note that average life expectancy in the United States fell by ten years for that period.

And yet the incidence of influenza ebbed and since 1920 we've returned to the normal cycle of seasonal flu, intermittent epidemics and occasional pandemics, none as severe and deadly as the Great Flu of 1918-1919.