The MH Romp part 2: Why so many codes?

July 27th, 2012

medical bills are complex

Today I got a mini-brochure in the mail from the American Medical Association, AKA the AMA, offering the CPT 2013 Professional Edition and the ICD-9-CM 2013 Professional Edition with a special offer: more than 40% off if I order both by September 1, 2012. That's $114.66 versus the package list price of $202.90. WOW!
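For the arithmetic-minded, here's a quick back-of-the-envelope check (a minimal Python sketch using only the two prices quoted above) that the "more than 40% off" claim holds up:

```python
# Quick check of the AMA's "more than 40% off" claim, using only the prices above.
list_price = 202.90   # package list price, dollars
offer_price = 114.66  # special-offer price, dollars

discount = (list_price - offer_price) / list_price
print(f"Discount: {discount:.1%}")  # prints about 43.5%, so "more than 40% off" checks out
```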

Okay, what are those things and why would I (or rather a physician still seeing patients) want them? CPT is the acronym for Current Procedural Terminology, a series of numeric codes developed and periodically updated by the AMA that are used for billing patients or insurance companies for each and every service that a doc may provide to a non-Medicare patient. If the person seen by a physician is over 65 (or is otherwise eligible for Medicare), then a set of codes called HCPCS are used for billing purposes. The acronym stands for Healthcare Common Procedure Coding System. Level I HCPCS codes are identical to CPT codes. Level II codes cover things that don't usually get billed by a physician such as ambulance services or durable medical equipment. Medicare and Medicaid handle those differently than insurance companies do.

But I want to focus on Mental Health issues and eventually get into two other sets of handbooks, both with their own codes. So first some history.

The initial development of MH illness groupings in the US dates back to the 1840 census when a category for "idiocy/insanity" was added. Forty years later, census data included seven MH entities, some of which would be classified as medical diseases now (e.g., epilepsy).

There's a nice history of the DSM itself on an American Psychiatric Association (APA) website. It's been an amazing saga leading to multiple disputes and perhaps even some acrimony. Basically WW II heavily influenced the development of the DSM and, for that matter, the ICD. It was the first time large numbers of US psychiatrists, headed by Brigadier General William Menninger, got directly engaged in all aspects of military personnel selection, processing, assessment and treatment.

Medical 203, a brand-new classification of MH disorders, came out in 1943 as a War Department Technical Bulletin. All the Armed Forces and, with slight alterations, the Veterans Administration adopted this nomenclature. When psychiatrists returned to private practice after the War, they brought Medical 203 to their hospitals and clinics.

The World Health Organization added MH diagnoses to its International Classification of Diseases (ICD) with the 6th revision in 1949. Prior to that the ICD had primarily been used to classify the causes of death in various populations, an effort dating back at least to the mid-1850s.

The foreword to DSM-I mentions that the ICD-6 "categorized mental disorders in rubrics {names, titles, groups} similar to those of the Armed Services nomenclature." But the APA formed a committee to standardize a version to be used in the United States. A tenth of the total membership of the APA eventually had a chance to voice their opinions, and over 90% liked the adaptation, eventually titled the Diagnostic and Statistical Manual of Mental Disorders, DSM for short.

In 1998, L.J. (Lawrence) Davis, a long-term writer and contributing editor for Harper's Magazine, published an article titled "The Encyclopedia of Insanity: A Psychiatric Handbook lists a madness for everyone." He was referring, of course, to the DSM, in this case DSM-IV. It's a harsh critique, saying the DSM was dogmatic, with a theme that we're all either crazy or about to be, but treatable, by guess who...the selfsame authors of the 866-page-long book.

For those of us with medical insurance, we can let the third-party payer sort things out

So there have been numerous versions of the ICD, and we're about to see DSM-V in May of 2013. Both have billing codes, which are identical in large part. Most psychologists use the DSM, but third-party payers use the American version of ICD-9-CM (the CM stands for Clinical Modification).

But there's still controversy and disagreement and the HHS Secretary announced that as of October 1, 2013, the ICD-10-CM codes must be used. The rest of the world has used these since 1990.

Yet DSM-V is supposedly coming out in May, 2013 and ICD-11 is due in 2014.

 

 

 

Romping through the Mental Health field: part one: various clinicians and MH medications

July 24th, 2012

talk therapy often helps

Although my training was in Internal Medicine (IM) with a subspecialty in Nephrology (nephros is Greek for kidney), I've had a long-standing interest in Mental Health (MH) issues. My medical school (University of Wisconsin circa 1962-1966) included four years of MH classes. Then I supervised, in various Air Force roles, mental health departments and later commanded a hospital with 90 MH beds (versus 15 each of IM, Peds, Ob-Gyn & Surgery). But there's an even more important reason for me to care about the field...my wife is a clinical social worker, i.e., one who sees patients (they now call them "clients," but I don't) with MH problems.

Since I'm married to a MH therapist, I'm familiar with the Diagnostic and Statistical Manual of Mental Disorders, DSM for short. It's a publication of the APA, the American Psychiatric Association, and is the "lingua franca," the standard terminology, for all working in the MH field. That includes psychiatrists, who are, of course, MDs and therefore can prescribe medications (I often get the impression that's all many of them do these days); psychologists (PhDs who, in general, can't, although two states, New Mexico and Louisiana, have enabled some who obtain a master's degree in clinical psychopharmacology to write RXs for MH meds only); clinical social workers (who have a Masters degree or, occasionally, a PhD, but can't prescribe pills); and a variety of other therapists (e.g., marriage and family therapists), most of whom have a Masters degree and can't prescribe meds.

Of course those of us who have MD degrees and aren't MH specialists can also order MH meds; internists and family practice physicians do so fairly often. I saw our FP today for my sore hand and raised the subject; my impression was she'd be uncomfortable writing kids' MH prescriptions, but is quite at home prescribing for adults. I'd think that most surgeons and Ob-Gyn physicians would be less likely to write RXs for many of these drugs without consulting another physician who's used to the meds and their side effects. The standard anti-anxiety drugs would be an exception, of course, and perhaps they might treat a patient with relatively mild depression.

Whenever I saw a patient with significant depression or other major MH issues I put in a call to the psychiatrists.

So who is available to treat this youngster?
Most likely his pediatrician.

I didn't know much about pediatric mental health issues. I found an online paper titled "Strategies to Support the Integration of Mental Health into Pediatric Primary Care" from an organization I at first thought was part of the NIH. Then I Googled it and realized the National Institute for Health Care Management wasn't governmental, but a non-profit. The Executive Summary of "Strategies" stated that up to one in five children and adolescents in the U.S. experience MH issues, with 50% of all lifetime mental disorders being seen by age 14.

The issue for Pediatrics is that there aren't very many psychiatrists who specialize in kids, so three ideas have been suggested: telemedicine for remote pediatric practices; co-locating a pediatric psychiatrist in a large Peds practice; or collaborative care via what's called the "medical home model." But for Pediatrics in general, the integration of MH care into primary care is desperately needed; again, there just aren't enough Peds therapists to see all the youngsters with MH issues.

I'll get back to the DSM in my next post; its history is interesting and its latest version, to be published in May 2013, has caused a lot of controversy.

It's finished now

July 15th, 2012

If you read my posts regularly then, by my error, you got a partially written one in the past few days. It was my second post on delirium and I finished it this afternoon.

So if you're interested, here's a live link to my website.

Delirium part two: prevention strategies and risk factors

July 14th, 2012

A familiar photo may help

I went back to the JAMA article I briefly mentioned in my last post; one of its most striking comments is that postoperative delirium is significantly under-diagnosed. The physicians and other staff taking care of surgical patients may miss connecting the dots correctly 80% of the time!

With that in mind, my wife and I decided each of us should, as a baseline, have a list of all the medicines the other is taking. If one of us needs a major operative procedure, the other can help prevent an episode of delirium in several ways. She handed me a list of her meds the next morning and said our dentist had wanted it the last time she was seen, so she had it put into her computer's word processing program. I finally got mine typed up this morning.

There are a number of other things we could do for each other and some our local hospital could start doing. Bringing in a familiar picture/photo from home, our bedside clock and perhaps a few other objects we see every day would help "de-sterilize" the hospital environment. Of course, if one of us were in an ICU we'd have to check what the rules are, but I'd push the envelope here.

Some hospitals have established senior units in their emergency departments, ICUs and wards. I can remember seeing a calendar with the day of the week, month and year during one of my postoperative stays in the past 13 years here (I've had a total knee replacement, two lower back surgeries and two cataracts removed). The JAMA article advocated a daily delirium check, with the diagnosis, evaluation and treatment detailed in the medical record.

An intensive care setting can be disorienting with its machines, constant lighting and noise

Dimming the area's lights to maintain a "normal" day and night pattern can be helpful, as can co-management of older patients by geriatrics specialists, intensivists or hospitalists, although, in my case, I'd like to see my primary care physician dropping by even if she had nothing to do with managing my care.

A prospective controlled trial of nighttime ear plug use by adult ICU patients, published online in Critical Care in May 2012, showed promising results in a relatively small study. The most significant difference was in the percentage of patients who had an episode of mild confusion. The authors felt there was a correlation with improved sleep patterns.

Among the associated major risk factors, "advanced age" (i.e., >70-75 years), preexisting dementia and functional disability are crucial, but among surgical patients the procedure itself and the anesthesia used also play highly important roles (e.g., major heart and vascular procedures are far more likely to be associated with an episode of delirium than cataract surgery).

We've visited a number of older friends in medical settings in the last six months. In each case a spouse was omnipresent and we think we need to do the same for each other. We'll also be advocates for dimmer lights in the evening and maybe even try ear plugs.

 

Hospital-induced delirium: part one: the basics

July 13th, 2012

When they return from surgery, will some be delirious?

About two months ago I visited a friend in the hospital. He's a little over 80 years old, has several significant chronic medical problems and had recently undergone surgery. When I arrived in his room, he was in bed, didn't recognize me and then sat up and started rowing. Obviously he was delirious and hallucinating.

I've seen him at home since and he's back to baseline, but the topic of post-surgery delirium surfaced in the July 4, 2012 issue of JAMA, so I started reading on the subject.

I found an article in a 2004 issue of the American Journal of Psychiatry (AJP) that was a good start, but was clearly aimed at medical folk, especially those who would be prescribing medication for the most severely affected patients with delirium. The AJP piece said the first step is determining the cause...if possible. It mentioned that the word itself comes from the Latin word delirare, loosely translated as "to be out of one's furrow." My online dictionary defines delirium as an acute (as opposed to chronic) disturbed state of mind that occurs in fever, intoxication, and other disorders and is characterized by restlessness, illusions, and incoherence of thought and speech.

The most recent mental health Diagnostic and Statistical Manual of Mental Disorders, DSM-IV-TR, a much-used but somewhat controversial tome (I'll write about the DSM in a later post), says delirium is a syndrome (a collection of symptoms and physical signs) of many different causes and that its major features are confusion and loss of short-term memory. It mentions that one classic sign, not seen in all cases by any means, is carphologia, a term I'd never heard before, but a behavior I've seen many times; it means picking at the bed sheets in a purposeless, repetitive fashion. The patient may be agitated, have delusions and hallucinations, and may try to remove their IV lines or climb out of bed.

On the other hand, some people have a lethargic, hypoactive form of the syndrome; those may be even tougher to diagnose.

A Mayo Clinic website mentions one hallmark of delirium is a sudden or relatively sudden onset with symptoms that tend to wax and wane. Input from family members as to the patient's pre-illness/surgery mental status may be very helpful in sorting out those who had pre-existing dementia from those who didn't, as the two conditions not infrequently co-exist.

too much alcohol can lead to delirium

It's not just surgical patients, of course; when I was in practice the term internists used was "ICU-itis," and medical patients, especially the elderly who were in Intensive Care for a prolonged period, were the ones we had to deal with most commonly. So a better term might be hospital-induced delirium. But some delirious patients have ingested substances causing the condition (PCP would be one example and alcohol withdrawal another), have heavy metal poisoning, medication-caused delirium, infections involving the central nervous system or metabolic disorders.

It's common, but much more so in older patients and a 2010 meta-analysis of forty-two high-quality studies concluded that delirium in this group is associated with poor outcomes, regardless of age, gender, preceding dementia, and other illnesses.

I'll come back to this frequent and often ominous issue in my next post. As our population ages, we'll likely see more of this condition. Planning in advance for hospital stays may help prevent some episodes of delirium.

 

 

Biological Warfare and Bioterrorism: anthrax accident and attack

July 10th, 2012

This is a safe way to see anthrax bacteria

In 1969 President Nixon, prodded repeatedly by the Harvard biologist Matthew Meselson, decided to end all United States offensive biological warfare (BW) research. The story is told in detail in Volume III of the National Security Archive; the concept eventually led to the 1972 Biological and Toxin Weapons Convention, signed by more than one hundred nations.

Our BW research scientists were very unhappy with the decision and thought some countries, especially the Soviet Union, would not adhere to the treaty at all; they were eventually proven to be right. Our CIA apparently kept a "small" amount of various infectious agents and toxins including 100 grams of anthrax.

In April 1979 the most lethal anthrax epidemic to date began to unfold near the Denver-sized city of Sverdlovsk in Russia. The winds were blowing in a fortunate direction, so only 68 people died; otherwise the casualties would have likely numbered in the hundreds of thousands. Russia said nothing about the incident.

In late 1979 and early 1980 a Russian-language newspaper published in West Germany reported an explosion at a military installation near Sverdlovsk and estimated a thousand resultant deaths from anthrax. Our April, 1979, satellite images were reviewed and roadblocks and decontamination trucks were noted in that area in the spring of 1979.

The Russians, in a March 1980 article, denied this was anything other than a naturally occurring outbreak. Later they claimed any deaths resulted from consumption of meat from anthrax-contaminated cattle and that veterinarians had reported deaths in animals before any human casualties. Dr Meselson arranged a 1988 meeting with Soviet scientists and was convinced that the tainted-meat explanation was plausible.

Finally in 1991, after the Soviet Union broke up and Boris Yeltsin headed Russia, he told President Bush that the KGB and Soviet military had lied. Apparently a technician at the BW plant had neglected to place a filter correctly; the result was a plume of anthrax spores. The Russians shut down the Sverdlovsk plant, only to build a larger one in Kazakhstan.

October, 2001, brought an anthrax attack to the United States. A sixty-three-year-old man living in Florida was taken to his local emergency department, confused, vomiting and having an elevated temperature. An alert infectious disease consultant considered the possibility of anthrax after a spinal tap revealed both white cells and the typical bacilli. A prompt response and rapid escalation of information ensued and the bacterium was found in the patient's workplace. Subsequently a history of the victim having examined a powder-containing letter was obtained from his co-workers. Other letters arrived at political and media targets in four states and Washington, DC.

Fast forward to 2010. The FBI finished its eight-year investigation, saying a US biologist who committed suicide in 2008 had been solely responsible for the 2001 bioterrorism attack. The man implicated had worked at the Army laboratory at Fort Detrick, and the Justice Department concluded he alone mailed the anthrax letters which killed five people, sickened seventeen others and led to a multi-billion-dollar, terror-laced response.

An October, 2011, article in USA Today said each FBI field office now has personnel dedicated to investigate biological attacks, while the bureau itself has an entire major section for counterterrorism: biological, chemical or radiological.

Most mail is absolutely safe

The same infectious disease expert involved in the uncovering of the 2001 letter attacks wrote a followup article published in January 2012 in the Annals of Internal Medicine. He mentioned that the initial victim, a photo editor for a supermarket tabloid newspaper, was the 19th known case of inhalational anthrax in the US since 1900 and the first since 1976. Each of the other patients' exposures was directly associated with work in mills or labs.

One other piece of irony was a Canadian study of anthrax published in September 2001 showing even unopened envelopes containing anthrax spores were a threat. The results had been emailed to our CDC on October 4, 2001, but the message was never opened.

 

Biological warfare and Bioterrorism part two: anthrax

July 6th, 2012

Don't ever open one of these sacks, unless it's your job and you're in full protective gear

I'm reading a book called Germs: Biological Weapons and America's Secret War. It was written in 2001 by three Pulitzer-prize-winning senior newspaper reporters and starts with an event most of us never heard about: immediately after the horrific 9-11 attack, a trained New York National Guard team was sent to NYC to determine if there had also been an accompanying germ warfare attack.

There's a difference between biological warfare and bioterrorism; in one sense it's a matter of scale, in another it's a matter of purpose. In biological warfare the intent is to kill or incapacitate an enemy force. Actually, the latter approach (incapacitation) is likely to be more effective, as it ties up large numbers of support personnel moving the sick and taking care of them in medical facilities.

Bioterrorism attacks may injure or kill a much smaller absolute number of victims, but, as the term suggests, can spread terror through a huge population base.

So why concentrate on anthrax?

Sheep can get anthrax

Anthrax has a long history in North America, likely arriving thousands of years ago via the Bering Land Bridge. It's been a rare cause of death in the US: most cases here involved mill employees working with wool, farmhands, those who work in tanneries and, potentially, veterinarians. Anthrax was known as a disease of hoofed animals and people caught it from infected beasts. The usual form was cutaneous with a sore like a bug bite that could eventually turn into a black, usually painless skin ulcer. If unrecognized and untreated, the bacteria could spread to the blood (sepsis) with a 20% chance of death; less than 1% died if treated.

Most sources say anthrax spread from person to person never occurs; a few mention rare transmission of the cutaneous form.

But there are two other forms of the disease: the gastrointestinal kind occurs when a person consumes meat from an infected animal. It's been quite rare in the US, with one case in 1942 and a second in 2010, but is also quite deadly, with a death rate estimated variously at 25 to 60% worldwide; the effects of post-exposure treatment are unclear.

And then there is inhalational anthrax, caused when someone breathes in anthrax spores, the dormant phase that can live in soil for many years. When this form occurs, the death rate, which used to be over 90%, even with early recognition and the best possible care, is now estimated at 45%.

The last case of inhalational anthrax occurring naturally in the US was in 1976.

So why did our military gear up to immunize 2.4 million soldiers and reservists in December 1997? After all, President Nixon, in November, 1969, had announced that our country would totally abandon the use of lethal biological weapons and confine its research in the area to defensive measures. In 1972 the US, the Soviet Union and over a hundred other countries signed the Biological and Toxin Weapons Convention, banning the use of BW.

But many of our own scientists thought this was a mistake. They were proven correct when the anthrax epidemic at Sverdlovsk occurred only seven years later.

The military anthrax vaccination program has a fairly simplistic website, designed to walk young troops through carefully selected and presented facts about the anthrax vaccine. The vaccine has been available since the 1940s and 1950s and was tested in mill workers in the late 1950s. The modern version was licensed in the United States in 1970, and in January, 2002, the FDA allowed the company making it to begin routine distribution from a newer manufacturing plant. The same company is working on a new recombinant version.

An October, 2011, Washington Post article discusses the thorny issue of testing the effectiveness of the immunization in children.

Even the safest vaccines have some side effects; the vaccine may not protect against inhalational anthrax caused by altered strains of the bacterium, and there's no generalized threat at present.

But, what if?

 

Biological Warfare and Bioterrorism: part 1 History

July 5th, 2012

This modern archer's arrow tips aren't as deadly

I wanted to write a post about the 1979 and 2001 anthrax attacks, but, as usual, decided that I should read about and write about the background concepts first. I found a paper titled "A History of Biological Warfare from 300 B.C.E. to the Present," written by an associate professor of respiratory care at Long Island University, and decided I'd start there.

Our earliest reference to biological warfare concerns the Scythians, a tribe of nomads who migrated from Central Asia to Southern Russia in the 8th and 7th centuries BCE, founding an empire that lasted several centuries. Their prowess in war and their horsemanship were admired, but their archers were also a formidable force, using arrows dipped in the decomposed bodies of venomous snakes as well as in a mixture of decayed human blood and feces. This concoction would have had both snake venom and several deadly forms of bacteria.

A later episode of bio-warfare came in the 14th century when the Mongol army besieged the Crimean city of Caffa. The Tatars, having suffered an epidemic of plague themselves, hurled cadavers into Caffa. In doing so they not only got rid of their dead, but transmitted the disease to the inhabitants of Caffa. A memoir by the Genoese writer Gabriele de' Mussi which covers the war has been reviewed by Mark Wheelis, an emeritus professor of microbiology at UC Davis. He concludes that the Black Plague pandemic of that time period likely spread from the Crimea to Europe. It's still questionable whether the Genoese traders in the city were the ones who brought the plague back.

Hannibal is famous for his use of elephants in the war between Carthage and Rome. He also used venomous snakes, concealed in earthen jars and hurled onto opposing ships during a naval battle in 184 BCE against the leader of Pergamon.

Smallpox was used as a weapon in the French and Indian Wars (1754-1767). The outbreak of the disease at Fort Pitt led to a plan conceived by Colonel Henry Bouquet and General Jeffery Amherst. The British gave their enemies blankets and a handkerchief, all contaminated with pus from the sores of infected British troops, at a peace conference. Many other Indian tribes caught smallpox eventually, but this episode was a deliberate attempt to use the virus to gain advantage in a wartime situation.

And this prison is safer than Unit 731

In World War II, the Japanese Army established the infamous Unit 731. The country had refused to sign the 1925 Geneva Convention ban on biological weapons, and in 1932, after Japan invaded Manchuria, an army officer and physician named Shiro Ishii was put in charge of the "Anti-epidemic Water Supply and Purification Bureau," a euphemism for a prison camp for bio-warfare research.

He and his staff spent years experimenting with many of the world's most deadly diseases, using Chinese prisoners as guinea pigs for cholera, plague and dysentery. Over 10,000 of the victims died. Field trials included spraying bacterial cultures from airplanes over eleven Chinese cities.

Ishii was eventually replaced as head of the unit by Kitano Masaji, who later became Lieutenant Surgeon General; after the war the United States granted both immunity from war crimes prosecution in exchange for their bio-warfare knowledge. Ishii had faked his own death and gone into hiding, but was captured and interrogated. Neither man's information proved to be of much value in the long run. Ironically, Kitano later worked for the Japanese pharmaceutical company Green Cross, becoming the chief director of the company in 1959.

I've skipped many vignettes, but it's time to get to the modern era, returning to anthrax as a major subject to discuss.

 


 

 

 

More on salt, actually salts

June 30th, 2012

What should we make with our CSA-supplied spinach today?

We're in the prime of our CSA delivery season; fresh vegetables started three weeks ago and fruits this week. Many of our meals consist of spinach, lettuce, beets, rhubarb, apricots & cherries, with milk and/or cheese or yogurt. We rarely, if ever, shop for any "prepared foods," always check labels for sodium content, and only eat out, other than our weekly Thai food splurge, for birthdays and other occasions. I'm firmly convinced that less sodium (which we usually think of as "table salt," but which most typically comes from processed foods and restaurant dishes) is better for us.

So when I started reading an article in the ACP Journal Club section of the January 2012 issue of the Annals of Internal Medicine that was titled "Review: Interventions to reduce dietary salt do not reduce mortality or morbidity," I was skeptical. The original article, published in England, was a meta-analysis, a statistical look at a group of research studies. In this case seven randomized controlled trials were lumped together, and, according to the Journal Club, the conclusion was as in the article's title.

But the US and Canadian reviewers disagreed. In people with normal, borderline, or elevated blood pressure, 6 of the 7 studies showed variable results and the pooled data did not reveal statistically significant decreases in death rates or medical complications. I went to the original article and the authors actually say, "Despite collating more event data than previous systemic reviews...there is still insufficient power to exclude clinically important effects of reduced dietary salt..."

That translates, to me, as "we don't know yet what happens when millions of people lower their salt intake." The reviewers, being ultra cautious, say, "...we are unaware of compelling evidence showing that consuming less sodium in the general population is harmful."
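To see why "insufficient power" matters, here's a minimal sketch (in Python, using made-up event counts, not the actual trial data) of how a pooled relative risk and its 95% confidence interval behave when deaths are relatively few; the interval stays wide enough that neither a meaningful benefit nor no effect at all can be ruled out:

```python
import math

# Hypothetical pooled counts (NOT from the real meta-analysis) to illustrate
# why small numbers of events leave wide confidence intervals.
events_low_salt, total_low_salt = 60, 3000   # deaths / participants, reduced-salt arm
events_control, total_control = 70, 3000     # deaths / participants, usual-diet arm

# Relative risk and the standard error of its logarithm
rr = (events_low_salt / total_low_salt) / (events_control / total_control)
se_log_rr = math.sqrt(
    1 / events_low_salt - 1 / total_low_salt
    + 1 / events_control - 1 / total_control
)
low = math.exp(math.log(rr) - 1.96 * se_log_rr)
high = math.exp(math.log(rr) + 1.96 * se_log_rr)

print(f"Relative risk {rr:.2f}, 95% CI {low:.2f} to {high:.2f}")
# With these numbers the interval spans values well below and above 1.0, i.e. the
# pooled data can't exclude either a clinically important effect or no effect.
```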

A free-lance science writer wrote an article in Scientific American in 2011 with the title, "It's Time to End the War on Salt." She argues that the data linking increased salt intake and various diseases is not solid.

Should I believe those statistics?

Yet there are lots of studies showing a strong link between salt intake and blood pressure and others claiming a similar correlation between dietary sodium and cardiovascular disease.

One country that decided to act on these supposed connections was Finland. In the late 1970s a national-level campaign was started that included mass media education, monitoring of urinary sodium excretion and food-industry cooperation using salt substitutes. The average sodium intake was cut by nearly 30%, and over the next ~24 years stroke and heart attack deaths went down by 60%. Was this cause and effect? I'm not sure, since other factors may have played a role and the initial average salt intake was quite high.

But a December, 2011, New York Times article, with the striking title, "Sodium-Saturated Diet Is a Threat for All" led me to find another kind of salt that plays a role. I found the July, 2011, Archives of Internal Medicine research report, "Sodium and Potassium Intake and Mortality Among US Adults" and realized I was on the right track with our diet.

It's not just too much sodium chloride that matters, the kind of salt some use at their dining room tables, that manufacturers add to processed foods and that restaurants add to their recipes to add flavor and preserve food. It's also how much potassium you eat (in fruits, juices, vegetables, fish, nuts and meat) that makes a difference. In this case, within reason and assuming you have normal kidney function, more is better.

I'm going downstairs to eat some fruit and perhaps a baked potato and yogurt, all high in potassium and relatively low in calories.

Tuberculosis Part two: Treatment options, Old, New or None

June 25th, 2012

Robert Koch discovered the bacillus that causes TB

In the era before antibiotics, especially in the late 19th century, tuberculosis, often called the "white plague," was a leading cause of death in the United States. Eastern patients were urged by their physicians to travel to Colorado, where the fresh mountain air was supposedly curative. Hundreds of those stricken with TB went to Denver, which was sometimes called "the world's sanatorium." Hospital beds were scarce, and a Jewish woman, Frances Wisebart Jacobs, recognized as the founder of what is today's United Way, spearheaded a movement to fund a TB hospital which eventually became National Jewish Hospital and served patients of any creed, with free care for the indigent.

Prior to the advent of antibiotics, TB continued to be treated in sanatoriums, where rest, fresh air (and sometimes surgical collapse of a lung or removal of a cavity) were all the therapy available. Extracts from one such patient's diary, begun in April of 1944 and continued through early June of 1946, are available online, having been published in the Journal of the Royal Society of Medicine, a British publication, in 2004.

Latent TB is usually treated for nine months with a single antibiotic, usually INH like I was given in the late 1970s. Active TB is quite a different matter and, for those with drug-susceptible infection, usually requires a four-drug combination given for six months. This may be given as directly observed therapy, with a healthcare worker meeting the patient at their home or workplace.

If the patient stops taking the antibiotics too early, if the particular TB bacteria involved are drug resistant (this occurs more frequently in people with HIV/AIDS and those who have had previous TB therapy), or if the medications used are of poor quality, multi-drug resistant TB can develop.

A 2002 paper titled "Tuberculosis therapy: past, present and future," while detailing the history of TB treatment, mentions that George Bernard Shaw, in his 1911 play, The Doctor's Dilemma, talked about the early attempts to cure the disease as a "huge commercial system of quackery and poisons." Elsewhere in the play, Shaw mentions "stimulating the phagocytes," referring to white blood cells that can engulf invading bacteria. The modern take on this 1911 quote would be a TB vaccine.

The only one I'm aware of is BCG, an attenuated live bacterial vaccine produced from the bovine strain of tuberculosis. It's been around since the 1930s and has been given to at least 5 million people globally, but its major benefit has been to prevent spread of TB to sites other than the lungs. At this stage we really need an effective immunizing agent, as our pharmacologic attempts to keep TB in check are becoming less and less effective.

We all need to support efforts to stamp it out.

Workers at the CDC published a 2007 paper titled "Worldwide Emergence of Extensively Drug-Resistant Tuberculosis" (XDR-TB), mentioning that nearly 90 countries have had multi-drug resistant TB (MDR-TB) requiring use of so-called second-line drugs for at least two years. Now we've gone beyond that: nearly 10% of the 3,520 samples from an international MDR-TB pool were also untreatable with a majority of those drugs.

So we're up to 2012 and The Wall Street Journal's article, "India in Race to Contain Untreatable Tuberculosis." TB already kills more people than any other infectious disease except for HIV. But since the first cases of XDR-TB were reported in Italy in 2007, Iran and now India have also been struck by the strain.

There's an urgent need for an effective vaccine or a totally new approach to TB treatment.