Cholera: Part two, the nearby 21st-century epidemic

February 26th, 2013

Until 2010 I hadn't thought much about cholera in the modern era. I had considered it a disease from the past and associated it with Dr. John Snow, the father of modern epidemiology, the study of the patterns, causes and effects of health and disease in defined populations (Hippocrates, the famous Greek physician, is considered the ancient father of the field).

Algae can carry cholera bugs a long ways

I was clearly wrong in doing so.

I had previously read parts of the science writer Laurie Garrett's first two books, The Coming Plague: Newly Emerging Diseases in a World Out of Balance, published in 1994, and Betrayal of Trust: The Collapse of Global Public Health, which followed in 2000. Her first book touches on cholera in Africa and then has a section on the seventh global pandemic, which started in 1961 in Indonesia's Celebes Islands.

Now I read Chapter 16 of The Coming Plague in detail. It mentioned that Rita Colwell, PhD, an environmental microbiologist, was convinced in the 1970s and 1980s that bacteria and viruses could be carried in algae, the world's oldest living life form. Algae are responsible for "red tides" (AKA Harmful Algal Blooms or HABs), episodes when those ocean plants massively increase in number and then produce toxins, making shellfish dangerous to eat and killing off fish.

Colwell found that the bacterium responsible for cholera could survive encysted in algae and float long distances in their "plant capsules." The El Tor strain of the bug was responsible for the 1991 epidemic in Peru. The CDC's publication Morbidity and Mortality Weekly Report, AKA MMWR, mentioned that outbreak in its February 15, 1991 edition. MMWR noted this was the first appearance of cholera in South America in the 20th century and recommended exclusive use of boiled water for drinking, careful cleaning of fruits and vegetables, and avoidance of raw or inadequately cooked fish or other seafood. It stated the risk to U.S. travelers was low.

In the next eleven months cholera claimed over 330,000 victims in the Western Hemisphere, killing just over 1%. Lima, the Peruvian capital, had stopped chlorinating its water and Peruvians often ate ceviche, uncooked fish and shellfish mixed with lime juice. By the Fall of 1993, 8,000 deaths and over 900,000 cases of cholera were reported in Latin America. The El Tor strain of the cholera bacterium had become endemic in the region.

A 1994 article in the Journal of Clinical Microbiology documented the next chapter in the modern history of cholera. A new strain struck in December of 1992, first in the Indian city of Madras, then spread to Calcutta, Bangladesh and Thailand. Even those who had previously been through a siege of cholera were not immune to the O139 strain, as the Bengal cholera Vibrio was termed.

An earthquake can be both a disaster in itself and the seed for an epidemic.

The Western Hemisphere would have another cholera epidemic eighteen years later. On January 12, 2010, a major earthquake struck Haiti. Although its magnitude on the logarithmic Richter scale was "just" 7.0, while the offshore earthquake in Japan in 2011 was an 8.9 (an 8.0 quake is 10 times as intense as a 7.0 and a 9.0 is 100 times as powerful), the depth of the Haiti quake was about half that of the 2011 tremor in Japan and it struck a major Haitian city. The damage was immense, and the local infrastructure was severely disrupted, with healthcare, water and sanitation all affected.
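As a rough sketch of the arithmetic behind that parenthetical (using the standard logarithmic definition of magnitude, in which each whole-number step means a tenfold increase in measured ground-motion amplitude), the amplitude ratio between the two quakes works out to

$$\frac{A_{8.9}}{A_{7.0}} = 10^{\,8.9 - 7.0} = 10^{1.9} \approx 79.$$

Radiated energy grows even faster, roughly as $10^{1.5\,\Delta M}$, so the Japan quake released on the order of $10^{1.5 \times 1.9} \approx 700$ times the energy; which is why the Haiti quake's shallow depth and proximity to a dense city, not magnitude alone, determined the devastation.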

A recent New England Journal of Medicine article (Feb 14, 2013) reviewed the surveillance efforts during the subsequent two years. Prior to the earthquake, less than two thirds of Haiti's population of 9.8 million had access to even the lowest category of an improved water source; less than an eighth drank treated water from a pipe system and only a sixth lived with adequate sanitation. The 1991 Peruvian cholera didn't reach Haiti, so there was little or no prior immunity to the El Tor Vibrio strain.

The results were predictable: a major outbreak of cholera. But the government and international medical assistance markedly ameliorated the epidemic. Through October 20, 2012, over 600,000 cases of cholera were reported and 7,436 deaths resulted. The case fatality rate was initially high in some locales (4.6%), but within three months of the start of the epidemic it fell to the World Health Organization's target of <1.0%.

In comparison there were 2.8 million cases of cholera globally in 2011 with 91,000 deaths (3.25%). The CDC notes that twenty-three cases occurred in the U.S.; 22 were associated with travel to Haiti, one with consumption of food products from that country.

The treatment of cholera is relatively simple: the WHO says rehydration with oral rehydration salts is enough in almost all cases. Intravenous administration of fluids can be life-saving in especially serious cases.

But how about preventing the disease? A Perspective column in the same journal edition (NEJM Feb 14, 2013) is titled "The Cure for Cholera--Improving Access to Safe Water and Sanitation." The three authors, all with dual MD, MPH degrees, note that the malady is still a major source of illness and mortality in the developing world, with WHO estimating 3 to 5 million cases and 100,000 to 200,000 deaths a year.

In the treatment arena, they note that antibiotics should be given to those with even moderate dehydration, that all patients should receive zinc, which can decrease the duration of diarrhea, and a newer variant of the two-dose vaccine should get wider usage.

Safe drinking water and modern sewage disposal are still major issues for many in 2013: two and a half billion people live without adequate toilet facilities and nearly 40% of those in the least developed regions of the world don't have bacteria-free water to drink.

More than a billion of the poor and marginalized need help. But estimates of $50 billion needed per year are daunting in these tough economic times.

Cholera: Part one, background and history

February 24th, 2013

An 1882 monument to victims of cholera

Cholera is an infectious illness, found only in humans, caused by a bacterium in contaminated water, leading to severe diarrhea and dehydration and capable of killing its victims in a matter of hours if untreated. When I read about the disease for the second time in decades (the first time was after a 21st-century epidemic in Haiti), I was amazed at how quickly a victim can lose 10% or more of their body weight in severe cases; e.g., eight quarts between my normal bedtime and when I usually wake up. Many people who ingest the bacteria don't develop any symptoms, but if they do and lack modern re-hydration therapy, their chance of dying is 40-60%.
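To put a rough number on that weight loss (assuming, purely for illustration, a 170-pound adult):

$$8\ \text{quarts} \approx 7.6\ \text{liters} \approx 7.6\ \text{kg} \approx 16.7\ \text{lb}, \qquad \frac{16.7\ \text{lb}}{170\ \text{lb}} \approx 10\%.$$

A US quart is about 0.95 liters and a liter of body fluid weighs roughly a kilogram, so eight quarts overnight really is about a tenth of an average adult's body weight.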

In all likelihood it is an ancient disease, with writings from the lifespan of Buddha (563-483 BCE) and from the time of Hippocrates (460-377 BCE) revealing diseases that presumably were cholera. It has, over the last several hundred years, been a major killer of mankind, causing millions of deaths in the 19th century. Those numbers place it among the deadliest of infectious illnesses, in the company of smallpox, the Spanish flu, bubonic plague, AIDS and malaria.

A CBC News article online with the title "Cholera's Seven Pandemics" starts with a major outbreak in India near the Ganges River delta. Between 1817 and 1823 there were 10,000 deaths among the British soldiers stationed in that country, estimates of hundreds of thousands of fatal cases among native Indians and 100,000 dying in Java in the year 1820. The second pandemic began in 1829, again in India, and spread to Russia, Finland, Poland, England, Ireland, Canada, the U.S. and Latin America, before another outbreak in England and Wales that killed 52,000 over two years. The sixth pandemic killed more than 800,000 people in India alone and, over the next 24 years, swept over parts of Europe, Russia, northern Africa and the Middle East.

The National Library of Medicine's website entry on cholera associates it with crowding, poor sanitation, famine and war. India has remained a source, as the disease is endemic (ever present) there. People get cholera by consuming contaminated food or water; the medical term is the fecal-oral route.

In the summer of 1854 London was the epicenter of a deadly outbreak. Dr. John Snow, a famous British physician born March 15th, 1813, had been noted as a pioneer in anesthesiology, using chloroform to assist in Queen Victoria's delivery of her eighth child in 1853.

Then, as documented in the book, The Ghost Map by Steven Johnson, Snow turned his investigative talents and keen mind to cholera, becoming in the process the modern father of epidemiology.

London's population had grown immensely and its sewage system was antiquated. In addition to basements filled with excrement, cesspools and drainage into water sources were rampant. A major concept of disease causation was the miasma theory. The term means "bad air," and the assumption was that illness was caused by the presence in the air of a miasma, an ill-smelling vapour containing suspended particles of decaying matter.

Snow, on the other hand, felt cholera was caused by something ingested, most likely by drinking water contaminated by waste products.

In a painstaking and extremely clever investigation of a prior cholera outbreak in 1849, one responsible for a dozen deaths in flats in a slum area, Snow had shown that two separate sets of households had markedly differing death rates. All environmental parameters were essentially identical in the two groups with one exception: where they obtained their water. The group who suffered a much higher rate of illness got theirs from a company whose river source was in the same area where many sewers emptied.

Vibrio cholerae, the cholera bacteria

Five years later a much larger cholera epidemic provided an opportunity to more closely examine the water sources of the victims. One particular pump, seemingly providing clear water, proved to be the culprit. The Broad Street pump's output was examined by a colleague of Snow's, the skilled microscopist Dr. Arthur Hassall, and found to contain what Hassall believed to be decomposed organic matter, with tiny oval life-forms felt to be feeding on that organic substance. Snow was not aware then of the 1854 work of an Italian scientist, Filippo Pacini, who had examined the intestines of patients dying from cholera in Florence and found a comma-shaped bacillus he termed a Vibrio.

The proponents of the miasma theory did not yield easily, but Snow's map of the location of deaths from cholera eventually let his hypothesis of a water-borne illness prevail. Then an assistant curate (a clergyman serving a parish) named Henry Whitehead, who had read Snow's papers on the epidemic, eventually found the index (first) case, a baby named Lewis. As a result, the Broad Street pump was excavated and a direct connection to a cesspool was found.

The juxtaposition of Snow's scientific data and Whitehead's work as a beloved neighborhood figure led to the local Vestry Committee's report endorsing the water-as-culprit theory.

The city subsequently launched a major project to carry waste and surface water away from Central London.


Vaccination/Immunization: Part 3 Adults and the disease risks some of us take

February 16th, 2013

You need protection against viruses and bacteria that lurk out there

After reading a number of articles, I decided that Lynnette and I are up to date on all our vaccinations, but many adults are not; the CDC on Feb 1, 2013, published an online review titled "Noninfluenza Vaccination Coverage Among Adults--United States 2011" that reveals a sad picture. The first two sentences sum it up: "Vaccinations are recommended throughout life to prevent vaccine-preventable diseases and their sequelae. Adult vaccination coverage, however, remains low for most routinely recommended vaccines and well below Healthy People 2020 targets."

I had only a vague idea what Healthy People 2020 referred to, so I found the definition on a CDC website.

In December of 2010 the Department of Health and Human Services (HHS) launched a multi-faceted ten-year program with four major goals for our American population: (1) attain high-quality, longer lives free of preventable disease, disability, injury, and premature death; (2) achieve health equity, eliminate disparities, and improve the health of all groups; (3) create social and physical environments that promote good health for all; (4) promote quality of life, healthy development, and healthy behaviors across all life stages.

It's obviously a huge undertaking and HHS came up with 1,200 objectives (sic) organized into "topic areas" (42 of those) each covering something felt to be very important in our public health. That's too big of a chunk for me to even think of writing about today.

So in this post I'll focus on vaccinations for adults.

Every year the Advisory Committee on Immunization Practices is given the charge of reviewing and updating the recommendations for childhood vaccinations and also those for adults. The Annals of Internal Medicine published the adult schedule and comments on its changes on January 29, 2013.

Let's go back to the non-influenza vaccination article; the discussion was on immunizations to protect us against tetanus, diphtheria and pertussis/whooping cough combined as Tdap; pneumococcal pneumonia, hepatitis A, hepatitis B, herpes zoster (AKA shingles), and the human papillomavirus (HPV).

The Tdap numbers were startling to me. Only 55.4% of adults over 65 are protected, and <65% of adults from ages 19 to 64, but fatality rates for tetanus are over 13%. Far too many people are taking chances with a terrible, but preventable, disease. The American Geriatrics Society is urging all of us over age 65 to have the Tdap shot, to protect ourselves and our grandkids (from pertussis in the latter case).

I've written on pertussis, but to recap, we're seeing more cases in the U.S. (22,550 were reported in 2010 and many more, especially among the elderly, are never reported). There have been epidemics of pertussis in 2012-2013. If you think you're still immune to whooping cough because you had the childhood five-shot vaccination series, you should know that a person's immunity wanes from 98% protection to 70% after five years have elapsed.

There hasn't been a case of diphtheria in this country since 2003, but lots of us travel to countries where that disease is endemic (regularly found) and the case-fatality rate for respiratory diphtheria is 5-10%.

The pneumococcal vaccination rate for those in this country who are 19 to 64 and considered at high risk for this kind of infection (e.g., anyone whose immune system isn't at its best) is only a tad over 20%, while the figure for those of us over 65 was much higher, at 62.3% in 2011. Even in the older age group the data showed Caucasians have gotten this immunization much more commonly than Asians, Hispanics or blacks, all of whom had vaccination rates <50%.

I've had the herpes zoster shot, but I'm in the 15.8% (2011 figures) who've done so. I never wanted to have shingles after knowing two people who had prolonged excruciating pain from this disease.

HPV is the most common sexually-transmitted viral disease in the United States. The CDC says, "Almost every sexually active person will acquire HPV at some point in their lives." In doing so they increase their risk of certain cancers; in a major CDC study that covered everyone in the U.S. from 2004-2008 there were over 33,000 HPV-associated cancer cases per year.

There are a host of reasons people don't get vaccinated. The CDC has an article online that covers the topic of common misconceptions about the need to continue vaccination. Some people think that infectious diseases were already being prevented by improvements in sanitation/hygiene even before immunizations were developed. Others may believe that a majority of us have already been vaccinated so they don't need to be (the herd immunity concept), or that certain "lots" of a particular vaccine are dangerous. Some think we've gotten rid of all the diseases that vaccines can prevent, so they see no reason to get the shots.

Especially if "out there," in your case, means most of the world

Unfortunately, none of these concepts are valid and many of us travel to parts of the world that have much worse immunization statistics than America does. So, if we're not vaccinated before our trips, we run the risk of bringing home a disease and spreading it to others.

There are some significant changes coming in the vaccination arena, but I'll save those for another time, including a few words on Hepatitis A and B. For now I'd suggest asking your physician if she/he thinks you're current on all the immunizations you need; that's especially true if you are planning a major trip somewhere outside the country.


Vaccination/Immunization: Part 2 Smallpox history

February 13th, 2013

We don't see these signs anymore.

Vaccines have a humble beginning. Smallpox was the first infection that people tried to prevent using a method called variolation, developed in China and India, in which dried material from a smallpox scab was ground up and then administered to a well individual by blowing the powder into the nose, hopefully giving them a mild case of smallpox and long-term immunity to the disease.

The practice originated from the observation that people who survived a previous smallpox infection somehow became resistant to getting the infection again. It was thought that by artificially infecting an unaffected person, the process could protect that individual from the dire malady.

Smallpox is an ancient disease; a great online article by Stefan Riedel, an MD, PhD from the Baylor University Medical Center, traces its history over an estimated 12,000 years, with Egyptian mummies from 3,000 to 3,500 years ago showing typical facial scars.

The Antonine Plague, which lasted from CE 165 to 180, most likely was smallpox (though possibly measles). It killed 2,000 per day in the Roman Empire with total deaths estimated at nearly seven million. Many scholars feel it significantly contributed to the downfall of the empire.

Ancient Chinese documents show that variolation was practiced in the Song dynasty in China (CE 960 to 1279). Legend has it that the Song emperor had lost his eldest son to smallpox, so he traveled deep into the forest of a high mountain and sought help from a reclusive nun. The woman was known as a holy healer, and she passed on the technique of variolation to save the ancient Chinese royal family.

Two to three percent of individuals receiving variolation ended up dying from smallpox. The only reason this practice continued was that the chance of dying from smallpox caught "naturally" from another infected person was much higher, with some epidemics killing 30% or more of their victims.

In 1717, Lady Mary Wortley Montagu was in Constantinople as the wife of the British ambassador. She herself had suffered from smallpox and had also lost a brother to the disease. She found that the Turks had another approach to gaining smallpox immunity: inoculation. The word is derived from the Latin inoculare, meaning "to graft." Inoculation referred to the subcutaneous instillation of smallpox virus into non-immune individuals. The inoculator usually used a lancet wet with fresh matter taken from a ripe pustule of a person who suffered from smallpox. The material was then introduced into the arms or legs of the non-immune person.

Lady Montagu had two of her own children inoculated, one while in the Ottoman Empire and the other upon returning to England. A number of prisoners and later some abandoned children were subjected to the procedure; both Lady Montagu and the Princess of Wales were involved in this "research project," which next had those inoculated deliberately exposed to smallpox. None of them got the disease.

After this success, members of the British royal family were inoculated and the practice spread widely in Europe, reaching America with Reverend Cotton Mather being a strong supporter.

During one of Boston's succession of smallpox epidemics, after hearing from Cotton Mather of a publication in the Transactions of the Royal Society, Dr. Zabdiel Boylston, in spite of strenuous protests from the general public and from other physicians, inoculated his own son and two family servants. Eventually he performed the procedure on 247 people ranging in age from nine months to sixty-seven years; other physicians inoculated 39 more. Of the 286, only six died (2%), and three of those may have already had smallpox, while 14% of the 5,759 who had not been inoculated and caught the disease perished.
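A quick check of those proportions, taking the reported numbers at face value:

$$\frac{6}{286} \approx 2.1\%, \qquad \frac{14\%}{2.1\%} \approx 6.7.$$

In other words, those inoculated died at roughly one-seventh the rate of those who caught smallpox naturally, which is why the practice persisted despite its very real risks.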

Thanks to Edward Jenner

The next step came from Dr. Edward Jenner, who had speculated about the protective effects of cowpox during his "apprenticeship" with George Harwicke. He had heard a dairymaid with a pock-free complexion say, "I shall never have smallpox because I have had cowpox." Over ten years later, in 1796, he used material from a cowpox-infected woman to inoculate an eight-year-old boy. The child recovered from cowpox, but when deliberately inoculated later with smallpox did not develop any signs or symptoms of the much more serious disease.

Since the Latin word for cow is vacca, Jenner dubbed the process vaccination and, after being rejected for publication by the Royal Society, published his own booklet to promote the method of smallpox prevention.

The rest, as they say, is history and Jenner became famous; in 1802 he received a large sum of money from Parliament and a still larger one in 1807. He continued his research in several areas of medicine and science and was appointed Physician Extraordinary to the King.

Vaccination became an established medical procedure.


Vaccination/Immunization: Part 1 Kids first

February 10th, 2013

I was reading an Annals of Internal Medicine update on adult immunizations and then found it was available to the general public on a CDC website. I thought we were pretty much caught up on our own vaccinations, but decided to read the article anyway. There are, as always, some changes and I'll eventually walk through those for you.

He's up to date with his vaccinations; some others aren't

But that's not the real issue in America today. A piece in The Wall Street Journal on Feb 6, 2013 with the title "Rolling Back the War on Vaccines" is dead center on. The authors are Jay Winsten, an associate dean at the Harvard School of Public Health, and Emily Serazin, a principal of the Boston Consulting Group (a global management consulting firm with 77 offices in 42 countries), and they sound a very loud alarm, primarily for those with small children.

We still have too many kids who aren't vaccinated, and as a country we stand in danger of losing "herd immunity," and in doing so, endangering our youth. There have already been several resultant epidemics here and elsewhere; measles in Europe in 2011 affected >30,000 people, and the U.S. had 222 cases, primarily linked to those who had traveled to Europe. If you look worldwide, away from places where vaccination is routine, measles killed 139,300 people in 2010.

And then there's pertussis, AKA whooping cough. A July 19, 2012 CBC News report noted that the number of U.S. pertussis cases had doubled, reaching 18,000 at that point, with nine children dying as a result. Kids who haven't been vaccinated are eight times as likely to get pertussis, according to a CDC physician.

The Institute of Medicine (IOM) issued a January 2013 report on childhood immunizations in the United States. I knew there was such an organization, but wasn't clear as to what it was and how much credence all of us should give to its pronouncements, so decided to read further.

The IOM is an American non-profit, non-governmental organization, founded in 1970 under the congressional charter of the National Academy of Sciences. Its purpose is to provide national advice on issues relating to biomedical science, medicine, and health, and to serve as adviser to the nation to improve health. It works outside the framework of the U.S. federal government to provide independent guidance and analysis and relies on a volunteer workforce of scientists and other experts, operating under a rigorous, formal peer-review system.

IOM committees are carefully composed to assure the requisite expertise and to avoid bias or conflict of interest. Every report produced by IOM committees undergoes extensive review and evaluation by a group of external experts who are anonymous to the committee, and whose names are revealed only once the study is published.

The summary of the IOM's report The Childhood Immunization Schedule and Safety, subtitled Stakeholder Concerns, Scientific Evidence, and Future Studies, mentions that vaccines are one of the safest and most effective public health interventions available.

As a result of their effectiveness, with smallpox totally unknown as the killer it once was, measles rare in America and other diseases reduced or virtually extinct, parents of kids under five now worry more about vaccines than the diseases they so successfully prevent. Some would deny their children the proven benefits of immunizations because of the schedule of 24 injections by age two (as many as five at one time), others for religious reasons, potential harm from side effects or mistrust of our government. Their fears are inflamed by those who, for varying reasons, inveigh against any vaccinations.

The IOM panel concluded that a prospective randomized, controlled study would not be ethical, and since fewer than one percent of all Americans refuse all immunizations, a new observational study (comparing outcomes between those kids who are vaccinated and those who aren't) would be prohibitively difficult and time-consuming. In order for such a research study to be valid, the kids would need to be matched pairs of the same age, sex, ethnicity and location in the U.S.

So the best approach appears to be one that uses the Vaccine Safety Datalink (VSD), a twenty-three-year-old project linking the CDC with nine managed care organizations. It's a well-proven tool for evaluating the safety of immunizations. The VSD could be adapted for further studies to address stakeholder (parents and others) concerns.

The scales of justice are available when needed

The U.S. has a National Vaccine Injury Compensation Program (NVICP), in part started as a result of the anti-vaccination movement. The NVICP's goals are to ensure an adequate supply of vaccines, stabilize their costs and act as an accessible/efficient modality for people actually found to be injured by vaccines. A designated section of the U.S. Court of Federal Claims reviews claims of vaccine-caused injuries. During the first eight years of the NVICP a total of 786 contested cases were resolved; this is a tiny fraction of pre-vaccination adverse outcomes, but offers a mechanism to evaluate and compensate any actual immunization injuries.

No medicine is 100% safe; vaccines are among those that have the best balance between immense positive and rare negative effects.

End-of-Life Care: where and how?

February 6th, 2013

I've been deluged by two recent deaths and a number of severe illnesses among elderly (70+) friends and acquaintances and so was particularly interested in an article in JAMA that arrived today with the title "Change in End-of-Life Care for Medicare Beneficiaries." The study looked at the site where people died (home, hospital, hospice) and what type of medical care they received in the last 90 days of their lives.

An ICU may not be your first choice of where to end your days

The accompanying editorial piece, written by two Yale physicians, summarized the data well: among this large group of Medicare patients (848,303) dying in 2000, 2005, or 2009, more died in hospice care or at home in the later time frame, but ICU stays and acute care hospitalizations actually increased. In reality, hospice care, although more available and better recognized, frequently seems to be used only in the last few days of life.

This pattern of hospitalization and re-hospitalization holds true for those with other diseases and illnesses. A 2009 New England Journal of Medicine article looked at readmissions among nearly 12 million fee-for-service Medicare patients, reviewing data from the 2003-2004 time frame. A startling 19.6% of those Medicare beneficiaries who had been hospitalized, then discharged, were back in the inpatient setting within thirty days, and over a third (34%) within ninety days. The estimated cost to Medicare of the total of these "bounce back" hospital stays was $17.4 billion in 2004.

A 2011 study reviewed the medical histories of nursing home patients who were hospitalized, often repeatedly. In 2012 MedPAC, the Medicare Payment Advisory Commission, recommended "reducing payments to skilled nursing facilities (SNFs) with relatively high rates of re-hospitalizations."  In 2006 the cost of early re-admissions from SNFs (within the first 30 days after hospital discharge) was $4.34 billion.

A series of Interventions to Reduce Acute Care Transfers (INTERACT II), with a goal of reducing re-admissions, has been shown to be effective, cutting acute care transfers from SNFs by a sixth to a quarter. INTERACT II data from a pilot project reported in 2011 and involving 25 SNFs showed an average cost of $7,700 per nursing home and an estimated average saving to Medicare of $125,000 per 100-bed SNF.
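Putting those two figures side by side (assuming, as the report appears to, that the cost and the saving cover the same period and facility size):

$$\$125{,}000 - \$7{,}700 \approx \$117{,}000 \text{ net per 100-bed SNF}, \qquad \frac{125{,}000}{7{,}700} \approx 16.$$

That is roughly a sixteen-to-one return, which is why MedPAC's focus on re-hospitalization rates makes financial as well as clinical sense.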

That's the cost in terms of money; the impact on patients' quality of life can't be estimated easily, but it is huge. So let's go back to the original JAMA article and editorial and its references.

One of the striking findings was the percentage of patients with diagnoses of dementia who actually spent some of their last days in an ICU setting. I would have thought this figure should have fallen over the past years, but in reality, as Drs. Jenq and Tinetti's editorial points out, it actually increased (18.6% in 2000 and 21.8% in 2009). If I'm demented in my last days on this earth, please don't waste precious ICU space on me!

The Yale reviewers' comment that we need to set criteria for ICU admissions (as we already do for other health-care settings) makes eminent sense. Not only do we waste badly needed bed space and lots of money, potentially depriving other critically ill (and potentially recoverable) patients of the opportunity to receive the highest level of care available; in doing so we may also block those, like me, who wish to die at home or in hospice, from that option...at least until the last three days.

In the article by Teno et al., forty percent of those with COPD had an ICU stay prior to death, and an equal percentage of those referred to hospice within three days of dying came there from an ICU.

At home, with your grandchild's hands on yours, may be a much better choice.

Yet, in reality, data show many of us would prefer to die at home and many of the caregivers for the terminally ill would agree.

How do we change the current picture, so that surrounded by family and perhaps even in our own bed, more of us can face the end in the relative comfort of home or with the aid of hospice?

I don't have an easy answer, but I think it's time for a national dialogue on the subject.

As things are now, I think we're avoiding coming to grips with an issue we should face up to: our health care system can't afford the costs involved, and our own wishes and those of our loved ones need to be expressed in advance.

It's hard to make decisions at the last moment, so let's think about them in advance, individually, with our families and as a nation.

Personality and Meaning as Therapeutic Tools

February 5th, 2013

In my January 5, 2012 post "When Do We Stop Changing?" I mentioned the work of Dan McAdams, the head of Northwestern University's Psychology Department, whose major field of interest appears to be personality psychology. I didn't know what that was, but it sounded interesting, so I started to read some of his published work.

We're different from the rest

But before I got very far, I needed to clarify some background words for myself. For instance, what's the formal definition of personality and what are personality traits? I started with the appropriate webpage of the American Psychological Association (APA). Their take on personality is "individual differences in characteristic patterns of thinking, feeling and behaving." Hmm... that made sense to me once I went through the wording slowly: you and I each have our own habitual ways we react to life, and those affect how we differ from each other in how we think, what we feel about life events and how we behave.

But then came the subject of personality "traits," and even the APA wouldn't give me a clear-cut answer. Many places I looked made the typical error of defining a word by using the same word, e.g., "a personality trait is a trait that..." I found an online definition: attitudes, behaviors and actions a person possesses or exhibits. But how is that different from the overall concept of personality? My Mac's own built-in dictionary did a better job: it says a trait is a distinguishing quality or characteristic, typically one belonging to a person.

The APA website did have a short discussion of which traits predict job performance; it mentioned the "Big Five": extraversion, agreeableness, conscientiousness, emotional stability and openness to experience. But having interpersonal skills is sometimes even more important in predicting job success, and conscientiousness may work just fine in ordinary jobs, but less so in those that require a high degree of creativity.

The Huffington Post, in a July 6, 2012 piece titled "6 Personality Traits Associated With Longevity," delved into some of those factors that are common among the long-lived among us. They mentioned studies for each one, but in brief the fab six are being conscientious, easy to laugh, optimistic, socially connected, extroverted and happy.

Then I went back to McAdams' work and read a 2006 article he published in a journal called Narrative Inquiry, "The role of narrative in personality psychology today," and his chapter titled "Personal Narratives and the Life Story" in the 2008 Handbook of Personality: Theory and Research.

The entire field is termed personality psychology. Interestingly, the word comes from the Latin term persona, meaning mask. But it doesn't imply a mask to conceal; rather, one that typifies. The field is defined as focusing on major individual differences in our behavior and experience. Our largest changes happen in early adulthood to age 40, but we continue to adapt and alter our personality, to lesser extents, even in old age.

McAdams feels we, as a species, are story tellers, from the time we can speak coherently, and much of what we tell as stories, at least in our Western culture, is our own life. We tell them to other people, retell and retell them (hopefully to different people) and they alter over time as our memory is, in all but a few individuals, not perfect. In doing so we help make clear the emotional content of our experiences, primarily for our own benefit.

Roberts and Mroczek, in a 2008 article available online through the NIH Public Access website, appear to dovetail with some of McAdams' views. They note that middle-aged individuals, when compared to younger adults, are generally more agreeable and conscientious while scoring lower in extraversion, neuroticism and openness; they mention that those among us who are psychologically mature at earlier ages generally have more effective relationships and do well at their work; they also tend to be healthier and live longer.

McAdams and others would view the life story as a therapeutic tool to be "reformulated and repaired." He presents a five-sided framework for personality involving human nature, traits, adaptations (e.g., motives, goals, values, strategies), life stories and culture. He notes Western life stories are more self-centered, while those from Chinese subjects (Wang and Conway 2004) more often present moral messages based on past events.

Can we find meaning in even the worst life experiences?

I went back to my copy of Viktor Frankl's great book, Man's Search for Meaning, telling of his experience in Nazi death camps during WWII; his mother, father, brother and wife died or were sent to the gas ovens, but he and his sister survived. He founded logotherapy (therapy through meaning), a new approach to psychological issues, and The American Journal of Psychiatry called his blend of science and humanism, "Perhaps the most significant thinking since Freud and Adler."

I don't know what he would think of personality psychology, but his own Life Story brought about changes that were incredibly influential in helping others.


Should I be taking aspirin?

January 30th, 2013

I take a dose equivalent to 1/4 tablet of aspirin

One of our friends recently told my wife she'd stopped taking aspirin after a news report linked regular use of the medication to macular degeneration. We've both taken 81 mg of aspirin a day, and after I'd heard that people may not absorb the enteric-coated form well (and I couldn't find any other form in that size at the local drugstore), I'd ordered ten bottles of chewable orange-flavored aspirin online from Amazon.

Then I decided to read the medical reports that our friend's recommendation had been based on. She doesn't have a medical background and hadn't looked at the original data, but instead had seen a warning in a newspaper article. Let's start with The New York Times blog. On Dec 12, 2012 it published an article by Anahad O'Connor titled "Aspirin Tied To Rare Eye Disorder."

It's a very well-worded article written by a 31-year-old, Yale-educated Times reporter who writes a weekly science column and has published two books. He notes the article he based his piece on was from JAMA, with the lead author, Dr. Barbara Klein, being a professor of ophthalmology at the University of Wisconsin, Madison. Since I'm a UW graduate (BS 1963, MD 1966), I was particularly interested in her study.

It used data from the Beaver Dam Eye Study, started in 1988-1990 and concluded in 2010. O'Connor very appropriately noted this was an observational study, not a prospective, controlled research project. In other words, a group of ~5,000 people, aged 43 to 84, agreed to have regular eye exams, and reports were published after the 5-, 10-, 15- and 20-year followups. More than 300 publications have resulted from this project, with data supporting a relationship of cataracts and age-related macular degeneration (AMD) to cigarette smoking.

Klein's paper stated that an estimated 19.3% of US adults take aspirin on a regular basis. It's commonly recommended for anyone who has had a heart attack (secondary prevention), but many of us who've never had evidence of coronary vascular disease also take aspirin. This is primary prevention and is controversial, with some data suggesting reduction of heart attacks in men over 45, but not in women, although they may have a 17% reduction in stroke incidence.

A senior who has AMD may need a magnifying glass.

A January 21, 2013 article from an Australian group reported a two-fold increase in AMD of a particular type, independent of smoking habits. Nearly a quarter of regular long-term aspirin users developed neovascular AMD, two and a half times the percentage of those who did not regularly take aspirin.

A 2001 paper in the Archives of Ophthalmology reported a randomized, double-masked, placebo-controlled study of low dose aspirin (one adult tablet every other day) plus 50 milligrams of beta carotene (a vitamin A precursor rated possibly effective in treating advanced AMD) among over 20,000 US male physicians aged 30 to 84 in 1982. The study was stopped after ~5 years due to a statistically extreme reduction (44%) in first heart attacks. There were fewer cases of AMD in those taking low-dose aspirin than in those who got the placebo.
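For scale (assuming the standard 325 mg US adult tablet, my assumption rather than a detail given in the paper), that every-other-day regimen averages out to roughly twice my daily low dose:

$$\frac{325\ \text{mg}}{2\ \text{days}} \approx 162\ \text{mg/day} \approx 2 \times 81\ \text{mg/day}.$$

So the trial that found fewer AMD cases in the aspirin group used, on average, about double the dose I take.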

There's also some data supporting aspirin's role in cancer prevention, especially in malignancies of the colon. Here the benefit was unrelated to aspirin dose (75 mg/day and up), but increased with age.

So let me look at my own risks: my dad had a large polyp in the earliest part of the colon, an area hard to see even on colonoscopy. It was initially felt to be benign, but later had some areas of low-grade malignancy. He also had macular degeneration in his remaining eye, diagnosed at age 90+ (the other eye having been removed nearly sixty years previously after a bad cut and a subsequent infection). My brother died of a heart attack at age 57, and my mother had a heart attack at age 74 with a cardiac arrest (Dad resuscitated her and she lived to age 90).

The editorial that accompanied the recent JAMA article is thoughtful and impressive. Its title was "Relationship of Aspirin Use With Age-Related Macular Degeneration: Association or Causation?" and it concludes "From a purely science-of-medicine perspective, the strength of evidence is not sufficiently robust to be clinically directive." It then switches to a different viewpoint, the art-of-medicine perspective, saying maintaining the status quo is currently the most prudent approach, especially in secondary prevention (someone who has already had a cardiovascular event). For those of us who haven't, the risks versus benefits should be individualized based on our own medical history and value judgement.

I'm going to discuss this with my own physician but not stop taking a chewable 81 mg aspirin daily until I do.

Progress on the painkiller front

January 27th, 2013

In the January 25, 2013 online edition of The New York Times I found a highly significant article titled "F.D.A. Likely to Add Limits on Painkillers." An advisory panel to the Food and Drug Administration wants to strengthen the current rules about such drugs as Vicodin, a frequently prescribed powerful pain pill.

Money sometimes blocks the adoption of sensible rules

Similar recommendations failed to make it through Congress last year; they were lobbied against by business interests with considerable heft behind them.

Now Vicodin and other drugs made from hydrocodone aren't the worst problem, but, as you'll see, they are a way to call attention to medications that for the past four years have caused more deaths in the United States than traffic accidents or than illegal drugs, including heroin and cocaine.

I went back to an article in the Journal of the American Medical Association, AKA JAMA. It was published on Dec 14, 2011 in the section called "Vital Signs," and was a CDC look at overdoses of prescription opioid pain relievers (OPRs) in the US, especially during the 1999-2008 time frame. The news was stark: in 2007 nearly 100 people died from these drugs every day, with death rates having tripled since 1991.

Between 1999 and 2010 the sales of OPRs quadrupled, with enough prescribed to give every US adult a standard dose of these medications every 4 hours for a month. The health-care cost of abuse of these drugs is staggering, estimated at $72.5 billion a year.

Of the total number of US 2008 deaths from drug overdoses (36,450), OPRs were involved in 14,800, and the years of potential life lost before age 65 were comparable to the figure from motor vehicle accidents. I was startled, but in retrospect not entirely surprised, that a study showed 3% of American physicians accounting for 62% of the OPRs prescribed.

Some of those doctors are anesthesiologists, oncologists or other physicians fully trained in pain control and working in highly specialized hospital-associated units. Their use of these medications is appropriately aimed at patients with cancer, those with severe acute injuries and perhaps some others whom almost all of us would agree should have whatever it takes to minimize agonizing pain.

But some who prescribe for dough can end up in jail

Others in the group of "mega-prescribers," however, may not be pain specialists; they could be working for "pill mills." That has happened in a number of states, with some of those doctors accused at this stage and several others convicted and facing jail sentences.

But the new rules, which have to be approved by the FDA and then by the Department of Health and Human Services before they actually take effect, make enormous sense to me: refills forbidden without a new prescription; no fax or phone prescriptions, only written ones; and drug distributors being forced to store OPRs in special vaults.

The Times article today notes the panel was not monolithic in its voting (19 to 10), with some members highly skeptical that the suggested changes would do much to alter the current surge of inappropriate or illicit drug use in America. One commented that oxycodone-containing products already are in a more restrictive category.

One of the subject matter experts quoted was Dr. Nathaniel Katz, an anesthesia assistant professor at Tufts University medical school. Katz served as chair of the FDA's advisory committee on anesthesia, critical care, and addiction products from 2000 to 2004 and thinks the recommended rule is largely symbolic, giving a message both to doctors and patients. He commented that the OPRs the panel was voting on are a relatively minor player. Katz now devotes much of his time to a clinical research company that's attempting to develop new treatments for pain.

Vicodin and like drugs containing hydrocodone are the most widely prescribed OPRs, but they are responsible for a minority of deaths; medications containing oxycodone or methadone, although less commonly given to patients, account for two-thirds of the drug overdose deaths.

So the question now is whether the proposed new rules make it over the remaining hurdles.

I hope they do.


There may be hope for those who've lost their sense of smell and taste

January 22nd, 2013

Ah, that's vanilla

I was reading JAMA, the Journal of the American Medical Association, in this case the January 9, 2013 edition, and came upon an abstract of an article from an ENT (Ear, Nose and Throat, AKA Otolaryngology) journal I wouldn't normally look at. The pilot study, authored by an MD, PhD and two colleagues from the Washington, DC, Taste and Smell Clinic, involved ten patients who had lost their sense of smell and taste (they're clearly related, as both my wife and I can attest). The causes of their condition, medically termed hyposmia and hypogeusia, were varied.

Let's start with a definition: a pilot study is a small-scale preliminary research project prior to attempting a larger, more expensive and expansive effort. So this study is not the last word on the treatment of the two conditions, but only a beacon of hope for those of us who no longer can taste and smell with the same acuity we once did.

These are common conditions, especially as we age. I'll give you links to the NIH's MedlinePlus website patient information for impaired smell and for impaired taste. My own issue with smell and taste started with allergies and nasal polyps in the 1970s, and my wife lost her sense of smell, for unknown reasons, when she was in her late 60s. One of the underlying concepts is that much of what we perceive as taste is actually smell; the tongue can only detect four or perhaps five tastes. Salty, sour, bitter and sweet are the conventional four I was taught in medical school (probably in grade school in a rudimentary fashion); the fifth is umami, a taste known to the Japanese since the early 1900s and often thought of as meaty or savory (it's best associated with the amino acid glutamate, sold as MSG, to which some people have reactions).

Dr. Steven Bromley, now a neurologist in New Jersey, wrote a superb paper on smell and taste disorders when he was still a resident. Those issues are common in the general population, but should not be taken as routine. The most common causes are nasal and sinus disease, upper respiratory infections (URIs) and head trauma (10% of the latter group have olfactory impairment). A host of medications and a number of more significant diseases can be causative factors, so especially for those who suddenly lose these senses, a medical review may be in order.

Compensating for loss of smell and taste by adding excessive amounts of sugar and/or salt to your diet can lead to serious consequences. We tend to use a host of spices without salt and to tell guests that we under-salt menu items, so they can add whatever they're used to at the table. When we cook for ourselves we add no salt; for company we tend to use 1/2 to 1/3 of whatever the recipe recommends.

Someday we may be able to use a nasal spray to help our sense of smell.

Back to the pilot study: the researchers used both oral and nasal theophylline, a medication historically used for patients with asthma and other lung diseases. The oral form was given for two to twelve months, then stopped for a few weeks (3 to 12) and replaced by a nasal spray of the drug. Past experience with the oral medication showed potential for side effects they wished to avoid. The nasal spray worked better in more patients (eight of ten versus six of ten), with no adverse effects being noted.

This is obviously only a preliminary study; don't ask your physician for a nasal spray of theophylline yet. Much larger and longer clinical studies need to be done and any of us without an obvious reason for loss of smell and taste should be evaluated for the underlying cause.

But it makes me optimistic that someday some of us may be able to regain those lost senses. Besides enhancing our meals, the other uses of smell and taste, the protective ones that help us distinguish danger (smoke, toxins, pollutants, spoiled food), may be improved.

Here's hoping.