Archive for the ‘Health Blogs’ Category
Ashoka’s Changemakers, along with RWJF, is sponsoring a very cool competition – “Nudges” – read below for details and please pass along (thanks to Roberto for sending). The competition was named after Cass Sunstein’s book Nudge; Sunstein has been asked to join the Obama administration. In addition to checking out the competition link below, see the RWJF Pioneer Blog, which I follow. The “Nudge” competition is about the little reminders, notifications, and encouragements toward action. With health, behavior change is one of the hardest things to affect, and we haven’t been very good at designing or focusing on the subtle pushes that are fundamentally critical to health care. While I could name quite a few innovative ideas we have covered on this blog, one that comes instantly to mind is the teachAIDS animation created by Piya Sorcar (it has technology, education, and behavioral-impact components). I am looking forward to seeing what innovations this competition yields.
Designing for Better Health Competition
Ashoka’s Changemakers is collaborating with the Pioneer Portfolio of the Robert Wood Johnson Foundation to launch a global search for “nudges” – innovative little pushes – that help people make better decisions regarding their own health and the health of others.
We are inspired by people and organizations such as the Destiny Health Plan, which provides “vitality bucks,” an alternative currency that allows people to earn travel and shopping rewards every time they make healthy choices. Another motivating example is CARES, an anti-smoking and savings program in the Philippines that offers smokers the option to invest the money they would normally spend on cigarettes into a savings account. “Designing for Better Health” is investing in the most valuable of all resources – people themselves. Here are the many ways in which you can participate:
Do you know innovators who work to help people make choices that improve their health? By nominating them, you will provide them the opportunity to promote their projects on a global platform and get connected with potential funding.
When most people think of global health they think of infectious diseases and all of the associated images this conjures up (and it is harder to capture provocative images of chronic diseases). However, as we have emphasized before, developing countries are facing a dual burden of both chronic and infectious diseases.
This past Tuesday I was privileged enough to attend the launch of the new Health Affairs issue on global health in China and India. I was joined by an esteemed panel of guests who gave great presentations about various issues facing these two nations. Unfortunately I don’t have time to summarize all of their talks but encourage you to read them in the latest issue. I want to focus on Dr. Somnath Chatterji’s paper because the projections of the aging of China and India are quite stunning and the associated social and economic implications will be profound.
Somnath Chatterji runs the WHO’s Study on Global Ageing and Adult Health (SAGE). Here are some highlights from his paper and quotes I picked up (these are based on my handwritten notes, so please forgive any factual mistakes):
The pace of change is stunning – what took 100 years in France (the graying of the population) is going to take place in 30 years in China/India (I can’t remember which one he specified). “Aging has been on the backburner…but China and India are facing dramatic demographic shifts in very short periods of time”.
By 2030, 65.6 percent of the Chinese and 45.4 percent of the Indian health burden are projected to be borne by older adults.
By 2019 in China and 2042 in India, the proportion of people age sixty and older will exceed that of people ages 0–14.
Within the next 20 years there will be 42 million diabetics in China and 80 million in India.
“In four decades 40% of the world’s elderly population will be in China and India…these countries are getting older before they get richer”.
“Traditionally, people think of chronic diseases as diseases of the rich; this is probably not going to be true for China and India…we really need longitudinal data to track this”.
Dozens of issues come to mind when hearing these projections. Who will get access to care? How will the delivery system be set up for this? Where will the focus be (primary care?)? How will this be financed at both the health-system level and the household level, and how much payment will be borne by the patient? Can capacity developed for tackling infectious diseases be used for chronic diseases (a very different ballgame in some ways)? What will be the role of the private sector? If the private sector gets heavily involved in selling drugs and devices in this new “market,” will that lead to better infrastructure for the delivery and distribution of medical supplies? How will this impact the economic growth of these countries? There are many more pressing questions, but I will stop here.
Another of the articles in this global health issue is on obesity in China. This paper is authored by one of the world’s leading experts in nutrition (Barry Popkin). We covered some of this before in a recent issue of Scientific American, and here is the link for the new paper. Kudos to Health Affairs for the issue and to Burness Communications for a well-run launch.
The British Medical Journal (BMJ) just published the top 15 innovations in health since 1840 ["Medical Milestones"]. For those of you wondering, 1840 was the first year the BMJ was published (I didn’t know it until I read the article). Bonus: until Jan 18 you can vote to determine the most important innovation.
In a similar vein, the Dec 23rd issue of the BMJ has an editorial on the significance of recent Web innovation, medicine and health information ["How Web 2.0 is changing medicine"]. Note: The author, Dean Giustini, is also a blogger at UBC Academic Search – Google Scholar Blog and the BMJ is the same journal that had the Google diagnostics article in November (see the November 11th THDblog post “‘Google Health’? Diagnosis is a keyword away“).
Scanning Giustini’s blog, I ran across this list of 5 intriguing medical podcasts from March 2006 ["Top Five (5) Podcast Websites in Medicine"].
Anaesthesia: Symbol of humanitarianism
Stephanie J Snow
By the end of the 19th century anaesthesia was proclaimed as one of the civilising factors of the Western world, and it remains today the most vivid example of medicine’s capacity to diminish human suffering. Anaesthesia continues to develop: muscle relaxants and techniques such as spinal anaesthesia have brought new benefits; anaesthetists have extended their practice to intensive care and management of chronic pain; and new inhaled and intravenous anaesthetic agents have facilitated the development of day case surgery. The detail of anaesthesia will surely continue to evolve. But nothing is likely to be as significant as the demonstrations by 19th century pioneers such as John Snow and James Young Simpson of the potential of anaesthesia to alleviate the pain of surgery.
The discovery of antibiotics heralded a dramatically new approach to infection control and health care, enabling nations to prosper and overturning the concept of health as a moral duty. Penicillin is the iconic antibiotic: its introduction into clinical practice was widely celebrated, and its benefits (protection against wound infection and a potential syphilis epidemic) were critical in Europe during and after the second world war. Antibiotics also dramatically changed health services in the postwar years. Fast throughput in general practice was possible because antibiotics could be swiftly administered or prescribed after a short consultation. Surgeons undertook more complex operations on patients now protected from infection. Now, the emergence of genomics has given rise to the prospect of selecting many completely new antibiotics.
From the 1950s, when chlorpromazine came into use, the numbers of inmates in asylums began to fall dramatically, and over the next few years antidepressants and antipsychotics arrived en masse. A new world of a truly biological as well as psychosocial psychiatry had begun. Without the discovery of drugs such as chlorpromazine, our modern, multiskilled mental health workforce might never have emerged. The modern emphasis on users of mental health services and their carers would have been impossible. The progress initiated by the discovery of chlorpromazine means that we can replace baggy terms such as paranoid schizophrenia with “temporal lobe hyperdopaminergia,” and we may yet eradicate the monsters of stigma and neglect that still beset mentally ill people.
Since the Stone Age we have evaluated, interpreted, calculated, and computed. With the human brain’s insatiable urge for self improvement, it began building tools to enhance itself. Thus, over the second half of the 20th century, we developed powerful resources to communicate unlimited amounts of knowledge and to change the way we learn, live, and heal. Computer technology can help us achieve optimal levels of health and wellbeing regardless of who or where we are. It can help us transcend our cognitive, physical, institutional, geographical, cultural, linguistic, and historical boundaries.
Watson and Crick’s 1953 report of DNA structure as a double helix and their recognition, at a stroke, of the digital basis of genetic information opened the floodgates to further discoveries. The most dramatic evidence of that flood is the human genome project, humanity’s biggest research endeavour, permitting rapid progress in linking gene sequence variants to thousands of genetic disorders. Identifying gene mutations in common diseases such as eczema exposes relevant pathogenic pathways and enables new interventions for these conditions. From human insulin to hepatitis B vaccine to trastuzumab (Herceptin), an understanding of DNA permeates myriad developments in treatment. The evidence already before us is dramatic but is nothing compared with the tsunami to come.
In a world without evidence based medicine, a boy with asthma might have his treatment changed every six weeks as new drug samples are dropped off at his doctor’s surgery. Most women with early breast cancer would still be undergoing mastectomy instead of lumpectomy and radiation. Now they can choose. Evidence based medicine is about making decisions that are based on the best available evidence, not dictating what clinicians do. The systematic synthesis of evidence is the foundation of all medical discoveries and of good clinical practice. The question has moved beyond “Why is evidence based medicine important?” to “Why is it not already a reality?” and “How can we all work together to make it a reality, quickly?”
Semmelweis’s work on hand washing and Lister’s antisepsis techniques helped to turn the germ theory of disease into clinical reality. The theory was eventually universally accepted after further work by Koch and Pasteur. These insights into the prevention and treatment of infectious disease moved us from a society at the end of the 19th century in which infection typically caused 30% of all deaths to one at the end of the 20th in which less than 4% of deaths were due to infection. The fall in childhood mortality profoundly affected family size and fertility. Our understanding of hygiene, sanitation, and pathology from the development of germ theory has done more to extend life expectancy and change the nature of society than any other medical innovation.
At the root of sophisticated 21st century medical imaging we find the chance discovery of x rays by Wilhelm Röntgen in a physics laboratory in the 19th century. The discovery led to an array of visualisation and interventional techniques that permeate modern practice. Imaging came into its own as an aid to surgery and evolved to modern digital radiology, such as computed tomography, which has transformed investigative medicine. X rays became a mainstay of cancer treatment, and modern imaging is now used to guide interventions such as angioplasty and stent insertion. Without x rays, doctors, like Röntgen, would be working in the dark.
Understanding how the immune system distinguishes host cells from “foreign” cells has made organ transplantation feasible, saving thousands of lives. Understanding the biological weapons in our immune system has resulted in antisera and monoclonal antibody technology. Monoclonal antibodies are used to diagnose and monitor disease, to ensure the quality of food and other biological materials, and to test for trace amounts of drugs and toxins. They have also been used to treat otherwise intractable diseases such as rheumatoid arthritis and to target anticancer agents precisely to the tumour: the “magic bullet” approach. More than a third of all drugs currently being developed by drug companies are monoclonal antibodies, and this technology will enable many more medical milestones to be reached.
Oral rehydration solution to replace the water and electrolytes lost through vomiting and diarrhoea was initially used only by paediatric specialists in tertiary referral hospitals. When it was tested in refugee camps in the 1970s, mortality fell dramatically. Since then this simple and cheap oral solution, given at home or in healthcare centres, has been integral to the World Health Organization’s diarrhoeal disease control programme. In the 1980s nearly five million children under 5 years old died each year from diarrhoea. In 2000 this figure had dropped to 1.8 million. Because oral rehydration has saved more than 50 million children’s lives over the last 25 years, a large chunk of the adult population in developing countries is alive today.
The pill offers women the ability to decide on their own whether or when to become pregnant, thus undermining the historical dominance of men in sex and reproduction. The repercussions of this have been cultural, economic, professional, and educational and have affected millions of people. No drug has had such an enormous effect on religion. The discipline of epidemiology has probably been improved more through the thousands of studies on the pill than through those on virtually any other drug. Moreover, the pill is the preferred method of reversible contraception in more than half the countries in the world. It is one of the few drugs to have remained essentially unchanged decades after its synthesis, a testament to its enduring value.
Two landmark studies in 1950 led to a growing body of evidence for the harmful effects of tobacco. Since then, the prevalence of smoking has fallen in countries where tobacco control is taken seriously. For all the money poured into cancer research in recent decades, most of the progress in reducing cancer mortality has been due to deaths avoided through successful tobacco control. Despite the efforts of the tobacco industry to fight back, smoking has been transformed from a pleasant, mannered pastime to a badge of low education, social disadvantage, and ostracism. The end game for smoking may well be just 20 years away in nations where smoking is currently in free fall.
In the 1800s acute infectious diseases that killed male breadwinners were a major cause of poverty. Believing that diseases were caused by air contaminated by poor urban drainage, governments built new sewage disposal and water supply systems. This revolutionised public health in Europe, and mortality from infectious diseases fell dramatically. Nowadays we know that better water supply and sanitation can cut diarrhoea among children in developing countries by about a fifth. The 19th century “sanitary revolution” shows that effective intervention does not always need accurate knowledge, that environmental measures may be more effective than changing individual behaviour, and that universal measures may be better than targeted measures in reducing health inequalities.
Tissue culture allows cells to be grown on an industrial scale, yielding vaccines and other biological products such as recombinant factor VIII for haemophilia. Tissue culture techniques have been crucial in the work of more than a third of the Nobel prize winners for medicine since 1953. Without cell culture we would lack vaccines against measles, mumps, and rubella and would still depend on much more expensive and reactogenic vaccines for polio, rabies, and yellow fever. We would be unable to karyotype patients with suspected genetic disorders or perform in vitro fertilisation. Monoclonal antibodies now used for diagnostic and therapeutic purposes would not be available. Gene therapy and the use of stem cells to repopulate damaged organs would be beyond imagination.
Vaccines have saved millions of lives and spared generations the suffering and long term consequences of infections. The vaccines we have today are grounded in the knowledge and techniques that Louis Pasteur introduced with his rabies vaccine. Pasteur’s breakthrough in 1885 represented the medical conquest of an untreatable disease. In the 20th century smallpox was eradicated, and in countries such as the United Kingdom once common childhood diseases such as diphtheria, whooping cough, measles, polio, mumps, and rubella are rare memories. As new vaccines and vaccine delivery systems continue to be introduced, there is no reason to suppose that the future of vaccines will be any less remarkable than their past.