Tuesday, January 19, 2010

COMPLEXITY 4

This is the fourth in a series of posts on the subject of complexity in scientific research. The theme of this collection is that scientific progress, particularly in the realm of healthcare, has declined as a consequence of the high complexity of software and other technologies.

-- POST BEGINS HERE --

In the prior post, I discussed the lack of tangible progress in medical research over the past few decades. The general perception that basic research advances are not yielding clinically useful medical breakthroughs has inspired the "translational research" rhetoric currently spewing from funding agencies. Is it possible that the current generation of medical researchers has made no progress whatsoever? Well, maybe there were a few bright spots. Here are some of the major breakthroughs in medicine that have occurred since 1960.

1. Zinc supplementation drastically reduces childhood deaths from diarrhea, a disease that kills 1.6 million children under the age of five every year (1).

2. Helicobacter pylori causes gastritis, gastric ulcers, and some stomach cancers (2). A simple antibiotic treatment cures gastritis and reduces the incidence of stomach cancers (3). This work earned its two discoverers, Barry Marshall and Robin Warren, the 2005 Nobel Prize.

3. When babies sleep on their backs, instead of their stomachs, the incidence of SIDS (sudden infant death syndrome, or crib death) plummets (4).

4. Daily aspirin ingestion seems to reduce deaths from cardiovascular disease and colon cancer (5).

The most significant medical advances of the past few decades (and there haven't been many) have been simple measures. All of the great debacles in medicine have been complex. This is no coincidence: scientific methods have reached a level of complexity that nobody can fully understand.

Gone are the days when a scientist could describe a simple, elegant experiment (on a mouse, a frog, or a few easily obtained reagents) and another scientist could, in a matter of a few hours, repeat the process in his own laboratory. When several laboratories perform the same experiment, using equivalent resources, and produce similar results, it is a safe bet that the research is valid, but we seldom see that kind of validation anymore.

Today, much research is conducted in a complex, data-intensive realm. An individual study can cost millions of dollars, involve hundreds of researchers, and produce terabytes of data. When an experiment reaches that level of cost and complexity, repeating it in a different laboratory becomes impractical.

[1] Walt V. Diarrhea: the great zinc breakthrough. Time, August 17, 2009.

[2] Warren JR, Marshall BJ. Unidentified curved bacilli on gastric epithelium in active chronic gastritis. Lancet 1:1273-1275, 1983.

[3] Kidd M, Modlin IM. A century of Helicobacter pylori: paradigms lost-paradigms regained. Digestion 59:1-15, 1998.

[4] Vennemann MM, Fischer D, Jorch G, Bajanowski T. Prevention of sudden infant death syndrome (SIDS) due to an active health monitoring system 20 years prior to the public "Back to Sleep" campaigns. Arch Dis Child, January 6, 2006.

[5] Writing Group; Hennekens CH, Dyken ML, Fuster V. Aspirin as a therapeutic agent in cardiovascular disease: a statement for healthcare professionals from the American Heart Association. Circulation 96:2751-2753, 1997.

-- TO BE CONTINUED --

© 2010 Jules Berman


My book, Principles of Big Data: Preparing, Sharing, and Analyzing Complex Information, was published in 2013 by Morgan Kaufmann.

I urge you to explore my book. Google Books has prepared a generous preview of its contents. If you like the book, please ask your librarian to purchase a copy for your library or reading room.

Jules J. Berman, Ph.D., M.D.
tags: big data, metadata, data preparation, data analytics, data repurposing, datamining, data mining, informatics, complexity, jules j berman, medical history