Monday 23 July 2012

What next against the microbes?



Anti-microbial resistance is a rapidly growing problem and one that is going to hit future generations hard. Back in March this year, Margaret Chan, director-general of the World Health Organization, highlighted in a speech to ECCMID (the European Congress of Clinical Microbiology and Infectious Diseases) the risk of the last 80 years of medical advances being wiped out in one stroke as bacterial diseases such as TB and even E. coli infections become once again fatal. Maternal death rates would rise, and even a grazed knee could put a toddler at risk of serious illness. Procedures such as hip replacements and organ transplants could not go ahead. Even something as commonplace as an appendectomy would become a high-risk procedure.

So how do we slow this advance? What other options do we have? There are three main approaches being taken.

Firstly, the antibiotics that we currently have must be used correctly. This means using them in the right combinations and in the right situations. Antibiotic resistance was first noted in the late 1950s, when the drugs were used in vast quantities to promote livestock growth. In 1969, the Swann Report recommended banning such use of drugs also involved in human medicine. However, a decade is more than enough time for plasmids coding for resistance-conferring enzymes to enter the gene pool (a fast-dividing bacterium can pass through hundreds of thousands of generations in ten years), and the report’s recommendations were seen more as “voluntary reforms” than an outright ban. Ways around the ban were quickly found as more antibiotics were discovered.



In recent months, however, the FDA has been accused of withdrawing proposals made back in 1977 that would have banned the use of penicillin and tetracycline antibiotics in farm animals in the USA. In response to this accusation, a new ban on cephalosporin antibiotics was announced in January of this year. This ban affects only around 0.2% of antibiotics currently used on American farms. In Europe, all-encompassing bans have stood for well over a decade.

In more recent years, GPs have reported patients demanding antibiotics for colds (which are caused by viruses, so antibiotics will have, at best, a placebo effect) and, more frequently, failing to finish a course of antibiotics because they felt better after only a few days. Both situations lead to increased levels of resistance in populations of bacteria. Here the responsibility lies with individuals to use their common sense and simply read the instructions on the packet. However, it is surprising how frequently people fail to do so.

An important focus in the correct use of antibiotics is using the right drug for the infection presenting; this means vastly improving many current diagnostic techniques. This is crucial in tackling super-bugs. The drugs used to treat such illnesses tend to have many toxic side effects, and a bacterial strain risks developing further resistance if it is repeatedly exposed to a less than fully effective antibiotic.

Currently, the average, well-supplied hospital in the developed world takes around 48 hours to fully identify a bacterial strain (samples must be taken, sent to a lab, cultured and then returned with information about the strain and the appropriate drugs). A new approach, discussed at ECCMID, was the promotion of “labs on a chip” technology (such as that already in use for AIDS or liver disease diagnosis), which would allow treatment to be tailored to the infection much more efficiently.

The second area is the development of new antibiotic drugs. In the four years between 2008 and 2012, only two new antibiotic drugs were approved for human use, whereas between 1983 and 1987 the FDA (the USA’s Food and Drug Administration) approved 16 new antibiotics. The problem really lies (as it often does) in the money. A course of antibiotics is usually taken for a few weeks, whilst a drug to lower cholesterol or to increase longevity will be taken for life. The comparative profit in antibiotics is small.

A complete overhaul of the drugs industry would be ideal, whereby the financial rewards are based on the usefulness of the drugs rather than the quantity sold. This is unlikely. However, the pharmaceutical company GlaxoSmithKline announced last year that it is involved in a $40 million contract with the Biomedical Advanced Research and Development Authority and the Defense Threat Reduction Agency (two US government agencies). It seems that the potential threat of bioterrorism is enough to get people putting up the cash.

Finally, as always, it is the simple things that make the difference. Proper hand washing in hospitals, going back to the basics of regulated sterilizing techniques, has seen an 80% fall in cases of MRSA infection in NHS hospitals since their peak in 2003. Infection control had become lax through over-reliance on the power of antimicrobial drugs.

Multi-drug resistance is not a problem that will go away; it will become a greater and greater issue for future generations. But many agencies and organisations are now working on solutions and new approaches to give us more time for the next Fleming to revolutionise medicine again.


Published in The Yorker, October 19 2012

Saturday 7 July 2012

Censorship in Science


Earlier this week, it came to light that Dr George Murray Levick, a scientist working with Captain Scott’s Antarctic expedition, had his observations of the unusual sex lives of Adelie penguins censored. This censorship was twofold: by Levick himself, who wrote his findings in Ancient Greek, and by those who published his paper, “The Natural History of the Adelie Penguin”, who omitted those sections.

The reasons for this were simple: his observations and graphic descriptions of the abnormal sex acts occasionally performed by the younger male penguins were deemed too shocking for the Edwardian public. In contrast, the BBC and numerous other media outlets appear to feel that the modern-day public is desensitized enough to learn about penguin necrophilia. The original publication has almost become an ornithological Lady Chatterley’s Lover.

Censorship of scientific data and discovery is not a new phenomenon, be it by the church or, more recently, by the state. The most infamous example occurred in 1633, with Galileo and the Catholic Church. The story of Galileo Galilei and his support of Copernicus’ heliocentric model of the solar system is now taught in schools as an example of science proving blind faith wrong.

Today, the church, in all its forms, has long since lost the power to suppress discoveries it disagrees with, and the internet has taken freedom of information to a level that even our parents’ generation would have struggled to predict. Nowadays, ideas and discoveries are not considered dangerous simply because they change a long-held view of the universe; astronomy is no longer a threat to any country’s stability. Instead, it is the fields of defence research that come under scrutiny.

The most recent examples of government censorship of science can be seen in the USA, as a result of heightened national security post-9/11. One of the most publicised occurred in 2011, when the National Science Advisory Board for Biosecurity (an American federal organisation) requested that key information on the methods behind the modification of a flu virus be removed from a scientific paper. The justification was born of the partially exaggerated fear that the information contained within might be used to create an extremely deadly strain that could be utilised as a powerful bioweapon.

Some may argue that governments could be seen as justified in their attempts to keep discoveries and research in fields such as virology and nuclear weaponry quiet. The line between breaking a censor’s rules and committing treason can become very narrow indeed.

A noted demonstration of this occurred in 1953, when the issue of censorship ceased to be one of contemplation for two American communists. For Julius and Ethel Rosenberg, censorship had become an issue of life and death: they became the first US citizens to be executed for espionage, after passing information about the construction of atomic bombs to the Soviet Union.

It would be naïve to assume that censorship is only performed by religious organisations or governments. In reality, the media is also to blame: the larger and more established scientific journals, such as Science and Nature, discreetly request a kind of confidentiality from all their contributors; essentially, a subtle form of self-censorship. Many feel that information that could pose a threat to the general population should remain unpublished. In fact, this form of self-censorship is likely to have had much more of an effect than any government-mandated enforcement.

Upon reflection, history does appear to show us that, in every field of science and the arts, attempts at censorship will always eventually fail. Sometimes the truth is too big to be contained, sometimes people gossip too much to keep secrets. Eventually, the truth will out.


Published in Nouse, June 20 2012

Bioengineering: a brave new world?


It seems to be alarmingly socially acceptable to parade a lack of scientific or mathematical understanding, yet the same attitude towards literature and the arts is ridiculed. An arts student can comfortably say, “well, I barely passed my GCSE maths” when faced with dividing a restaurant bill, but an undergraduate scientist claiming “I only got a D in English Literature” when stumped by a pub quiz question on the author of Romeo & Juliet is very brave indeed. So why is it acceptable to lose all scientific and mathematical know-how during the post-GCSE booze-up, whilst science students are required to have a general knowledge of the arts?

This subject is covered in great depth, and with greater degrees of anger, all over the internet, so I shall refrain from going into it further. Instead, I offer a possible degree of balance by providing a background understanding of the science seen in the media and in literature.

In 1932, Aldous Huxley published his seminal work, Brave New World. Generally filed under that grand umbrella title “science-fiction”, it is a thought-provoking work of satirical fiction. Its dystopian future features developments in reproductive technology, brainwashing, sleep-learning, and the placation of the masses with the wonder drug soma.

But from a biological perspective, the stand-out technology Huxley introduces is the concept of bioengineering. We now live in an age where our ever-increasing knowledge of our genome and development could (and will) rapidly give us the ability to design perfect children, clones of lost loved ones and, perhaps eventually, the perfect workers.

So how does this work? Well, to start with, we can consider the processes necessary to create transgenic mice – mice whose genomes contain genetic information introduced artificially from another organism. These mice were first created nearly 40 years ago, in 1974. Transgenic animals are used to produce therapeutic drugs for humans, and Tracy the Sheep (no relation to Dolly) was one of the first. Created in 1990, she produced a protein called AAT in her milk, due to the insertion of human DNA into her genome. This protein is used to treat patients suffering from cystic fibrosis. Tracy made it to the ripe old age of seven and inspired the cloning of Dolly.

One method of creating transgenic animals is the Embryonic Stem Cell Method. The foreign DNA is introduced to embryonic stem cells of the target species by processes such as electroporation (the cells are zapped with a quick burst of electricity which creates tiny holes, allowing the genetic material to enter) or by treatment with specific chemicals.

These stem cells are then inserted back into the blastocyst (a very early-stage embryo), the whole thing is implanted into a foster mother, and the resulting animal is born normally.

The animal produced here is considered chimeric – think of the Greek myth of a creature made up of parts of different animals. Some of its cells will have accepted the foreign DNA; some will not. Conventional breeding techniques, which have been in use for hundreds of years, can then be used to produce offspring that are fully transgenic.

So what is the difference between creating a mouse or a sheep with foreign DNA and doing the same for a human? Could we make a human with an ape’s strength or, more excitingly, could we create glow-in-the-dark babies?

Frankly, there is very little difference from a purely biological point of view between creating fluorescent puppies, as done in 2009 in South Korea by Byeong-Chun Lee, and doing the same with a human. Time and money are factors; it takes many more years and a great deal more lab space to raise a human to maturity than it does a mouse. Ethical concerns, on the other hand, play a greater role.

So are Huxley’s visions of production lines of children, each carefully specified for the role they will play in society, unrealistic? Well, yes. Scientists are people too; we have the same ethical concerns as the rest of you. But is it impossible? Certainly not. Bioengineering is a rapidly growing field – and who knows, perhaps someone out there with enough funding really does want to create a Brave New World all for themselves.

And, as for the book, well, I think it is a damn good read.


Published in Nouse, May 1 2012