Monday 19 November 2012

Morgellons, Messiahs and Mass Hysteria.


Over the course of the last 14 months, 15 people (mostly teenage girls) in the town of LeRoy in New York State developed Tourette's-like twitches and vocalizations. After an exhaustive series of diagnostic tests, it was concluded that these occurrences were most likely the result of conversion disorder.
Conversion disorder, formerly known as hysteria, is a psychiatric disorder that causes sufferers to display neurological-like symptoms of twitching, fits and spasms.
An example of Jerusalem syndrome ©Wikimedia Commons; Image Credit: Ja
The cases are examined in the Channel 4 documentary "The Town That Caught Tourette's". Many of the girls felt let down by the diagnosis, thinking "this can't all be in my head", and sought out other, even less probable answers.
There are many similar occurrences throughout history and the modern world, and many more diseases and disorders that we are yet to understand. Here I am going to have a look at some of the odder cases:

Morgellons Disease

First reported in the 17th century by the physician Thomas Browne, Morgellons presents as aches, fatigue, skin lesions and the sensation that insects or fungal roots are growing beneath the skin. Fibres are "seen" to be growing beneath the skin surface.
The disease has shown a sudden boom in cases, thought to be due to the internet - a meme sickness, as it were. However, in recent months, researchers and doctors have begun to report strange findings within the skin of sufferers, possible markers of an actual fungal infection. Cases have also begun to be reported in infant children, who are not normally susceptible to psychosomatic disease.
Whether Morgellons is an actual disease or a "delusional parasitosis" - a case of the brain becoming convinced of an infection - is still up in the air.

Alice in Wonderland Syndrome

Also known, less catchily, as Todd's Syndrome, this is a disorder that affects the mind's perception of scale. The visual cortex stops processing scale correctly, so larger things seem small and vice versa. Sufferers report feeling taller than a tree or too small to reach up and sit on a chair.
Normally temporary (an episode can last anywhere from one minute to a whole day), and presenting mostly in migraine sufferers and young children, this syndrome could give us interesting insights into how the brain deals with self-perception and scale.

"Strawberries With Sugar virus"

Back in 2006, in an episode of the popular Portuguese teen show "Strawberries With Sugar", the characters and their school were struck by a life-threatening virus. Shortly after, over 300 pupils at 14 separate schools came down with dizziness, rashes and shortness of breath, all symptoms of the fictional disease.
Many of the schools were forced to shut down temporarily and, coincidentally, reschedule some state exams that had been due to take place.
After much investigation, the Portuguese National Institute for Medical Emergency concluded that the cases were simply due to mass hysteria.

Jerusalem Syndrome

An odd phenomenon has been seen in tourists visiting the city of Jerusalem: otherwise perfectly ordinary people, normally middle-aged men, spend time in the city and suddenly begin to dress in old bed sheets and declare that they are John the Baptist.
That is a vast oversimplification, but Jerusalem Syndrome is a well-studied occurrence, with numbers of tourists every year suffering from religious psychosis and delusions that evaporate as soon as they are removed from the city.
The syndrome even got a mention in The Simpsons; in "The Greatest Story Ever D'ohed", the family visit Jerusalem and Homer becomes convinced that he is the Messiah.

Dancing Plague of 1518

An oldie but a goodie. The Dancing Plague began in Strasbourg when a woman named Frau Troffea began to dance in the street. She didn't stop moving until she died of exhaustion and dehydration four days later. By that point, 30 or so people had joined her, and a month later over 400 dancers were out in the street. Many of the afflicted died from heart failure or strokes.
Records documenting the dancing deaths make it very clear that this was not twitching or convulsing: "although they were entranced, their arms and legs were moving as if they were purposefully dancing".
Over the nearly 500 years since, many explanations have been offered. Everything from divine inspiration to ergot poisoning (think the organic version of LSD) has been blamed.
Most likely, though, this is another case of mass hysteria and stress-induced psychosis. These people had been living in famine conditions for at least seven months, in an area riddled with disease and horrific living conditions; it is possible that something just snapped in the population as a whole.


Published in The Yorker November 17th, 2012

Friday 12 October 2012

The Shotgun Approach to Applying for a PhD or Masters


About to start the final year of my university degree, I am beginning to panic. It is not the massive increase in workload or the dreadful prospect of finals that is getting to me, but the huge and terrible decision that I now have to make.
What should I do after I graduate?
I have always – perhaps with the occasional lapse when the idea of being a journalist flitted by – wanted to go on to postgraduate education, partly out of fear of entering the world of work, but mostly because I really do enjoy research and I find the lab work side of things suits me well.
So that brings me to the next step: applying to the PhDs and Masters courses that I want to take.
The logical step for a biochemist on a 2:1 from a reasonably decent institution would be to do a Masters degree, taught or research-based, before applying for a PhD. This plan, however, assumes that I either have the money or am willing to take out a (real, actual) bank loan to cover one year of education that’ll cost anywhere from £5,000 to over £11,000 in fees alone. All that for something that may not even be necessary for me to proceed in the direction I want.
Well then, how about applying straight to PhD?
In fact, that is the approach I am taking. But there are so many PhD placements, and so many of them look so good. The entry requirements do seem to vary somewhat, but most are pretty similar. Here, money is less of a problem: your studies will be funded by various research councils and funding bodies.
At around £50 a pop, with CVs, personal statements and all sorts of transcripts to send off, applying for a PhD isn’t a quick job. This isn’t the sort of thing that “they” recommend doing in an offhand way: focus on one or two applications and do them well, don’t just fire out CVs all over the place.
And yet, I am going to apply for all the ones I like. And at least one Masters course, almost as a backup.
Better safe than sorry.

Saturday 6 October 2012

What Is The Point Of Acupuncture?


A recently published meta-analysis, available online at www.archinternmed.com, has studied the effects of acupuncture on 17,922 patients suffering from chronic pain conditions such as osteoarthritis and chronic headaches.

The randomised controlled trials (RCTs) compared the effectiveness of acupuncture against no treatment and "sham" acupuncture. Only studies that were perceived to be unbiased and that followed the patients for at least four weeks post-treatment were considered.

An acupuncture practitioner in action ©public domain; Image Credit: Kyle Hunter
Across all the pain conditions, patients who received acupuncture reported significantly reduced pain after four weeks compared to those who received no treatment. Patients who received a sham version of acupuncture (for example, using needles that retracted rather than pierced the skin) also reported reduced pain, but to a lesser extent; the difference between real and sham acupuncture was statistically significant.

This difference in effectiveness could well be due to the difficulties in conducting a true double-blind trial that involves sticking needles into people. In all likelihood, many patients will have noticed that their skin was not actually being pierced, and obviously the therapists would be aware that the needles were false. Or at least, one would hope that they would have noticed.

Smaller differences in acupuncture benefits were seen in trials where the patients received ancillary treatments such as physiotherapy and gentle exercise programmes led by physical therapists.

These results show that, whilst acupuncture is effective as a treatment against chronic pain, there are many additional factors to be taken into consideration; for example, the relationship between the acupuncture therapist and the patient likely has an impact on the well-being of the patient. Most people feel better after spending an hour or so getting one-on-one treatment of any sort.

This is good news for sufferers in the UK, as acupuncture treatments on the NHS are being offered as a viable alternative to strong painkillers, which are often addictive and carry many side effects from long-term use. Unfortunately, the NHS offers acupuncture as a treatment for many other things, and the placebo effect will only take you so far.

In comparison to this is an investigation conducted by the National Patient Safety Agency, which reported adverse effects following 10% of acupuncture treatments. Whilst most of these cases were reports of nausea and fainting, some patients reported having needles left in them for hours (long after the therapist had left), with some needles having to be surgically removed.

A few reactions have been more severe. Edzard Ernst of the University of Exeter reports that studies of acupuncture in other countries, and outside of the NHS in the UK, have recorded 86 deaths in the last 45 years due to lung collapse after acupuncture. Whilst these cases are extremely rare, it does suggest that the NHS should refrain from offering acupuncture as a treatment for everything from anxiety to nicotine addiction (as it currently does) until a very large-scale RCT can be conducted that shows a benefit large enough to outweigh even the slightest risk.


Published in The Yorker October 5th 2012

Thursday 4 October 2012

Mathematical Inquiry


In July, the House of Lords science and technology committee published an inquiry into higher education in STEM subjects (science, technology, engineering and mathematics), looking at, among many other things, the falling uptake in the UK of the subjects traditionally regarded as “hard” sciences, in comparison to many other countries.

Lord Willis of Knaresborough (who chaired the inquiry) suggested that the biggest problem lies not with a lack of interest but with a lack of mathematical skill. The UK now lies 28th in the world in school leavers’ maths skills, and we are falling further every year. Students who do not take maths at A level cannot apply for many STEM courses, and find the study of most sciences a much larger leap at degree level.

I did do maths at A level, though it was not my strongest subject, because I wanted to go on to study biological sciences. Armed with this A level, I was able to apply to all the universities of my choice. Now, though, many STEM courses at good universities do not require an A level grade in mathematics, not because it is unnecessary, but because there are not enough students applying with this qualification to fill the universities' courses.

And where A level maths is necessary to apply for the course, some universities are forced to offer remedial maths to all STEM students. Friends studying medicine at top universities report being alarmed by the complete lack of maths skills in their fellow undergrads. This tells us that, for all that an A* at A level is worth, the current school maths curriculum is not doing what it is meant to do, and forcing more uninterested pupils onto it will not solve anything. I for one can vouch that no one should ever be made to slog through Statistics 3 against their will.

Instead of a compulsory boosting of numbers, the teaching itself must be changed. Last year, Government statistics on 140,000 secondary school teachers showed that nearly a quarter of maths teachers lacked degree-level maths. I once spoke to a woman who taught maths at a local secondary school; her qualifications in the subject began and ended with a B at O level. She confided in me that she hated maths, but that no one else at the small school was willing to teach it either. No student is going to learn from a teacher who is uncomfortable teaching the subject.

The curriculum at GCSE needs an overhaul first. I can't remember much from my year eleven maths lessons except playing games, eating sour skittles and being incredibly bored by the entire thing. Barely anything we learnt had real-world applications, and most of it had been taught by rote year on year since the start of secondary school. Maths post-GCSE was seen as intimidating, a massive leap up from what we had done so far.

And, in a way, it was. Yet still the lessons were based around memorizing equations and mechanical rules; very little real-world problem solving was done. Days when we tried old Oxbridge entrance papers, or the sixth form maths challenge, were rare, because these are not a test of how many geometry equations you can memorize but a challenge of logic, understanding and comfort with numbers, none of which was necessary to pass the course.

This is where the problem lies: not in the numbers of pupils, but in the core of the curriculum - students are leaving school with no skills to apply their maths. Base the A level around the maths students will actually need, give pupils something to think about and to apply to situations that they come across, and more people will begin to feel comfortable with the subject and will take that with them into higher education.

Tuesday 2 October 2012

Musings of a lab rat


I am trying to be grateful for what I've got...
Over 24 weeks of lab work experience outside of university is a great thing to have on your CV but, frankly, I'd like to have my last three Augusts back. See, life as an undergraduate (and even pre-university) lab rat is very rewarding, good experience and all that rot, yet I can't help but feel somewhat fed up with the whole business.
Perhaps a career in science is not for me, or, more likely, weeks of sitting about, staying out of the way and reading the BBC news website is not an accurate experience of the world of a research lab. My summers of 2010 and 2011 are a hazy blur of protein preps, DNA gels and crystal tray after crystal tray. At no point could I honestly say I fully understood what I was doing and why. I did enjoy it though, the procedures and visible results.
An Eppendorf centrifuge is one of the many machines that will try and kill you. Just make sure that you close the lid ©Wikimedia Commons; Image Credit: Rockpocket

Admittedly, this summer has been an improvement. Currently I am in week six (of eight) and, whilst all my projects are crashing down around my ears, I have been able to stretch my scientific wings a tad more, as it were. I have my own projects, am left to my own devices and can make my own decisions as to how best to proceed. The subsequent lack of success so far is possibly a result of this.

In all honesty, this new-found laboratory freedom is more likely a consequence of the assumption that I, having now completed the second year of a Biochemistry degree, must know what I am doing, rather than any display of competence on my part. However, it is best not to think too hard about such things and stick to doing what I do best (muddling on regardless of all else around me).

Yes, I still follow the post-grads around with my notebook out, begging for answers to questions such as “Why is my cloning not working?”, “What have I grown on this plate?” and, most importantly, “Why is that machine beeping at me in such an angry way?”*, but some problems I can now finally tackle for myself. Key among these are what to do next to keep my slowly sinking projects above the water line slightly longer, and when it is acceptable to suggest a tea break to your colleagues (any time but lunchtime, it seems). So for the next couple of weeks, I shall proceed as best I can, googling my way through experimental protocols and depending alarmingly on Wikipedia.

*Answers, in no particular order: “Because you've screwed up”, “Because you've screwed up” and “I don't know, but kill it now”.


Published in The Yorker September 19th 2012

Tuesday 11 September 2012

What kind of quackery is this?


Jeremy Hunt, believer in homeopathy and the integrity of Rupert Murdoch, is the new Health Secretary. This has worried a lot of people.

Back in 2007, Hunt signed a parliamentary Early Day Motion which supported the spending of NHS money on homeopathic “medicines”. The idea that the NHS could ever phase out clinically tested drugs in favour of overpriced sugar pills took an alarming shuffle towards reality.

Arguably, the concept of the Health Secretary supporting homeopathy is just as ridiculous as the Environmental Secretary Owen Paterson being a climate change “sceptic”. Oh wait…

Tory cabinet reshuffling aside, homeopathy is one of the largest fields of alternative medicine. It is also the subject of ridicule amongst scientists, and angry defensive statements from its supporters.

Homeopathic remedies involve dissolving a substance (anything from snake venom to charcoal) in ethanol, then diluting this into water many, many times, tapping the container of solution against a hard object, such as a Bible or a leather-covered paddle, between dilutions. The hypothesis is that the water molecules “remember” the initially potent substance, so the pill is apparently effective no matter how dilute it becomes. This flies in the face of all known laws of chemistry and physics.

What limited effect homeopathic pills have is down to the placebo effect. In a case of mind over matter, the brain tells itself that an injection (though in fact saline) was a painkiller, and so some of the pain felt is ignored.

Homeopathic remedies have never been shown to be effective in any large, placebo-controlled randomised clinical trial (a test all medicines must go through). One placebo is as good as another.

There is an anomaly though, one brought up by many supporters of homeopathy whenever their particular beliefs are challenged. Known as the Belfast Study and published in 2004 in Inflammation Research, it saw a group at Queen’s University look into the reaction of basophils (human white blood cells involved in inflammation) to ultra-dilute concentrations of histamine.

A dilution of 10^-38 M (such as that used in homeopathy) will, in all likelihood, not contain a single molecule of histamine. And yet the claim was made that the cells reacted as if histamine was present.
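
To get a sense of just how empty that solution is, here is a quick back-of-the-envelope check (a minimal Python sketch; the 10^-38 M concentration is the figure quoted above, and Avogadro's number is standard chemistry):

```python
# Expected number of histamine molecules in an ultra-dilute remedy.
# Assumption: the 10^-38 mol/L concentration quoted above for the Belfast study.

AVOGADRO = 6.022e23      # molecules per mole (standard constant)
concentration = 1e-38    # mol/L, the dilution quoted above
volume_litres = 1.0      # consider one litre of the remedy

expected_molecules = concentration * AVOGADRO * volume_litres
print(f"Expected molecules per litre: {expected_molecules:.1e}")   # ~6.0e-15

# Volume of remedy you would need before expecting to find a single molecule:
litres_per_molecule = 1 / (concentration * AVOGADRO)
print(f"Litres needed for one molecule: {litres_per_molecule:.1e}")  # ~1.7e+14
```

That works out at roughly 170 cubic kilometres of water per molecule of histamine, which is why the reported result caused such a stir.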

The placebo effect could not be at play here: individual cells are not sentient beings with an unquestioning faith in a particular treatment.

The anomalous paper makes for interesting reading and a good case for “weirdest scientific phenomenon outside of physics”, but it is certainly not a reason for people to get over-excited and start putting their health in the hands of a little bottle of over-priced sugar pills.

In fact, many further studies and attempts to recreate the workings of the initial Belfast study have found no such behaviour, the most notable of these follow-up investigations involving BBC Horizon and James Randi, the famous skeptic. Until one succeeds, it’s just a curious anomaly worthy of some further study.

Furthermore, until a homeopathic treatment can be shown in a full, unbiased clinical trial to be as effective as any current medical treatment, the idea that the NHS should fund such a therapy is utterly ridiculous.

Friday 7 September 2012

I want to be a writer…


More specifically, I want to be a science writer.
Neither of those two sentences comes across well in front of a careers advisor. They conjure up awkward questions like “what have you written?” and “how much work experience have you got?”
Unfortunately my answers are the same as everyone else’s: a few things in the student papers, and little, bordering on none. This does nothing to help me stand out from the crowd.
And so I am now trying to write for anything that’ll take me. Blogs, mini-articles, really long articles. Anything and everything. My portfolio is, gradually, expanding; although the view count on my blog remains embarrassingly low.
It is that coveted journalism work placement that I am after. And now it is crunch time. In just over six months, I will no longer be an undergraduate. A terrifying thought. My CV is remarkably bare, and I haven’t updated it for over a year.
But all is not lost. The internet, as always, has come up trumps. I read an article written by a man named Ed Yong, in which he told how he and his colleagues became science writers. Their routes were widely varied and, in many cases, circuitous. Some took formal training; others had had no previous writing experience. Many took years to get a steady job; others landed one straight from university. Very few of the writers managed to live up perfectly to the life plans that they made aged 18.
This is the point I’m trying to make. It doesn’t matter where you are now, as long as you are focused on where you want to be. Don’t make a set and steady plan for yourself; having a few goals and targets is good, but life is not predictable.
And, most importantly, keep updating that CV; you never know when you’ll next get a chance to send it off.

Published grads.co.uk September 2012

Wednesday 5 September 2012

House-mates and Hazmat Suits

As part of the generation at university now, it is increasingly likely that we are all going to be sharing rented houses until we are well into our twenties and, for the unfortunate few who decide to remain in the world of academia any longer, all the way to our thirties. From groups of undergraduate friends, to postgraduate acquaintances, to employed professionals whom you barely know: house-mates can vary wildly.


Sources of contention between house-mates are just as wide-ranging. Fights can break out over anything from who sits where in front of the TV to why someone feels the need to hide all the teaspoons in their wardrobe. But these rows can really be boiled down to three basic root problems: money, food and cleanliness. Everyone has different ideas as to how to manage these three minefields, based on their own upbringing and personality. And, as always, their way is the wrong way when compared to yours.

All of these issues, as with any problem, should really be tackled sooner rather than later, but they rarely are. Instead, house-mates often sink into passive-aggressive note writing, and the light bulbs that break in the first week never get replaced. Some really good advice can be given on handling these issues, though it is unlikely to be taken. Instead, here are my three top tips for dodging your way through the bigger issues (relatively) unscathed.

Money – Sort this out early on, as it’ll only get more complicated as time goes by. Bills are the biggest spend in a shared house, so it is important to make sure that they are all covered and not in just one person’s name. A responsibility shared is a damn good way of making sure fewer people successfully avoid paying their bills.

The bills themselves can either be dealt with as they come, or everyone can pay a little money into a kitty once a month and the bills be paid out of that lump sum. This latter method has the added advantage that any money remaining gets split between you on moving-out day. As for shared shopping (covering everything from toilet roll to light bulbs), always get a receipt but try not to niggle over the pennies. Ten pence here or there will not bankrupt you.

Food – Don’t eat anyone else’s food. Simple. Ideally this would mean that they will refrain from eating your cereal, bread, fruit and precious, precious Nutella. That is, however, not how the world works. If you know who ate your last chocolate pudding, it is best to ask them to cease and desist with their eating habits as soon as possible. This may work, but you will be better off buying a mini-fridge and hoarding anything particularly yummy in the safety of your own room.

Cleanliness – Now, many people recommend setting up a household rota for chores such as vacuuming and scrubbing the shower as early on as possible. If your house-mates are all sane, reasonable humans then this should work; you would also be one of the lucky, lucky few with normal, sane house-mates. The rest of us aren’t so fortunate. Instead, do your best to keep your own personal area tidy (bedroom, kitchen cupboard and the like) and, in extreme cases, you may find yourself having to keep your own stash of crockery to save it from being left to rot in the washing-up pile.

Some people will have just left a home where their mothers cleaned everything and tidied up after them; these people will need help in learning how to use a vacuum cleaner and will not be aware of what actually constitutes washing a plate. They may even be useless layabouts, but try to avoid becoming the one person who always does all the washing up, because you will be taken advantage of. However, remember that it is not below you to rinse out someone else’s coffee mug on occasion; they may then do the same for you.
The one thing that must be maintained in a shared house above all else is communication. Don’t sulk in your room because someone is playing their music too loud or has left two-day-old takeaway in the sitting room. Go and ask them to turn it down or pick it up. They will not resent you for it, and it will prevent your soul from becoming pickled in a sea of hatred for your house-mates. If the worst comes to the worst, you will all be able to bond over a shared hatred of your landlord.


Published Grads.co.uk August 2012

Thursday 23 August 2012

Belief Under The Microscope


Humans, as a species, show many traits that, whilst exhibited individually in some other species (tool-using crows, communicative dolphins, altruistic meerkats), are not all exhibited together in any other species.

But what is it, of this combination, that makes us “human”? All cultures across the world show three big ideas that come together, wired into our brains as they develop: complex language, music and, of course, religion.

The origins of religion were not, until quite recently, ever given much deep consideration, possibly because of the vaguely taboo nature of the subject. The idea that we may be evolutionarily hard-wired to have faith in divine, all-knowing beings can raise hackles on the side of both the theists and the atheists.

This hinges on the idea that religion arises simply as a result of the way the human brain works. Children (especially younger infants) are a good way of investigating the “default settings” of the human brain, and do show a strong tendency towards believing in gods, or at least in stronger powers at work.

Paul Bloom, a psychologist at Yale University, suggests that this is because the human mind uses distinctly separate cognitive systems when considering inanimate objects (boxes, trees and the like) and things with minds – or at least, that can move under their own free will.

Show a five-month-old child a person moving in a stop-start way about a room and they will be content, but a box moving in the same pattern will elicit a surprised response. Even babies know that boxes cannot move around by themselves; something with a will must be behind this phenomenon.

It was this reasoning that early humans would have applied to things that they did not understand. Something must be causing the rustling in the bushes; where else could it come from? The early ancestors who thought it could be a monster ran away. What mind is behind lightning, who is creating it? Some of the first humans probably thought it would be a good idea to be nice to something that powerful, and perhaps leave some food out for it just in case it came for them.

Another key tendency is to attribute purpose or design to inanimate objects. Ask a small child why the sea exists and they reply “so the fish have somewhere to live”. It makes much more sense to think “these berries have been made for us to eat” rather than “through millions of years of trial-and-error adaptations we have evolved to be able to eat these berries”. In modern times, we still apply human characteristics to animals or technology (albeit in a more knowing way). Our pets have personalities, and our computers only crash when they know we've nearly finished the articles we are writing.

Pre-school children, when interviewed about why things exist and where they came from, are more likely to suggest that life and the world were created by a higher, unseen power than by humans, and incredibly unlikely to suggest any theories that would necessitate an understanding of long periods of time.

If a human can create a pot out of clay, surely a more powerful being with an endless supply of clay could create the mountains?

The concept of “common-sense dualism”, whereby we briefly accept things we know aren't true in order to plan for “what-if” situations and to empathise with others, plays a key role in our acceptance of disembodied minds.

By the age of four, over half of all children will have had an imaginary friend. These can be friendly or malignant, human or animal, but they are widely regarded as a way that young minds learn empathy and learn to think for people other than themselves. Evolutionarily, those who learnt those skills would have gone on to be more successful in early societies.

In adults these traits can still be seen. In a non-religious setting, this can be anything from maintaining a mental relationship with a dead loved one to creating a fantasy life, often with fictional characters or celebrities. In these cases, the desired behaviours in the character are projected by the believer. It is suggested that, in believing in gods, adults are simply projecting a series of assumed behaviours onto another imaginary being. People with autism or autistic traits are less likely to believe in gods and rarely have imaginary friends in childhood, as they find it harder to imagine what another would be thinking.

But these traits all have an awareness built in. People know that they aren't actually speaking with their long-departed grandmother, and only the very delusional truly believe that Ryan Reynolds is their boyfriend. So how do beliefs in gods avoid this rationalisation?

This is where organised and group religion comes into play. You believe what everyone else believes; no one tells you otherwise. Religion-as-an-adaptation, as it were. The shared religious beliefs of a group of our ancestors would bring the tribe closer together, cooperating better in hunting and food gathering. This would thus allow the religious group to outcompete the other tribes.

So religion may have started and spread as a by-product of our evolutionary success. Does this invalidate it?

Possibly, but many could argue against that. Religious beliefs are one of the things that set us far apart from the other animals. They may have given us the first moral law systems and the first large buildings; some early tribes in the Middle East have been shown to have come together for worship. They have also, however, created wars and genocides, stalled scientific progress and caused all sorts of discrimination.

Have we out-evolved religion now? Take a look at the world, and it is pretty obviously not the case. Are we now a rational enough species to see the source of these beliefs and make the most good of what we have done so far? I'd hope so, but we do have a long way to go.


Published in Spark October 2012 (yorkspark.co.uk)

Monday 23 July 2012

What next against the microbes?



Anti-microbial resistance is a rapidly growing problem and one that is going to hit future generations hard. Back in March this year, Margaret Chan, Director-General of the World Health Organization, highlighted in a speech to ECCMID (the European Congress of Clinical Microbiology and Infectious Diseases) the risk of the last 80 years of medical advances being wiped out in one stroke as bacterial diseases such as TB, and even E. coli infections, once again become fatal. Maternal death rates would rise, and even a grazed knee could put a toddler at risk of serious illness. Procedures such as hip replacements and organ transplants could not go ahead. Even something as commonplace as an appendectomy would become a high-risk procedure.

So how do we slow this advance? What other options do we have? There are three main directions of approach that are being taken.

Firstly, the antibiotics that we currently have must be used correctly. This means using them in the right combinations and in the right situations. Antibiotic resistance was first noted in the late 1950s, when the drugs were used in vast quantities to promote livestock growth. The Swann report of 1969 banned such use of drugs then involved in human medicine; however, a decade is more than enough time for plasmids coding for resistance-conferring enzymes to enter the gene pool, and the measures were seen more as "voluntary reforms" than an outright ban. Ways around the ban were quickly found as more antibiotics were discovered.

In recent months, however, the FDA has been accused of withdrawing proposals it made back in 1977 to ban the use of penicillin and tetracycline antibiotics in farm animals in the USA. In response to this accusation, a new ban on cephalosporin antibiotics was announced in January of this year. This ban affects only around 0.2% of the antibiotics currently used on American farms. In Europe, all-encompassing bans have stood for well over a decade.

In more recent years, GPs have reported patients demanding antibiotics for colds (which are caused by viruses, so antibiotics will, at best, have a placebo effect) and, more frequently, failing to finish a course of antibiotics because they felt better after only a few days. Both situations lead to increased levels of resistance in populations of bacteria. Here the responsibility lies with individuals to use their common sense and simply read the instructions on the packet; however, it is surprising how frequently people fail to do so.

An important focus in the correct use of antibiotics is using the right drug for the infection presenting; this means a vast improvement in many current diagnostic techniques. This is crucial in tackling super-bugs. The drugs used to treat such illnesses tend to have many toxic side effects, and you risk increasing antibiotic resistance in the bacterial strain if it is over-exposed to a less than fully effective antibiotic.

Currently, the average, well-supplied hospital in the developed world takes around 48 hours to fully identify a bacterial strain (samples must be taken, sent to a lab, grown up and then returned with information about the strain and appropriate drugs). A new approach, discussed at the ECCMID, was the promotion of “lab on a chip” technology (such as that already in use for AIDS or liver disease diagnosis), which would allow treatment to be tailored to the infection much more efficiently.

The second area is the development of new antibiotic drugs. In the four years between 2008 and 2012, only two new antibiotic drugs were approved for human use, whereas between 1983 and 1987 the FDA (the US Food and Drug Administration) approved 16 new antibiotics. The problem really lies (as it often does) in the money. A course of antibiotics is usually taken for a few weeks, whilst a drug to lower cholesterol or to increase longevity will be taken for life. The comparative profit in antibiotics is small.

A complete overhaul of the drugs industry would be ideal, whereby the financial rewards are based on the usefulness of the drugs rather than the quantity sold. This is unlikely. However, the pharmaceutical company GlaxoSmithKline announced last year that it is involved in a $40 million contract with the Biomedical Advanced Research and Development Authority and the Defense Threat Reduction Agency (two US government agencies). It seems that the potential threat of bioterrorism is enough to get people putting up the cash.

Finally, as always, it is the simple things that make the difference. Proper hand washing in hospitals, going back to the basics of regulated sterilizing techniques, has seen an 80% fall in cases of MRSA infection in NHS hospitals since the peak in 2003. Infection control had become lax due to over-reliance on the power of antimicrobial drugs.

Multi-drug resistance is not a problem that will go away; it will become a greater and greater issue for future generations. But many agencies and organisations are now working on solutions and new approaches to buy us more time for the next Fleming to revolutionise medicine again.


Published in The Yorker October 19th, 2012

Saturday 7 July 2012

Censorship in Science


Earlier this week, it came to light that Dr George Murray Levick, a scientist working with Captain Scott’s Antarctic expedition, had his observations of the unusual sex lives of Adelie penguins censored. This censorship was twofold, performed by Levick himself, by writing his findings in Ancient Greek, and by those who published his paper “The Natural History Of The Adelie Penguin”, by omitting those sections.

The reasons for this were simple: his observations and graphic descriptions of the abnormal sex acts occasionally performed by the younger penguin males were deemed too shocking for the Edwardian public. In contrast, the BBC and numerous other media outlets appear to feel that the modern-day public are desensitized enough to learn about penguin necrophilia. The original publication has almost become an ornithological Lady Chatterley’s Lover.

Considering the theme, censorship of scientific data and discovery is not a new phenomenon, be it by the church or, more recently, by the state. The most infamous example occurred in 1633, with Galileo and the Catholic Church. The story of Galileo Galilei and his support of Copernicus’ heliocentric model of the solar system is now taught in schools as an example of science proving blind faith wrong.

Today, the church, in all its forms, has long lost the power to suppress discoveries it disagrees with, and the internet has taken freedom of information to a level that even our parents’ generation would have struggled to predict. Nowadays, ideas and discoveries are not considered dangerous if they change a long-held view of the universe; astronomy is no longer a threat to any country’s stability. Instead, it is the fields of defence research that come under scrutiny.

The most recent examples of government censorship of science can be seen in the USA, as a result of heightened national security post-9/11. One of the most publicised occurred in 2011, when the National Science Advisory Board for Biosecurity (an American federal organisation) requested that key information on the methods behind the modification of a flu virus be removed from a scientific paper. The justification was born of the partially exaggerated fear that the information contained within might be used to create an extremely deadly strain for use as a powerful bioweapon.

Some may argue that, when it comes to censorship by the state, governments could be seen as justified in their attempts to keep discoveries and research in fields such as virology and nuclear weaponry quiet. Indeed, the line between defying a censor and committing treason can become very narrow indeed.

A noted demonstration of this occurred in 1953, when the issue of censorship ceased to be one of contemplation for two American communists. For Julius and Ethel Rosenberg, censorship had become an issue of life and death, as they became the first US citizens to be executed for espionage, after passing information about the construction of atomic bombs to the Soviet Union.

It would be naïve to assume that censorship is only performed by religious organisations or governments. In reality, the media also plays a part: the larger and more established scientific journals, such as Science and Nature, discreetly request a kind of confidentiality from all their contributors. It is essentially a subtle form of self-censorship. Many feel that information that could pose a threat to the general population should remain unpublished. In fact, this form of self-censorship is likely to have had much more of an effect than any government-mandated enforcement.

Upon reflection, history does appear to show us that, in every field of science and the arts, attempts at censorship will always eventually fail. Sometimes the truth is too big to be contained, sometimes people gossip too much to keep secrets. Eventually, the truth will out.


Published in Nouse, June 20 2012

Bioengineering: a brave new world?


It seems to be alarmingly socially acceptable to parade a lack of scientific or mathematical understanding, yet the same attitude towards literature and the arts is ridiculed. An arts student can comfortably say, “well, I barely passed my GCSE maths” when faced with dividing a restaurant bill, but an undergrad scientist claiming “I only got a D in English Literature” when stumped by a pub quiz question on the author of Romeo & Juliet is very brave indeed. So why is it acceptable to lose all scientific and mathematical know-how during the post-GCSE booze-up, whilst science students are required to have a general knowledge of the arts?

This subject is covered in great depth, and with greater degrees of anger, all over the internet, so I shall refrain from going into it further. Instead, I offer a possible degree of balance in providing a background understanding to the science seen in the media and in literature.

In 1932, Aldous Huxley published his seminal work, Brave New World. Generally filed under that grand umbrella title “science fiction”, it is a thought-provoking work of satirical fiction. Its dystopian future features developments in reproductive technology, brainwashing, sleep-learning, and the placation of the masses with the wonder drug soma.

But from a biological perspective, the stand-out technology Huxley introduces is the concept of bioengineering. We now live in an age where our ever-increasing knowledge of our genome and development could (and will) rapidly give us the ability to design the perfect children, the clones of lost loved ones and perhaps, eventually, the perfect workers.

So how does this work? Well, to start with, we can consider the processes necessary to create transgenic mice – mice whose genomes contain genetic information introduced artificially from another organism. These mice were first created nearly 40 years ago, in 1974. Transgenic animals are used to produce therapeutic drugs for humans; Tracy the Sheep (no relation to Dolly) was one of the first. Created in 1990, she produced a protein called AAT in her milk, thanks to the insertion of human DNA into her genome. This protein is used to treat patients suffering from cystic fibrosis. Tracy made it to the ripe old age of seven and inspired the cloning of Dolly.

One method of creating transgenic animals is the Embryonic Stem Cell Method. The foreign DNA is introduced to embryonic stem cells of the target species by processes such as electroporation (the cells are zapped with a quick burst of electricity which creates tiny holes, allowing the genetic material to enter) or by treatment with specific chemicals.

These stem cells are then inserted back into the blastocyst (a very early stage embryo), and then the whole thing is implanted into a foster mother and born normally.

The animal produced here is considered chimeric – think of the Greek myth, a creature made up of parts of different animals. Some of its cells will have accepted the foreign DNA; some will not. Conventional breeding techniques, which have been in use for hundreds of years, can then be used to produce offspring that are fully transgenic.

So what is the difference between creating a mouse or a sheep with foreign DNA and doing the same for a human? Could we make a human with an ape’s strength or, more excitingly, could we create glow-in-the-dark babies?

Frankly, from a purely biological point of view, there is very little difference between creating fluorescent puppies, as done in 2009 in South Korea by Byeong-Chun Lee, and doing the same with a human. Time and money are factors; it takes many more years and a great deal more lab space to raise a human to maturity than it does a mouse. Ethical concerns, on the other hand, play a greater role.

So are Huxley’s visions of production lines of children, each carefully specified to the role that they will play in society, unrealistic? Well, yes. Scientists are people too; we have the same ethical concerns as the rest of you. But is it impossible? Certainly not. Bioengineering is a rapidly growing field – and who knows, perhaps someone out there with enough funding really does want to create a Brave New World all for themselves.

And, as for the book? Well, I think it is a damn good read.


Published in Nouse, May 1 2012