Left-handed (DNA) compliment

While attending the surprise birthday party for a friend in the sciences, I noticed he owned a coffee mug from a group called Life Sciences Career Development Society based at the University of Toronto. Actually, to be more accurate, what caught my eye was the DNA double-helix on the mug…and the fact that it was wrong.

Wrong

BOOM! My head explodes.

As someone with biochemistry and genetics degrees who has written about the life sciences for ~15 years, I find that this seemingly innocuous error drives me crazy. This must be how copy editors feel when they see a misused semicolon (and I’m sorry about that).

right v wrong

For the non-biochemists, through a characteristic known as chirality (look it up…too hard to explain here), the typical DNA double-helix in human cells—the famous Watson-Crick-Franklin structure or B-DNA—has a right-handed helix. This means that if you could see the double-helix (above) and let your fingers follow along the curve of one side as it wraps around the other, you could only do this comfortably with your right hand.
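For the geometrically inclined, handedness can even be checked numerically. Here is a minimal Python sketch (the function names are mine, not from any library): a right-handed helix turns counterclockwise as it climbs its axis, so the sign of the in-plane turn multiplied by the sign of the climb gives the handedness.

```python
import math

def helix(handed=1, turns=2.0, n=200, pitch=1.0):
    """Sample points along a helix; handed=+1 right-handed, -1 left-handed."""
    pts = []
    for i in range(n):
        t = 2 * math.pi * turns * i / (n - 1)
        pts.append((math.cos(t), handed * math.sin(t), pitch * t / (2 * math.pi)))
    return pts

def handedness(pts):
    """+1 if the curve turns counterclockwise while climbing (right-handed), else -1."""
    (x0, y0, z0), (x1, y1, z1), (x2, y2, z2) = pts[0], pts[1], pts[2]
    d1 = (x1 - x0, y1 - y0, z1 - z0)   # first chord along the curve
    d2 = (x2 - x1, y2 - y1, z2 - z1)   # second chord along the curve
    cross_z = d1[0] * d2[1] - d1[1] * d2[0]   # turn direction in the xy-plane
    climb = d1[2]                             # direction of ascent along the axis
    return 1 if cross_z * climb > 0 else -1
```

Negating one coordinate (a mirror reflection) flips the result, while flipping the helix end over end does not, which is exactly the chirality argument above.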

You can see that in an example of a single helix that curves either to the left (left) or the right (right).

handed

Unfortunately, somewhere in the mists of science-illustration history, an artist drew the double-helix of DNA in the wrong direction, and that double-helical abomination has been perpetuated ad nauseam, including, irritatingly enough, in science publications and scientific logos such as a recent health article in the Toronto Star (image below), the cover of a sci-fi novel and the LSCDS logo.

DNA - backwards

And if I only saw this in 50% of the images, I’d be irritated but probably not the raving loony I am now. But no, this freak of nature is everywhere. The correct right-handed DNA illustration is the anomaly.

So why does this matter? Maybe it doesn’t, and I am just a raving loony.

Certifiable loony!


But for me, I struggle with what else might be wrong in a life science or healthcare story if they can’t get the DNA helix right. Do I trust the math of an investment planner who doesn’t know that four quarters equals one dollar?

Luckily, the fix is an easy one. Simply mirror the image you have (not turn it upside down) and the left-handed double-helix becomes a right-handed one. (In a mirror, your actual left hand looks like a right hand.)
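To see why mirroring (and not flipping upside down) is the right repair, here is a minimal sketch, assuming an image is just a grid of pixel rows. A diagonal stroke, like one strand of a drawn helix, survives a 180-degree rotation unchanged but reverses direction under a mirror.

```python
def mirror(img):
    """Horizontal mirror (the recommended fix): reverse each pixel row."""
    return [row[::-1] for row in img]

def upside_down(img):
    """180-degree rotation (the warned-against move): reverse row order AND each row."""
    return [row[::-1] for row in img[::-1]]

# A tiny 'image' of a diagonal stroke; 1 = ink, 0 = background.
strand = [
    [1, 0, 0],
    [0, 1, 0],
    [0, 0, 1],
]
```

Run it and the diagonal comes back unchanged from `upside_down` but swapped from `mirror`, which is why turning a left-handed helix upside down leaves it left-handed while mirroring makes it right-handed.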

So, medical illustrators everywhere, please fix your catalogues of scientific illustrations. My head can’t take too many more explosions.

So much better

Nerves already calming.

[Also, from a design perspective, up and to the right is more palatable for North American audiences as it implies future success (e.g., rising value). Has nothing to do with handedness, here.]

For the more biochemically savvy:

Yes, there is left-handed DNA, which is also known as Z-DNA. But as you can see from the illustration below, its structure is significantly different from the smooth-flowing curves of B-DNA (and the equally unusual A-DNA).

Two rights don't make a left


Off the pedestal, Western medicine

hubris

Western medicine can be a smug son-of-a-bitch. Seriously.

Now, it would be unfair to lump this attitude on all practitioners of Western medicine, but I haven’t the time to survey all of its adherents and gauge individual opinions so that I can name names of those who are the smug bastards and those who believe in thoughtful open-minded consideration.

To provide some context, I have a B.Sc. in molecular biology and an M.Sc. in medical genetics, and have written about the latest biotechnical and biomedical advances for about 15 years. I have also written about Western medicine for about 7 years.

Given this background, it may seem odd to some that I am writing a complaint about the attitudes of Western medicine, but what may not be as obvious about that background is the amount of hubris and self-satisfaction I have seen in questionable practices with limited benefits.

Old wisdom isn't useless because it is old


Recently, there was an article in New Scientist magazine that described the rediscovery of a possible treatment against superbugs (e.g., MRSA), a therapy chronicled in an Anglo-Saxon era manuscript. The roughly 1000-year-old remedy is being studied in a modern lab and early results suggest that it may prove effective against the bugs that threaten modern lives on a weekly basis.

(BTW, there is a thousand miles between early results and coming to a pharmacy near you.)

But what struck me most was the response to the findings in various media, which bordered on shock and awe that something relevant to today could come from such an ancient source. Even CBC’s The National (Canada’s national news broadcast) commented that the discovery came from an era when leeches were considered good medicine.

Which leads me to scream:

Science wasn’t invented in the 20th century, people.

The grand assumption seems to be that anything that happened in medicine before the First World War was complete voodoo and not worthy of consideration in an era of rational thought.

Everyone involved in health remedies before the modern medical era was either a charlatan or a moron, and either way was dangerous to the people around them. The human capacity for sober scientific enquiry did not occur until shortly after the invention of the Erlenmeyer flask, the spectrometer and the harnessing of the X-ray.

I call bullshit.

If you can grind it or infuse it, you can medicate with it


Folkloric medicines are based on scientific inquiry by people without test tubes and spectrometers. The approach may have been less statistical in nature, but everyone from apothecaries to shamans (shamen?) ran clinical trials the old-fashioned way.

Take this. Do you feel better? Great. It’s a keeper. Did you die? Yes. Nuts, try something else on the next guy.

Having actually looked at modern clinical trials, the only differences between then and now are the test patient population size and the accounting of the results. And I don’t know that we can say definitively that these parameters have improved things.

I am not advocating that we discard modern medicine—it has merit—but rather that it must get off its high horse and approach historical medicine with an open mind so that more rediscoveries like this latest one can happen and be tested.

TCM has worked for millennia


China has about 20% of the planet’s population, so there might be something to Traditional Chinese Medicines (TCMs). The same goes for India and folkloric Indian medicines (FIMs). Or Anglo-Saxons or Sumerians or the indigenous peoples of the Americas. These people were not morons.

Our ignorance and outright hubris are a hangover of the Age of Reason, as we dismiss everything that came before because it was often presented in raiments of spirits and ritual.

Modern does not guarantee success


We should not let our fascination with the instrumentational bells and whistles of the modern scientific method blind us to the wonders of the not-so-modern scientific method, which lacked in instrumentation but not in knowledge and understanding.

Before you blithely dismiss something as troglodyte quackery, perhaps you should ask yourself:

What would Hippocrates do?

And as to the CBC’s comment about the era of leeches, both leeches and maggots have a long history up to this day of facilitating health in people (see Leeches and Maggots).

Tar Pits – Los Angeles

And when I wasn’t looking at signs, one of the few touristy things I did in Los Angeles was visit the La Brea Tar Pits and the George C. Page Museum.

See also: Graffiti and Signs – Los Angeles

Showing concern

One of the myriad gulls sharing the local boardwalk


There are truly good people yet in the world.

As some of you know, I am going through a bit of a problem with one of my shoulders (a condition with the stupid name frozen shoulder).

While wandering the boardwalk near my apartment earlier today, I absent-mindedly tossed an acorn at a bench (not a euphemism, folks) and immediately doubled up in searing pain, grabbing my arm and shoulder, and plopped on the bench to wait for the pain to subside. It did…it always does.

Ill-named condition involving loss of range of motion


But as I was getting up to finish the trip home, two cyclists stopped to make sure I was okay. They had seen me grab my arm and drop to the bench. It probably looked like a heart attack or seizure.

I explained the affliction and that the pain was mostly due to my unthinking idiocy, which seemed to allay their concerns. I thanked them, however, for checking on me and making sure I wasn’t in more serious trouble.

Nice to know that I’m never alone…I only hope I show the same concern should I be presented with something similar.

Loved the mood captured by the street lamp


Why models fail us in childhood, on TV and in drug discovery

I know you’re used to me babbling in these pages, but I am sure two of you have wondered, so how does he pay his bills? I know my landlord wonders that.
Below, I have reprinted my latest commentary from DDNews, a magazine for which I write regularly and for which, to my great surprise, the Publishers pay me. Thought it might make a nice change of pace for those tired of the current pace.
DDnews
For those of you who read the article, there is a bonus at the bottom (worst fruit-in-yogurt tagline, EVAR!) 
For a brief period of my childhood, I dabbled in model airplanes and model ships. And by “dabbled,” I mean I spent an inordinate amount of time with my fingers glued together. But aside from the medical agonies of modern chemistry, what struck me most about the exercise was how pale an approximation these models were of the real thing—about the only thing my miniature Spitfire had in common with the WWII fighter was the sheer carnage of the plane as it went from airborne to groundborne in its flight across the room.
More recently, my fascinations have turned to models of a more human variety, such as those found on the catwalks of America’s Top Model. (Let’s face it, after DDNews, I am all about the latest issue of Vogue.) And like the plastic variety described above, these models seem to be at best a glittery approximation of the actual thing.
It’s often hard to believe that I share nearly 100 percent DNA sequence identity with these mystical creatures. Now, 98 percent sequence identity with bonobos, I have no problem believing.
All this to say that we are constantly surrounded by models that are poor facsimiles of the real deal upon closer inspection. So, it should probably come as no surprise that the same is true in medicine and drug discovery.
Last week, I read a story in the newspaper that touted the life-prolonging properties of the diabetes drug metformin: a regular fountain of youth, the headlines implied. And yet, after coursing through the article, I eventually discovered that the rejuvenation experiments were performed on the worm C. elegans, which was shocking for two reasons.
One, I did not realize that there were largely untapped market opportunities in nematode diabetes. And two, the life-extending implications were being made based on a species that wasn’t even a chordate, let alone a mammal.
Now, I appreciate that this is an extreme example, probably overhyped by an eager press officer, but the literature is rife with examples of models that completely failed to live up to expectations when researchers tried to match success in model systems with success in actual humans.
As oncology god Judah Folkman once mused, we have become really good at curing cancer in mice.
Part of the challenge, I think, is that because we cannot experiment directly on humans, or at least not within the editorial reach of DDNews, researchers are often forced to study new compounds or therapeutic modalities on approximations of approximations of approximations of human disease.
We don’t study the impact of irinotecan on human colorectal cancer but rather extrapolate from its effects on an induced form (approximation 1) of murine colon cancer (approximation 2) in mice (approximation 3). Or we study new biologics against a chemically induced inflammation in dogs that bears a passing resemblance to rheumatoid arthritis in humans.
Within the realm of in-vitro models, the advent of technologies like 3D cell culture and microtissues is adding some biological context back into the completely artificial realm of 2D cultures of immortalized cell lines. (For more on this, see the special report “Life moves on” in this issue, on page 21). But in the absence of factors such as tissue vascularization and the like, even these advances result in weak approximations.
The goal of a better, more representative model may be getting a step or two closer, however, with the help of stem cells.
As we’ve reported previously in these pages, and as I am presently hearing at the ISSCR conference in Vancouver, stem cells are giving us enhanced opportunities to study human disease generated from the source material—human cells—potentially down to the scale of the individual patient.
The standard technical limitations of in-vitro analysis hold for stem cells—a lump of microtissue in a microwell dish does not a micro-human make—but we do have the opportunity to limit one or two approximations.
At ISSCR, for example, Daniela Cornacchia and colleagues at Sloan-Kettering and Weill Cornell Medical College describe their efforts to understand the inadvertent de-aging of cells transformed into iPSCs. Even when taken from older patients, the reversion process makes it difficult to use the cells to study late-onset diseases. The group is trying to identify factors that will allow them to induce natural aging into these cultures to improve models of such diseases as dementia.
Similarly, Rohan Nadkarni and Carlos Pilquil of McMaster University are endeavoring to produce 3D lung tissue from iPSCs that contain both conducting and gas-exchange zones mimicking normal lung function. If the model bears out, it may provide an even more realistic platform to study respiratory diseases in vitro.
Until we are in a position where we can do high-throughput human screening—in a 96-well cube farm, perhaps—the search for better model systems must be a priority. And given the challenges of translating preclinical success into clinical success, perhaps it should be a higher priority than the development of new therapeutics.

Look for more on this topic in a special feature on disease modeling in the November issue of DDNews.

Added bonus for blog readers:
#1 on the left photo and #2 on the right photo


Ergo ego

(Property of evolution.berkeley.edu)


Be as egocentric as you want, but always remember: You were one point mutation away from being somebody else.

(PS To the genetics nerds out there, I know they messed up the “original” strands in this diagram as the originals from different strands cannot be identical but rather should be complementary.)
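For illustration, the complement relationship is easy to sketch in a few lines of Python (a toy, ignoring ambiguity codes and lowercase): because the two strands run antiparallel, the partner of a strand is its reversed complement, never an identical copy.

```python
# Watson-Crick base pairing: A pairs with T, G pairs with C.
COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(strand):
    """The partner strand runs antiparallel, so complement each base AND reverse."""
    return "".join(COMPLEMENT[base] for base in reversed(strand))
```

For example, the partner of `ATGCCT` is `AGGCAT`, and taking the reverse complement twice returns the original strand, which is why the two "original" strands in a replication diagram should never be drawn identical.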

Write what you—No!


The adage I hear a lot in writing circles and books is “Write what you know”. By that, people mean write about the things for which you have a passion, because that passion for the subject will shine through your writing and become infectious to your reader or viewer.

To a large extent, I agree with this sentiment, but I think there has to be a codicil: when you know how to write, write what you know.

Let me explain with an anecdote.

When I first started writing, I was coming out of a career as a biochemistry researcher who had spent the bulk of my training in protein biochemistry and genetics. That was where my passion lay. So, perhaps unsurprisingly, when I decided to become a science writer, I focused much of my initial energies on writing about protein biochemistry. I understood the science; I could see the story quickly; I could write about it with some fluency.

Unfortunately, despite or perhaps because of my passion and fluency, I was completely unreadable to anyone who wasn’t already a protein biochemist. I wasn’t speaking to my audience in terms they could understand but rather in terms I could understand. To a greater or lesser extent, I might as well have been writing in Klingon, which I suspect would have given me a broader audience.

When I finally realized what was happening—thanks to all of the people who beat me about the head—I made a pact with myself. Until I felt that I could really tell a story, I would do my best to avoid writing about the subjects in which I was most expert. I had to become my audience: the relative non-experts.

About a year into writing about topics I had to research and for which I had to ask potentially stupid questions, my writing had matured to the point where I could go back to my area of expertise and approach it in the same way. I had finally arrived.

I think the same holds true for any kind of writing, whether news, novels, screenplays or blogs.

Until you are capable of telling a story that your audience can decipher, and more importantly wants to read, you are probably better served to stay away from the topics you know best. To do otherwise means running the risk that you will leave out the “obvious” and the “well, yeah” that you know in your bones, but that could be vitally important to an audience member trying to understand why certain facts or behaviours in your story exist.

Give yourself—and by extension, your audience—a chance to learn your story, to experience it at a visceral level. As you develop your story, you’ll likely find yourself asking questions of your plot or characters that your audience would ask. You want your audience to think, but you never want them to have to research. Until your work becomes part of a school curriculum, it shouldn’t require a study guide.

It is easier to remove the truly superfluous common knowledge or understanding later than it is to convince yourself that the information you need to add isn’t common knowledge.

When you are ready to tackle it, the subject(s) for which you have passion will still be there. Consider them the reward for all the hard work you did up front.

In a future post, I hope to discuss a flavour of this topic: “Based on a true story”.