Auto(populate)bots Assemble!


Reports suggest that within the next two decades, robots will take responsibility for upward of 50% of jobs currently handled by humans, who simply remain inefficient and cannot work 24 hours a day.

Already, we are seeing automation in manufacturing and order-entry kiosks in the services industry. It is assumed the next stage will involve a complete takeover of the social media industry, with timelines auto-filled by trivia and rumour bots.

This post brought to you by the Xorblat 3-11A

Photo stolen without permission from The Robot State blog.

For a legitimate discussion of such topics, check out The NeoHuman Blog.

Malnourished with malinformation

I’d argue that any amount of knowledge is a good thing. It is a little bit of information that is likely to trip you up.

As many of you know, I am a science and medicine writer in another life—the more lucrative one, but that’s not saying much—and so I spend many of my days immersed in the worlds of scientific and medical discoveries and blunders. I even spent several years working at a biochemistry bench as a scientist—you may genuflect now—so I know the world of which I speak.

For this reason, I tend to view science and medicine as a work-in-progress, as so much noise with moments of signal. Rarely do I herald the hype, and just as rarely do I despair over the bumps.

To my friends who see every announcement as a breakthrough, I am a cynic. And likewise, to everyone who pounces on every setback as evidence of mass conspiracy, I am a complicit shill. Whatever.

The challenge comes when I engage in a discussion of the topic du jour, because more often than not, the person with whom I am talking is adamant that they know the truth. They are empowered by something they have heard or read from a renowned expert. They have information.

(Let me state here that I do not believe that I am the holder of all truths. I do feel, however, that I have a good handle on what I do not know, and just as importantly, what is not yet known for certain.)

So, let’s start with some definitions (purely mine) of information types:

Information: A collection of facts about a subject upon which someone can formulate a testable theory or postulate a conjecture.

Misinformation: Incorrect declarations that potentially lead one to false conclusions.

Disinformation: Knowingly false declarations for the purposes of misleading another group (e.g., counter-espionage, propaganda).


I suggest, however, that we need another category to address the shade of grey between the positives of information and the negatives of mis- and disinformation.

Following the model of nutrition versus starvation, I propose we call this new category malinformation, with the following definition:

Malinformation: A collection of facts that, while true, is insufficient to formulate a definitive conclusion without the support of further facts.

Just as a malnourished person is not starving, but rather suffers the effects of an insufficient blend and quantity of nutrients to experience balanced health, a malinformed individual is not wrong per se, but rather suffers the effects of an insufficient blend and quantity of facts to experience balanced knowledge and understanding.

For example, consider people who change their eating habits because they read about a single study showing that a specific food extract reduced tumour size in mice. Or a clinician who creates a behavioural modification program to reduce addiction based on a thought exercise drawing on largely unrelated studies.

Decisions like these are based on legitimate data from legitimate studies, but they often ignore (or simply don’t look for) alternative and possibly conflicting data from equally legitimate studies. Rather than analyze all of the available data before generating a theory, people find the malinformation that supports their beliefs and then stop; a little bit of data is taken to conclusions it simply cannot support.


Maybe they’re right. But more likely, it is much more complicated.

In conversation, I find the malinformed much more intractable than the ill-informed. With the latter, there is a chance you can correct the misinformation. With the former, however, the mere fact that the malinformation is correct seems to be sufficient cause for them to defend the castle they have built in the sky. When you “yes, but” them, all they tend to hear is the “yes”.

In fairness, all information is technically malinformation, as we will never have access to the complete knowledge of the universe. We are always going to be forced to make decisions based on limited knowledge.

But where more knowledge is available, I think there is a duty to examine and understand it before becoming intractable in our positions.

If there is a newspaper article about a new scientific discovery, efforts should be made to learn more about the limitations of the science that led to the discovery. How far can you realistically extrapolate from those few data points?

In biomedical research, that which occurs in a mouse is, at best, a clue to what might happen in a human. Nothing more.

It could lead to the next step in scientific inquiry—the actual purpose of science—or to a dead end.

Belief is nice, but unless that belief is well founded on broad and balanced information, it is limiting and might be dangerous.

(Or at least, as far as I know based on my understanding of the available information.)

Toronto Marlies crunch Syracuse

As the New Year arrived, my beloved Toronto Marlies played host to their cross-border rivals, the Syracuse Crunch.

The New York team is well named, with a history of pounding their opponents physically if not always on the scoreboard…and the local boys were ready to give as good as they got.

Saturday, January 2: Toronto 3 – Syracuse 2 (YouTube highlights)

Sunday, January 3: Toronto 3 – Syracuse 2 (OT) (YouTube highlights)

The hard-hitting series sets up an interesting three-quel when Syracuse visits Toronto’s Ricoh Coliseum yet again on Wednesday, January 20.

Substance over volume


When you meet someone who does not speak your language, the clichéd response is to talk louder to make yourself understood. There is something within many of us that says if we simply pump up the volume, we can overcome the disconnect.

A couple of months ago, Tufts University released their latest estimates for the average cost of developing a new drug: $2.6 billion (I’ve seen estimates up to $5 billion). Eleven years ago, the same group calculated the costs at $0.8 billion.

Now, every time these estimates arise, the hand-wringing begins over how the costs were calculated, which factors make sense and which are over-reaching. What no one seems to argue, however, is that drugs are less expensive to develop today than they were a decade ago.
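Taken at face value (a back-of-the-envelope sketch, assuming the two Tufts figures are directly comparable, which is my assumption rather than anything the group claims), the numbers imply that nominal development costs more than tripled over that stretch, or grew by roughly 11 percent per year, compounded:

```python
# Back-of-the-envelope sketch using the two Tufts estimates quoted above.
# Figures are nominal dollars; no inflation adjustment or methodology
# differences between the two studies are accounted for.
old_cost = 0.8e9   # estimate from eleven years earlier
new_cost = 2.6e9   # latest estimate
years = 11

growth_factor = new_cost / old_cost
cagr = growth_factor ** (1 / years) - 1

print(f"Cost multiple: {growth_factor:.2f}x")           # ~3.25x
print(f"Implied annual growth: {cagr:.1%} per year")    # ~11.3% per year
```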

So what has this to do with speaking louder?

The same period has seen amazing technological achievements designed to facilitate and accelerate drug discovery and development.

Combinatorial chemistry was heralded as a way to expand compound libraries from hundreds to hundreds of thousands. High-throughput and high-content screening, as well as miniaturization and automation, were lauded as ways to screen all of these compounds faster under the paradigm of “fail early, fail often”. And given the masses of data these technologies would churn out, the informatics revolution was supposed to convert data into knowledge and knowledge into healthcare.

And yet, for all of these improvements in throughput, I question whether we have seen much improvement in the number or quality of drugs being produced. We certainly haven’t made them less expensive.

Please understand, I don’t find any fault with the technologies themselves. These are truly marvels of engineering. Rather, I question the applications and expectations of the technologies.

Almost two years ago, GSK CEO Andrew Witty told a London healthcare conference: “It’s entirely achievable that we can improve the efficiency of the industry and pass that forward in terms of reduced prices.”

The pivotal question here, I believe, is how one defines efficiency.

I wonder how many people simply felt economies-of-scale would improve discovery, much as mass production made Henry Ford a rich man. But drugs are not cars, and where throughput and scale make sense when you have a fully characterized end product, they have their limitations during exploration.

When I was a protein biochemist in an NMR structural biology lab, I spent some time trying to wrap my head around two concepts: precision and accuracy. A 3-Å protein structure is very precise, but if the structure isn’t truly reflective of what happens in nature, it is meaningless. A 30-Å protein structure is much less precise, but if it is more accurate, more in tune with nature, then it is likely more useful.

By comparison, I wonder if our zeal to equate efficiency with throughput hasn’t improved our precision at the cost of our accuracy. If you ask the wrong question, all of the throughput in the world won’t get you closer to the right answer.
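To put the same distinction in numbers rather than ångströms, here is a toy simulation (purely illustrative, with made-up values that have nothing to do with any real assay): piling on more measurements sharpens precision, but it cannot rescue an inaccurate, biased measurement.

```python
import random
from statistics import mean

random.seed(42)
TRUE_VALUE = 10.0

# "Precise but inaccurate": tight spread, but systematically biased high.
precise_biased = [TRUE_VALUE + 3.0 + random.gauss(0, 0.1) for _ in range(10_000)]

# "Accurate but imprecise": noisy, but centred on the true value.
accurate_noisy = [TRUE_VALUE + random.gauss(0, 3.0) for _ in range(10_000)]

print(f"precise but biased : mean = {mean(precise_biased):.2f}")  # ~13.0, never 10
print(f"accurate but noisy : mean = {mean(accurate_noisy):.2f}")  # ~10.0
# More throughput (more samples) shrinks the noise in the second case,
# but no amount of extra data pulls the first estimate toward the truth.
```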

In researching the DDNews Special Reports over the last couple of years, I have spoken at length to several pharma and biotech specialists about this topic, and many feel that the industrialization of drug discovery and development has underwhelmed if not outright failed. Several have suggested it is time to step back and learn to ask better questions of our technologies.

But getting back to the costs issue.

I know many will rightly point out that the largest expense comes from clinical trials. To address this challenge, new technologies and methodologies are being developed to get the most useful information out of the smallest patient populations.

Here again, however, no one segment of the drug development process stands in isolation, and I think back to the compounds reaching the clinic and question the expense of incremental improvements.

Oncolytics CEO Brad Thompson discussed the challenge in Cancer in the Clinic (June 2014 DDNews).

“If you could double [overall survival], you could show that in a couple of hundred patients. If you want to do a 10-percent improvement, you’re talking thousands of patients to do it to the statistical level that everybody would prefer to see. How do you run a study like that?”
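A rough sample-size calculation shows why the arithmetic plays out that way. The response rates below are hypothetical values I’ve picked purely for illustration (Thompson was speaking about overall survival, a different endpoint), but the scaling is the point: the smaller the improvement you hope to detect, the faster the required patient count explodes.

```python
from statistics import NormalDist

def patients_per_arm(p_control, p_treatment, alpha=0.05, power=0.80):
    """Approximate patients per arm needed to detect a difference between
    two response rates (standard normal-approximation sample-size formula)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)
    z_beta = z.inv_cdf(power)
    variance = p_control * (1 - p_control) + p_treatment * (1 - p_treatment)
    return (z_alpha + z_beta) ** 2 * variance / (p_control - p_treatment) ** 2

# Hypothetical 30% baseline response rate, chosen purely for illustration.
print(round(patients_per_arm(0.30, 0.60)))  # doubling the rate: roughly 40 per arm
print(round(patients_per_arm(0.30, 0.33)))  # a 10% relative bump: several thousand per arm
```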

That is a huge difference in financial expenditure, and it raises the question: is an efficacy improvement of just 10 percent worth it?

From an individual patient perspective, assuredly. From a pharmacoeconomic perspective, maybe not, particularly with the growing prevalence of high-cost targeted biologics. Maybe we need to aim for bigger improvements before moving candidates forward, a decision that is made long before the clinic.

Again, I’m not placing blame. The history of any industry is filled with experimentation in different methodologies and technologies. Everyone involved had the best of intentions.

But after a couple of decades of middling results, perhaps it is time to question how and when many of these advancements are applied. Simply yelling at a higher volume doesn’t seem to be enough.

[This piece was originally published in the January 2015 issue of DDNews. A lot has happened in the year since, including some amazing results in the field of immuno-oncology that might just address the demand for high-performance treatments even if only for a select patient population. For more on that, see my June 2015 Special Report “Body, heal thyself”.]