Sensationalism in science reporting

Being British, I have a certain reverence for the BBC, which was arguably at some point in history the finest news organization in the world. That’s certainly no longer the case, especially in science reporting.

Being a cardiac biologist, my interest was piqued by the following headline… “Mini hearts grown to study disease”, but the story itself is less actual science reporting, and more lab/university PR.  Such material does not science news make. A better headline would be “Dude from obscure Scottish university figures out how to do something already done”.

The first red flag that this news piece is nothing more than PR is the absence of any link to a report of the underlying science in a peer-reviewed journal. The most fundamental bar for something being newsworthy in science is journal publication. How did this even get past the BBC proofreaders?

Well, maybe the link to the authors’ University webpage* can help? (oddly it was put at the bottom of the piece, not embedded in the text of the article itself).  Nope, nothing about cardiac stem cells there.**

What about PubMed?  Nope. 5 papers published since 2007, none of them about hearts.

Then there’s this statement… “They are indeed human cells, which physiologically are the same as human hearts, in this case the size does not matter”. Ever heard of pre-load?  Afterload? Frank-Starling? Oxygen tension?  Does anyone in the BBC science department even understand the meaning of the word “physiology”?  Resisting the temptation to make a puerile joke that size DOES matter, let’s just say a ball of cells is about as far removed from a living, beating human heart as a naked mole rat penis.

The kicker is this tag-line at the end… “We can work now, in one experiment, with 1,000 human hearts and test large amounts of compounds, which you can’t do in animals”.  Leaving aside the fact that you actually CAN do this in animals (you just need 1,000 animals, which isn’t a lot for a mouse lab), the key word here is “now”.  NOW we can do this.  Before, we couldn’t.  Everything hinges on novelty, but as the link above to the pioneering work of Chuck Murry shows (as does a quick search for “heart on a chip”), this is far from novel.

None of the above is to detract from the actual work of Dr. Zhelev. I’ve never met him, and his work is probably awesome. Hey, he got cardiac stem cells to grow in a dish, which is more than I’ve ever done with stem cells, so hats off to him!  But please, BBC science news, try to recognize when you’re being used for blatant PR with zero underlying content.  As a scientist, I like a little more meat behind my science reading.

_______________

*Who even knew there was such a thing as the University of Abertay? I come from the other end of the Sceptred Isle and had never heard of it before. I’m sure it’s a great place, but maybe someone in their science PR office needs to lay off the caffeine for a while.

**Lordy! Those are some freaky cold lookin’ folks at the top of the page. The dude in the middle looks like he just got out of a knife fight. That one with the big eye in the magnifier is giving me the fear!

 

Administrative Bloat

This is not so much a rant, but a post intended to give some idea of the enormous amount of administrative burden a typical PI at a research university has to deal with every day. For the first 3 days of this week I accomplished absolutely zero science whatsoever – no experiments, no paper writing, no data analysis, nothing. Instead, I spent 3 solid days (8am-5:30pm) trying to rid my email in-box of administrative tasks…

  • Wrote a letter of recommendation for someone I’ve only ever met once and who didn’t send me her CV when requesting the letter. I typically do at least 2 such letters a month.
  • Submitted 2 animal protocol modifications via what is quite possibly the most arcane online system imaginable (Granite/Topaz – only works in IE, frequently crashes or just doesn’t load at all).
  • Submitted protocols again when they came back with questions. Then submitted them a 3rd time just now.
  • Bought plane tickets for a visiting lectureship I have to do next week. The admin’ assistant at the other university had left, and her replacement didn’t contact me to arrange anything. I still haven’t received any itinerary or idea of who I’m meeting with during my visit. They want me to take a cab from the airport to the hotel when I arrive at 1pm (seriously, there’s no-one available to meet me?)
  • Compiled PowerPoint slides for the above.
  • Processed paperwork for getting student reimbursed for a trip he made to a collaborator’s lab. Apparently students have to use a different form than the one everyone else uses. Does it say “not for students” on the form we used first time? Like hell it does.
  • Drafted 2 thank-you letters for the reviewers and readers of a grant panel that I co-chair for a charity.
  • Submitted an NIH progress report via the delightful new RPPR system. This is the replacement for eSNAP, and hey why use a simple 2 page form when 7 pages with multiple tabs will do?
  • Related to this, 2 weeks ago I submitted manuscripts via the NIHMS system, to make sure they’re in compliance with the public access mandate. Apparently it takes up to 6 weeks for each manuscript to be assigned a PubMed Central ID. The business admin’ at my institution contacted me at 5:30pm the night before the progress report deadline, claiming they could not submit the report due to non-compliance. Some advance notice of this might have helped! (Thankfully they submitted anyway, but now there’ll be a delay in issuance of the NOA).  All told, over the past 2 weeks this progress report involved >30 emails back and forth.
  • Just to confuse matters further, the NIHMS website, the RPPR site, and the MyNCBI site (used to enter papers into NIHMS), all use the exact same login as NIH eRA commons. As such, at one point I was logged into all 4 websites simultaneously with the same login ID. If this isn’t a major security flaw I don’t know what is. Could they make it any more complicated?
  • Had an annual evaluation meeting with my lab’ tech. This year Human Resources replaced the simple 1 page form with a new 6-page PDF, full of really insightful questions (Q. How does the employee behave toward patients and customers? A. They don’t because we’re researchers with no patient contact!  Q. How does the employee add value to the institution’s mission? A. Maybe if the institution didn’t re-write their mission statement every 6 months I’d be able to answer).  The PDF “lost” some information during submission (different versions of Acrobat), so my Dept’ admin’ had to print it out and bring it to me to fill in the missing info’ and sign it manually.
  • In addition to the 6 page forms (one each for me and the employee), there was a job description form, an education record, a HIPAA statement, an “ICARE” employee values commitment statement, and an “age-specific competency statement”.  6 items in total, more than half of which were about patient care duties, for a research lab’ tech with no patient contact whatsoever.
  • To get it out of the way (to make room for more admin!), I mistakenly decided to do my Mandatory In-Service exam. This is something all employees have to do to prove they understand various rules (what “code blue/pink/orange” on the intercom means, etc). The course is administered in BlackBoard (vomit). The first part of the test is an assessment of roles to determine which test you have to take. Despite qualifying as having no patient contact, my test (65 questions, 58 correct required to pass) included questions about wearing lifting gear to transfer heavy patients, how to deal with patients with guns, how to find an interpreter, etc. Who designs these things? I passed, but seriously what a waste of everyone’s time.
  • Until recently I was paid 5% off a grant that involved human subjects research, and even though I had no involvement in that part of the project, all members of the team were required to have human subjects protection training. I’m no longer involved with the project, but got a notice to renew my certification last week. The test is administered via a 3rd party (CITI). Given that I hadn’t done the test in about 4 years, the first step was to recover my password for the site.  1st attempt – nothing after 15 minutes. 2nd attempt – nothing. 3rd attempt – bingo, there’s the email!  Reset my password and took the test. It was littered with grammatical errors, dead links, and obtusely worded questions. I had to stop half-way to attend to something else. At 11pm (6 hours after first logging in), I got 2 more emails for password reset. When I tried logging in the next day it wouldn’t let me (because of the password reset requests – doh!) so I had to reset twice more and then resume the test – quelle surprise – some of my answers had been lost.
  • Processed paperwork for getting RAP’s professional society membership refunded from a discretionary account. But the account was overdrawn…
  • Last fall an equipment purchase resulted in the account being charged twice and going $5k overdrawn. Finally (5 months after being requested!) accounts payable got their act together and issued the refund, so now that the account is in the black, it can be un-frozen and I can actually spend from it.
  • Reviewed 2 de novo manuscripts for journals, and a 3rd that was a resubmission.

There was a bunch of other stuff interspersed too – follow ups from lab’ safety inspection, scheduling of June NIH study section and assignment of proposals to review, lots of dealing with journal editors in my ongoing activities in the area of scientific integrity, plus minor stuff like responding to emails, signing bits of paper.

But where’s the science?  During the past 3 days (well actually a week because I was out of town reviewing grants Thu/Fri last week) I’ve done nothing that I’m actually paid to do. No experiments at the bench, no reading papers (except on the bus), no discussing data with my lab, no thinking about future ideas for experiments, no writing exam questions or updating teaching materials.  Just hour after hour of mind-numbing attempts to get my email inbox cleared of this constant barrage of administrative clutter…  If it’s not paper review requests then it’s grant reviews, travel arrangements, departmental paperwork, grant submissions, society work, personnel and HR stuff, committees, legal/compliance/oversight paperwork, newfangled forms, accounting errors, chasing up lost orders, etc. It never ends. Never.

I did it in the end (got down to 3 emails, all of them from collaborators and talking actual science). This catharsis is supposed to make me feel accomplished, serene, refreshed, ready to do and talk science. But instead all I feel is anger at the wasted time, depressed at how this is going to happen all over again in a few days. Administrative bloat is sucking my will to live. It is slowly killing me and everyone around me, one email at a time.

P.S. required reading for anyone enraged by this topic

Is it spring yet?

It’s been a long and cold winter in Rochester, but there are still lots of interesting things going on in the lab’, so we don’t have to think about the weather too much…

- Jimmy Zhang (MD/PhD student who rotated in the lab’ last year) has decided to come back and pursue his PhD with us, starting this June.

- Marcin Karcz, a resident in the Department of Anesthesiology, wrote a fellowship grant proposal to FAER (The Foundation for Anesthesia Education and Research), so if that gets funded then he’ll be joining us for a year of research starting this July.

- Isaac Fisher, a Pharm/Phys graduate student, is just now completing his rotation project, doing some drug screening for regulators of cardiac metabolism.

- Paul’s paper entitled “Internet publicity of data problems in the bioscience literature correlates with enhanced corrective action” was accepted at PeerJ, so it should be in press by the end of March. The title pretty much says it all, but I’ll post more details once it’s out.

- Paul gave an interview to Science Careers, about how to walk the fine line between being a whistle-blower and maintaining an academic career.

- Andrew Wojtovich resubmitted his K99/R00 application, and Chad Galloway resubmitted his K01 application. Also Andrew’s review on optogenetic control of ROS came out in the new SFRBM open access journal Redox Biology.

- On March 12th URMC is hosting Nicholas Steneck, a guru in the area of research integrity (having been involved in the establishment of ORI guidelines on responsible conduct of research), for the annual bioethics visiting lectureship. This is part of a one day CTSI workshop about ethics in translational research.

- Graduate student Owen Smith is off to Yale for a week, to work with our collaborator ‘Liz Jonas on mitochondrial ion channels.

- Most of the lab will be headed to Michigan in May, for the wedding of former graduate student David Hoffman, who now works at Cayman Chemical.

- Now if we could just understand our metabolomics data…..

 

Do you think I was born yesterday?

This is my question for PLoS Biology today.  You may recall some time ago I spoke out regarding what appeared to be data of questionable integrity in a paper at the journal. Last month I provided a follow-up to coincide with the 6 month anniversary of the paper’s publication, highlighting a complete lack of action taken in response to my findings. Well, today I got the following email from the editor…

Dear Dr. Brookes,
I am writing to let you know that we have issued a formal correction for the manuscript entitled “Effects of Resveratrol and SIRT1 on PGC-1α Activity and Mitochondrial Biogenesis: A Reevaluation” by corresponding author John Holloszy, published in PLOS Biology. The authors inadvertently used the wrong blots in three of the main figures resulting in duplications. You can follow this link to see the correction: http://www.plosbiology.org/annotation/listThread.action?root%3D78137
Thank you for contacting us regarding these issues.
Sincerely

The correction itself reads as follows:

The authors inadvertently used the wrong blots in three of the main figures resulting in duplications. The blots in question are PGC-1a and LCAD in Figure 1A, Cyto C in Figure 4C, and COXIV in Figure 6B. The authors and their institution have confirmed that these errors do not affect the interpretation of the results or conclusions of the paper. The authors apologize for any confusion caused by these mistakes and thank the readers for alerting them to these issues.
Please view the correct Figure 1 here:
http://www.plosbiology.or…
Please view the correct Figure 4 here:
http://www.plosbiology.or…
Please view the correct Figure 6 here:
http://www.plosbiology.or…

Ignoring the fact that the correction was issued a week ago, and they only just saw fit to tell me about it, I guess that clears everything up then!

Except…
(1) The problem with Figure 4A was not duplication, but splicing. Some panels of a blot were spliced (in one case, every lane of the blot) but the supposed loading controls were not spliced.  As such, these controls could not possibly have arisen from the same gel or membrane and are therefore not proper loading controls.  This issue is completely un-resolved by the correction.

(2) The correction claims that the authors “inadvertently used the wrong blots in three of the main figures”.  To me, “wrong blot” implies the whole blot. Not part of a blot. Not one lane of a blot. The whole thing. In the case of Figure 6 this may be an adequate explanation – the entire blot (all 4 lanes) was wrong, so the whole thing has been replaced in the correction. BUT, in the case of Figure 4C, the substitution was not a whole blot, it was a single lane of a blot. How exactly does one “inadvertently” use the wrong band in a blot?

Just to be clear, the phrase “inadvertently used the wrong blots” is completely inadequate as an explanation for what actually gave rise to the figure – duplication of a single band between blots via un-disclosed splicing.

In the case of the LCAD blot in Figure 1, again it was not simple duplication of an intact blot – the lanes were shifted over by one position. Is “used the wrong blots” an adequate explanation for the practice of using the same blot panel and shifting the lanes over by one position (not to mention adjusting the saturation) to represent different experiments? Is “duplication” an adequate explanation for what happened here, given the manipulation that took place to give rise to the different images?

(3) The corrected images are interesting to say the least.  Each image is presented as a TIF, which is not particularly problematic, until you look at the horizontal/vertical alignment.  To do this, open a Figure (say Fig. 4), hold down the CTRL key and roll the mouse wheel to magnify the image.  Then scroll vertically so one of the features which should be horizontal (e.g. a graph x-axis) is aligned with the upper or lower boundary of your computer window.  Notice anything funny?

That’s right – these images are not aligned, as they would be if they’d been generated from a software package such as PowerPoint or Illustrator. I did this for the graph at the bottom of Fig. 4, to demonstrate…
Fig4Align
…but it’s the same for every part of the Figure, including the bottom edges of all the blot images. How did these figures come to be rotated ever so slightly?  My speculation is that they were scanned in from a paper copy at some point, and someone mis-aligned the paper in the scanner.
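
For anyone who prefers numbers to eyeballing, a tilt like this can also be quantified in a few lines of Python. The sketch below is just an illustration, not the analysis I actually ran: the filename is hypothetical, and OpenCV’s Hough transform is only one of several ways to do it. It finds long, roughly horizontal edges in the figure (graph axes, the edges of blot boxes) and reports their median angle; a figure assembled in PowerPoint or Illustrator should come out at essentially zero, whereas a skewed scan shows a consistent non-zero tilt.

```python
# Minimal sketch: estimate the tilt of near-horizontal features in a figure image.
# The filename is hypothetical; any raster export of the published figure will do.
import cv2
import numpy as np

img = cv2.imread("plos_fig4_correction.tif", cv2.IMREAD_GRAYSCALE)
edges = cv2.Canny(img, 50, 150)

# Detect long straight segments (axes, blot box edges, lane borders)
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=100,
                        minLineLength=200, maxLineGap=5)

angles = []
if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        angle = np.degrees(np.arctan2(float(y2 - y1), float(x2 - x1)))
        if abs(angle) < 10:  # keep only near-horizontal features
            angles.append(angle)

if angles:
    print(f"median tilt of near-horizontal features: {np.median(angles):+.2f} degrees")
else:
    print("no long horizontal features found")
```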

So here’s the problem… Why, when the integrity of these data was being questioned, would the authors choose to use such an arcane method of figure preparation which raises more questions than it answers?

Why did it take 7 months for the journal to do anything about this? Do they honestly expect their readers to accept such a totally inadequate explanation from these authors? Do they really expect us to buy that this “does not affect the interpretation of the results or conclusions of the paper”? Do they think I was born yesterday?

Naturally, a response email calling out the above issues has already been sent, and posts to PubPeer, PubMed Commons and the PLoS Biology page itself, are in progress.

Our broken academic journal corrections system

This is a long read, so here’s the short version: In 2012, some problems came to light in three papers with a common author. One journal retracted its paper, albeit after much effort and with an opaque retraction notice. Another acknowledged the problems but refused to act and is now incommunicado. At the third journal there is reason to suspect the investigation may have been compromised by conflict-of-interest. Combined, these events do not speak well of the manner in which journals handle these affairs. What’s more, COPE guidelines appear to encourage such practices. Oh, and just to complicate things, the author of the papers threatened to sue me. There is no happy ending.

Disclaimer and motives: I can’t get into the potential reasons why the data in these papers appear the way they do. All I can do is report on what’s out there in the public domain, and offer my own (first amendment protected) opinion about why these images appear odd to me. It should also be clear that in discussing these papers, I’m not drawing any parallels to online discussions that may have happened about them in the past. The opinions expressed here are my own (***see also bottom of page).

So why do this?  Well, I don’t like mistakes, and assuming everything I’m about to show you is the result of honest mistakes, that means: (i) journals are not doing their jobs as gatekeepers of reliable scientific information, (ii) people are not proof-reading their manuscripts before submission, and (iii) poor data management is being rewarded with grants, publications and career advancement. Because success in academic science is a zero sum game, when people who make mistakes get ahead, others are deprived of success. That’s not fair.

How it started: Alleged problems in the data of three papers from Gizem Donmez at Tufts University came to light in fall 2012:

J. Neurosci. 2012;32:124-132; PMID 22219275
J. Biol. Chem. 2012;287:32307-32311; PMID 22898818
Cell 2010;142:320-332; PMID 20655472

The problems appeared to involve undisclosed splicing together of western blot images, and in some cases apparent re-use of western blots (or portions thereof) across different figures. Detailed descriptions are available at PubPeer (J Neurosci / J Biol Chem / Cell) for those interested. I have been involved for over a year in trying to ensure the journals deal with these alleged problems appropriately. What follows is a summary of what happened in each case. (Note – some email content is redacted for brevity, but full transcripts can be provided on request)…
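
Before getting into the individual journals, a quick aside on how such similarities can be checked. The sketch below is only an illustration (the filenames are hypothetical, and nothing here substitutes for examining the original uncropped images): crop the two bands in question, scale them to the same size, and compute a normalized cross-correlation. Genuinely independent blots rarely correlate anywhere near 1.0.

```python
# Minimal sketch: compare two cropped band images by normalized cross-correlation.
# Filenames are hypothetical; the crops would come from the published figure panels.
import numpy as np
from PIL import Image

def norm_corr(a, b):
    """Pearson correlation between two equally sized grayscale arrays."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

band1 = Image.open("fig4A_band_crop.png").convert("L")
band2 = Image.open("fig5C_band_crop.png").convert("L")

# Re-used panels are often rescaled when figures are assembled,
# so resize the second crop to match the first before comparing.
band2 = band2.resize(band1.size)

r = norm_corr(np.asarray(band1, dtype=float), np.asarray(band2, dtype=float))
print(f"normalized cross-correlation = {r:.3f}")
```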

What happened at J. Neurosci. Seven months after my initial contact, Editor-in-Chief John Maunsell responded:

The SfN requested institutional investigations [...] institutions produced detailed reports that included the original images used to construct the figures. The images clearly documented that the figure elements that appeared to be replicates in the article in J. Neurosci. were in fact from different gels. Because there was no evidence that any results had been misrepresented, the SfN has dismissed the case. We consider the matter closed.

I protested, asking why the issue of undisclosed splicing was not resolved. I also questioned whether the journal had actually seen the originals or was simply relying on the institutional report. This was the response:

Your concerns about the spliced gels did not escape our attention.  We did not take action for two reasons. First, the image manipulation policy of J. Neurosci. was only instituted in December 2012, after this article was published. While it is regrettable that we did not have a more rigorous explicit policy in place sooner, we cannot impose retroactive policies on published articles. Second, while the splicing misrepresented details of the experiment design, the institutional investigations left us with no reason to believe they misrepresented any of the scientific findings of the study. While there are good reasons why authors should not manipulate images in this way, the figures appear to faithfully reproduce the results obtained, and would not impede anyone from replicating the author’s results. Because none of the scientific findings were misrepresented, a correction would provide no value for readers of the article.

Refusing to retroactively apply a policy that every other journal had in place years ago seems rather odd, as does the assertion that a correction would provide no value. Anyway, I let it go for a while, then wrote back again in fall 2013, notifying them that the JBC paper had been retracted (see below) and that the lead author had threatened to sue me. The journal responded:

We are unprepared to reopen this case.  The events surrounding other publications and your interactions with Dr. Donmez are immaterial for this matter. Your assertion [...] is misguided. I acknowledged these images look very similar, but the institutional investigation included examination of the original, high-resolution, uncropped images, and the material presented to us** showed the similarity was a remarkable coincidence.  I understand that you and others will find that surprising, but there is no arguing with the originals. We consider this matter closed.

** It is unclear if the journal has actually seen the originals, or is merely relying on what has been presented to them by the institution. So I wrote back again, this time documenting a further analysis of the data, including the following images:

Donmez J Neurosci 6A5A
Donmez J Neurosci 4A5C4C
Donmez J Neurosci 5E

While it’s technically feasible those images originated from different experiments, it would represent a remarkable coincidence!  Maybe I should go buy lottery tickets right now? Anyway, it doesn’t matter because the journal didn’t respond to my email. Four months later they’re still ignoring me.

What happened at J. Biol. Chem.  JBC‘s dedicated ethics officer Patricia Valdez has been a model citizen in her interactions with readers and in forwarding the journal’s agenda of increased transparency. That being said, things got interesting in May 2013 when JBC published a correction to the questioned paper. The correction acknowledged that some western blots in the paper had been subjected to splicing, and replacement figures were provided with the splicing seams clearly indicated by solid lines. The correction stated that the splicing “in no way affects the conclusions of the paper or the original interpretation of the results”. If only it were that simple…

Straight away it came to my attention that, while acknowledging the splicing, the correction failed to address the issue that images on each side of the splicing seams appeared similar. So, I wrote to JBC with a detailed analysis of the corrected images, including the following:

JBC Corr for web 1
JBC Corr for web 2
JBC Corr for web 3
JBC Corr for web 4
JBC Corr for web 5

After walking Ms. Valdez through the images in a ‘phone conversation, she agreed that things deserved another look. She also indicated by email that the initial decision to correct was based on an institutional report (sound familiar?). After several follow-up communications the journal issued a retraction notice on July 24th 2013. Here’s the full text of that notice:

This article has been withdrawn by the authors

Move along, nothing to see here! I won’t speculate on the reasons why JBC broke protocol and permitted an opaque notice. However, I will note that this occurred on the same day that Dr. Donmez threatened to sue me. In addition, Diane Souvaine (Dean at Tufts) stopped answering my emails around the same time.

What happened at Cell. The journal didn’t respond to my original contact in fall 2012, so I tried again in fall 2013 (yes, thank you, I have a day job that keeps me busy, hence the delay). In my email I emphasized the importance of avoiding conflict-of-interest, because the senior author on the Cell paper (Leonard Guarente at MIT) sits on the editorial board of the journal. A week later, Editor-in-Chief Emilie Marcus responded that they’d look into it. Nothing was heard for 2 months, then last week I received the following email:

Dear Dr. Brookes,
Thank you for your e-mails. In addition to having been informed of the results of the institutional investigation, we have also examined the implicated figure panels editorially. Despite some apparent superficial similarities, upon extensive examination we were unable to find any compelling evidence for manipulation or duplication in those panels and therefore are not taking any further action at this time.
Best wishes,
Sri Devi Narasimhan, PhD, Scientific Editor, Cell

Having not seen the name before, I looked into the background of Dr. Narasimhan. It turns out she came from the lab’ of Heidi Tissenbaum at UMass. Guess where Dr. Tissenbaum did her post-doc! Can you say conflict-of-interest?

Despite being specifically warned about conflict, Cell put someone who is a scientific descendant of the paper’s senior author in charge of the investigation. The response also offered no details on the tools used to reach these conclusions, and no mention of whether original images were requested. Furthermore, the response fails to address the undisclosed splicing.

I responded, raising these issues and CC’ing the Committee on Publication Ethics (COPE). Despite additional ‘phone calls and even leaving my cellphone number, a week later I’ve received no response from Cell. So, in the interests of completeness, here’s another look at the data, plus a few more apparent problems I found along the way…

Cell 2nd eval 1
Cell 2nd eval 2
Cell 2nd eval 3
Cell 2nd eval 4
Cell 2nd eval 5
Donmez Cell 2nd eval 4B
Cell 2nd eval 6
Cell 2nd eval 7
Cell 2nd eval 8

Cell hasn’t seen this re-analysis yet, so there’s no word on whether they’ll re-open this case after reading the above. I’ll update if that changes.

What does (the) COPE say? I’m generally a fan of the Committee on Publication Ethics, although they don’t seem to have strong enough teeth to get journals to behave. In response to being CC’ed on my email to Cell, they replied:

Thank you for your email and telephone call [...] COPE cannot investigate individual cases but we do hear concerns about member journals. More information and how to follow this process can be found here.

At the link, I learned that official complaints must state which part of the COPE code of conduct has been breached. In that code, editors are advised to “inform readers about steps taken to ensure submissions from members of the [...] editorial board receive an objective and unbiased evaluation”. The Cell case appears to be a breach of that code, but requiring a formal complaint to be filed places an unnecessary burden on the reader (me). Why can’t COPE simply act on the evidence they’ve already received?

Furthermore, the COPE guidelines for how editors should handle such issues recommend deferring to the authors’ institution for a full investigation. If the institution says everything is OK, the journal should run with it. This policy ignores any potential conflict-of-interest at the institution itself (e.g., indirect costs on faculty grants), which to me seems like a rather large hole in the system… Let’s put the people who stand to lose big heaps of money in charge, and hope they do the right thing.

What about the legal issue, and what next? In dealing with these papers and the ensuing legal issues, I’ve spent a lot of my own money. That’s money no longer in my kids’ college savings account. I also seriously doubt I’ll be reimbursed by the journals for the significant time I spent doing image analysis – analysis which they should have done during peer review.

There is no happy ending – just a refusal to act, an opaque retraction, and a conflicted investigation. Meanwhile, an extremely talented junior investigator from a world-class institution continues to secure high-profile grants and, with help from a good lawyer and the powerful Harvard/MIT/Tufts network, is all set for a stellar career. Let’s hope nobody looks at her earlier papers.

_______________

*** If you think anything written above is not a fair representation of the truth, I am open to discussion by email. However, if you wish to challenge the above account, please provide specific evidence to the contrary, not simply opinions/threats.