Had an interesting situation come up the other day. Some of you who have been on the site a while might know that I'm in the medicine biz, as in creating new medicines. One of the things I do on the side for extra cash is help some Wall Street types decide whether they should put money into a molecule that a company is looking to develop (if I have a superpower, it's smelling bullshit). Another thing I'm known for in some circles is being very cynical about science. I think the "reproducibility crisis" isn't due to a bunch of complicated factors, but to the simple factor that people publish bullshit way more often than not. I think most of the bullshit is biased data more so than fraudulent data (soft vs hard corruption, say).

But yesterday, as I was reviewing a diligence package on a company, I found a figure that looked familiar to me, but not entirely. So I cross-referenced the recent papers from the scientist whose lab the work came from, and there it was: the figure in the paper contained manipulated data... Not something like the results of one molecular test that an underling could have messed up, but a complete misrepresentation of the work that was done. Needless to say, I called my people and said: this guy's a liar, don't ever talk to him again.

But there's a bigger issue here, too, which is that the knowledge I have could ruin this guy's career (he's a decently well-known guy in the field, though not a really famous scientist). If NIH knew what I know, he would be blacklisted from getting a grant for a long time, maybe forever. I'm bound by an NDA not to blab, and I won't, because my career matters more to me than the satisfaction I'd get out of blasting this dude. But man, it's gonna weigh on me for a while. Science can't proceed by fraud, and yet there's so much suspected fraud in the field that you want to help when you can. But I'm powerless here. Really interesting situation for me (but probably nobody else!).
I've been involved in science for many years, and while I've suspected fraud plenty of times, this is the first time I've actually discovered hard proof with my own two eyes. So it's basically new territory for me. The hard part is that the false data are in the public domain, whereas I only saw the complete data as part of a confidential diligence package, which means there isn't much I can do. Were I in a safer position, I might have more to think about.
That is an interesting situation. I believe I am responsible for a 'corrigendum' to this article: https://pubmed.ncbi.nlm.nih.gov/31542391 (Figure 5 looked suspect). I haven't looked into the corrigendum enough to decide if I buy it.
This was that crazy photonegative figure you mentioned in chat! The correction shows one figure depicting intensity change rather than before-and-after values, and that figure now seems to match the change between the before and after values. Using the raw percent change (not log₂), the pattern mostly matches, though I can't distinguish between a 400% increase and a 141% increase in the new Figure 5.

From the correction: "In the article, we showed data after transforming the Day 0 and Day 14 values within subjects as %. Based on the advice of colleagues and readers, we show data and statistics without that transformation and show the raw data in the following Figures and Appendix. We present analyses of these data conducted and/or reviewed by professional statisticians (see acknowledgements)."
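If it helps anyone lining up the old and new presentations, converting a raw percent increase into the log₂ fold change the original figure used is a one-liner (a quick sketch of my own; the 400% and 141% values are just the examples above, not numbers pulled from the paper):

```python
import math

def pct_to_log2_fold_change(pct_increase: float) -> float:
    """Convert a raw percent increase (e.g. 400 for +400%) into a log2 fold change."""
    return math.log2(1 + pct_increase / 100)

print(pct_to_log2_fold_change(400))  # 5x fold change    -> log2 ≈ 2.32
print(pct_to_log2_fold_change(141))  # ~2.41x fold change -> log2 ≈ 1.27
```

So those two increases sit more than a full log₂ unit apart, even if they're hard to tell apart by eye on a raw-percent axis.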
Just saw this post. Does this guy work for a university or research institution? If so, there should be whistle-blower channels where you can flag this anonymously. It might take some effort, but I'm with you that research integrity needs to be treated as close to sacrosanct.
Public university. Lots of NIH money. I think I can describe what happened in better detail without crossing a line.

Basically, the guy published the results of an experiment (a drug trial in an animal model of a disease), which he reported as a several-day experiment and described as such in the methods. When I got access to the full data, the experiment was actually a several-week experiment. The "statistically significant" effect he observed was an obvious random occurrence, as there was no indication of any between-group differences at any other time point. So what happened, obviously, is that they just decided to publish the experiment as if it had always been planned to end at the time point he reported. That might not seem so bad to people outside the field, but in this business it is considered straight-up lying, no wiggle room, no grey area--just plain lying.

When you run drug trials, you always have to specify what you are going to measure before the experiment takes place. The reason for that is that you can calculate, based on some assumptions about the data, what the a priori odds of a "significant" finding are. The trouble with not doing that is that you can measure a whole bunch of stuff and then look for things that appear different between groups, which there are bound to be if you make enough measurements. Then you can calculate what the a priori odds of that finding would have been had it been the planned measurement all along, and say, "Great! There's only a 1% chance of making this finding by chance, so it must be true." But it isn't. It's like throwing 5 heads in a row and then convincing yourself that the coin is loaded, because there's only a 3% chance of that happening, so it meets the p-value requirement. But there's a 3% chance of any given series of 5 coin flips, and one of them has to happen. (See the quick simulation sketch at the end of this comment.)

It's offensive to me that the dude is out there getting grant money based on this horseshit. But the grant game is fucked anyway, so whatever. What's really annoying to me is that he's out there raising drug money for this. Like, you want to waste millions of dollars and thousands of hours of time just to boost your ego a little? It doesn't make any sense to me. What kind of an ego gets off on an obvious lie? Where is the accomplishment to be proud of? I guess the real problem is with science. The incentives are set up to make people lie, so I suppose none of us can fully resist that force at some point.

Edit: I should point out that I emailed the potential financier about this, and he just straight-up forwarded it to the scientist in question, doing nothing except redacting my name and editing out the part where I called it "fraud." So if this were ever raised with his research office, it would be very obvious where it came from.
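To make the multiple-comparisons point concrete, here's the quick simulation sketch I mentioned above (purely illustrative: the group sizes, number of time points, and assumption of zero real drug effect are made up, not taken from the actual study). If you test for a group difference at every time point and keep the best-looking one, the chance of a "significant" p < 0.05 somewhere is far higher than the nominal 5%.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n_experiments = 10_000  # simulated trials, none with a real effect
n_per_group = 10        # animals per group (made-up number)
n_timepoints = 8        # e.g. weekly measurements over a several-week study

false_positives = 0
for _ in range(n_experiments):
    # Drug and control groups drawn from the same distribution: no true effect.
    drug = rng.normal(size=(n_timepoints, n_per_group))
    control = rng.normal(size=(n_timepoints, n_per_group))

    # Test every time point and keep the smallest p-value (the p-hack).
    pvals = [stats.ttest_ind(drug[t], control[t]).pvalue for t in range(n_timepoints)]
    if min(pvals) < 0.05:
        false_positives += 1

print(f"'Significant' at some time point: {false_positives / n_experiments:.1%}")
# With 8 independent looks this lands around 1 - 0.95**8 ≈ 34%,
# versus the 5% you'd expect if the time point had been fixed in advance.
```

In a real repeated-measures study the time points would be correlated within animals, so the inflation would be somewhat smaller, but the fix is the same either way: specify the endpoint and time point before you look at the data.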
Since you know exactly where the error/lie is in the data, you can whisper to someone else, "gee, isn't that table on page 9 odd? That third number in the second column seems off to me. Someone should mention that..." A wink would probably be a nice closer. Then someone in the field can call out the fraud, and it's not YOU.