Top 3 bizarre (and wrong) scientific misconducts

  • Qian-Chen Yong
  • August 26, 2017

In my previous posts, I discussed some examples of wrong ways to carry out an experiment or analyze data. Disclaimer: the examples below are all from my personal experience, meaning I have talked to people who genuinely thought that what they were doing was correct, or that what they were told to do was correct or not a big deal. I will not disclose the identity of anyone involved in these wrongdoings.


The most common form of data manipulation is probably cherry picking. I once encountered a researcher who thought it was okay to disregard some of the original raw data, without legitimate justification, to make the overall reported data statistically significant. This researcher carried out an experiment with 12 mice in each of the control and test groups but, on the instruction of his mentor, reported only the 3-5 animals that showed the data they wanted to see.

I believe everyone in biomedical research will agree that you will see outliers in almost any kind of experiment. To me, data may only be discarded with proper justification, such as knowing that something definitely went wrong in that particular experiment. For example, if the positive control did not respond as expected, the data for the subjects tested alongside that positive control in the same experimental batch should probably be discarded. Another kind of outlier that can be cautiously removed is one that lies in the same direction as the remaining data but is extreme enough to inflate the standard deviation and make the statistical analysis lose its significance. Even then, when deleting data you should make sure you still have at least another 4-6 data points remaining.

All in all, you need legitimate reasons to remove outliers. If you remove data simply to make the result statistically significant, you are committing scientific misconduct. You are cherry-picking the data you want to see, and this is WRONG!
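To make the bias concrete, here is a minimal sketch. The numbers are invented for illustration (they are not the actual mouse data from the story); it simply shows how reporting only the most favorable animals out of 12 exaggerates an apparent effect:

```python
import statistics

# Hypothetical measurements for 12 treated mice (values invented;
# imagine the control group averages around 5.0, i.e. no real effect).
treated = [4.6, 4.8, 4.9, 5.0, 5.0, 5.1, 5.1, 5.2, 5.3, 5.6, 5.8, 6.0]

full_mean = statistics.mean(treated)       # honest estimate, n = 12

# Cherry picking: keep only the 4 animals that "showed the data we want".
cherry = sorted(treated)[-4:]              # the 4 highest values
picked_mean = statistics.mean(cherry)      # inflated estimate, n = 4

print(f"all 12 mice: {full_mean:.2f}")
print(f"cherry-picked 4: {picked_mean:.2f}")
```

The cherry-picked subset reports a noticeably higher mean from the very same experiment, which is exactly why selective reporting without justification is misconduct.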
Here I’d like to talk about the top 3 bizarre cases of scientific misconduct I have encountered:


1) First of all, I have heard people say, “for in vitro experiments, we run qPCR in quadruplicate (each sample is run across 4 wells); thus, n = 4.” Usually, researchers run the same qPCR sample in duplicate or triplicate to control for variation between reactions (personally, I prefer duplicate to save reagent, but I recommend triplicate). Nonetheless, a postdoc I know was told that one single sample could be counted as 4 different samples simply by running it in quadruplicate. So they kept repeating the cell experiments and running the qPCR in quadruplicate until they got the desired results; once they obtained the “expected” qPCR data, they stopped and reported it. I cannot emphasize enough how wrong this is. One sample from one individual experiment counts as a sample size of n = 1. Period. No matter how many technical replicates you run. For in vitro experiments, most peers seem to agree that 3 individual experiments are sufficient, provided you can repeat the experiment without much failure. If you need to run 6 experiments to get the 3 data sets you want to see, then you probably need to keep optimizing your experiment instead of stopping there and reporting 3 experiments out of 6, because that is cherry picking!
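The point above can be sketched in a few lines. This is a hypothetical illustration (the Ct values and sample names are invented): technical replicates from the same sample get averaged into one value, so the sample size is the number of biological samples, never the number of wells:

```python
import statistics

# Hypothetical Ct values from a qPCR plate: each biological sample is
# pipetted into 3 wells (technical triplicate). Well-to-well replicates
# control for pipetting/reaction variation; they are NOT independent samples.
plate = {
    "mouse_1": [24.1, 24.3, 24.2],   # one biological sample, 3 wells
    "mouse_2": [25.0, 24.8, 24.9],
    "mouse_3": [23.7, 23.9, 23.8],
}

# Each sample's technical replicates collapse into a single Ct value...
ct_per_sample = {s: statistics.mean(wells) for s, wells in plate.items()}

# ...so the sample size is the number of biological samples, not wells.
n = len(ct_per_sample)   # n = 3, even though 9 wells were run
```

Running the same sample in quadruplicate instead of triplicate would change nothing here: it is still one dictionary entry, still n = 1 for that sample.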


2) Another misconduct is creating a fake email account and serving as a reviewer for your own papers. I believe this is pretty well known now. The first time I heard about it, from one of my colleagues, was before the large-scale retractions were reported, and I was shocked; I did not know the system had been abused so badly. The good thing, however, is that most journals will now choose reviewers based on their own searches rather than relying on the reviewers suggested by the authors. A similar and pretty common situation is when the reviewer is a good friend of yours who is too busy (or too kind) to review the paper, so they pass the ball back and ask you to review it yourself. Of course the manuscript then stands a good chance of receiving a minor revision and being published quickly, because you are one of the reviewers. This is definitely wrong, and definitely a kind of scientific fraud.


3) Thirdly, I have seen that when a group of data is close to being, but is not actually, statistically significant, some believe it is okay to add the group’s mean value to the data set as an extra point to make the result statistically significant. For example:
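Here is a hypothetical numerical sketch of the trick (the values are invented, and I assume a two-group comparison using Welch’s t statistic): appending the group’s own mean as a fake extra “measurement” leaves the average untouched, but it raises n and shrinks the standard error, inflating the apparent significance.

```python
import math
import statistics

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    se = math.sqrt(statistics.variance(a) / len(a) +
                   statistics.variance(b) / len(b))
    return (statistics.mean(b) - statistics.mean(a)) / se

# Invented example data for a control and a treated group (n = 6 each).
control = [5.1, 4.8, 5.3, 4.9, 5.0, 5.2]
treated = [5.6, 4.9, 6.1, 5.8, 4.7, 5.5]

# The manipulation: append the group's own mean as a fake 7th data point.
padded = treated + [statistics.mean(treated)]

# Group mean is unchanged, but the standard error drops, so the t
# statistic grows even though no new experiment was ever performed.
print(statistics.mean(treated), statistics.mean(padded))    # same mean
print(welch_t(control, treated), welch_t(control, padded))  # t increases
```
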


Notably, this kind of manipulation does not change the average, which is why the particular student I mentioned thought it was okay to do. Again: NO NO NO NO! This is data manipulation, because the extra data point comes from an imaginary experiment that was never done. This is falsification! Fabrication!


In conclusion, the aforementioned ways to manipulate data are some of the most memorable and surprising I have seen. I am sure there are many other ways to manipulate data that have surprised you before; please share them with us in the comments.


Some of the trainees who committed these acts of misconduct were told by their mentors that they were common and acceptable practices. This is crazy! I hope those mentors were lying when they said these practices are common, but maybe this is exactly why so much published work is not reproducible. I still remember a PhD student I once met who told me, “do not trust any data coming out of my lab.” That showed me how vulnerable a PhD student is to being pressured by a mentor into doing things they are uncomfortable doing. Luckily, a few years later, scientific misconduct from that particular lab was exposed by Retraction Watch and the mentor resigned. Good for us!


All this said, scientific discoveries will definitely be a lot more reliable if we all do our part: review our own mentors on QCist.com and warn prospective trainees if a mentor has problems with scientific integrity. On the other hand, we should also definitely promote our mentors if they are good scientists! Let’s start by reviewing our mentors today!


3 responses to “Top 3 bizarre (and wrong) scientific misconducts”

  1. I was a grad student/post-doc in a “highly reputed” lab for several years. I was instructed to cherry pick data, I was told not to run experimental validation procedures, and I was “accused” of being a perfectionist. All I wanted to do was produce scientifically valid, reproducible results that would contribute to the scientific basis of disease. The worst thing for me was that my entire project was based on an experiment (done before I came on the scene) that was analyzed by “eyeballing” the data from multiple images (“replicates”) of one or two animals. By the time I realized the PI had been reckless in interpreting that “pilot” data, it was too late; I was nearly done with my work when I found out. My data were obtained properly, and their interpretation led to my conclusion (based on what is known in the published literature) that the experiments should never have been done in the first place. I did not play along with the game of finding what I was supposed to find, and I suppose they were embarrassed. I was not allowed to publish, and was hazed into not defending my dissertation. I was not allowed to salvage the experiment by following some interesting leads. All papers from that lab (over 100 papers) are suspect, based on what I have seen. It’s just appalling.

    • Dear Lori, you’re definitely not alone. The PhD student I mentioned in the article had almost exactly the same problem as you. But she decided to play along, and I can’t really blame her: if she hadn’t, she wouldn’t have defended her dissertation and received her PhD degree. I just hope that you’re doing really well now and that your life has not been badly affected by what you experienced in that “reputable” lab. This is exactly the mission of QCist.com: to direct young, bright trainees who are still passionate about science to good labs. We have all experienced the dark side of scientific research that crushed our passion for science. Let’s work together to help future trainees by reviewing our mentors now and telling others about QCist.com. Be diplomatic when you review a bad mentor, to protect yourself; we will understand what you’re trying to say.

  2. I absolutely love your blog and find nearly all of your posts to be precisely what I’m looking for. Do you offer guest writers the chance to write content for you? I wouldn’t mind composing a post or elaborating on some of the subjects you write about here. Again, awesome site!
