Op-Ed | Free access | 10.1172/JCI84722

Dishonesty in scientific research

Nina Mazar1 and Dan Ariely2

1Rotman School of Management, University of Toronto, Toronto, Ontario, Canada.

2Center for Advanced Hindsight, Duke University, Durham, North Carolina, USA.

Address correspondence to: Nina Mazar, Rotman School of Management, University of Toronto, 105 St. George Street, Toronto, Ontario M5S 3E6, Canada. Phone: 416.946.5650; E-mail: nina.mazar@utoronto.ca.


Published in Volume 125, Issue 11 on November 2, 2015
J Clin Invest. 2015;125(11):3993–3996. https://doi.org/10.1172/JCI84722.
Copyright © 2015, American Society for Clinical Investigation
Abstract

Fraudulent business practices, such as those leading to the Enron scandal and the conviction of Bernard Madoff, evoke a strong sense of public outrage. But fraudulent or dishonest actions are not exclusive to the realm of big corporations or to evil individuals without consciences. Dishonest actions are all too prevalent in everyone’s daily lives, because people are constantly encountering situations in which they can gain advantages by cutting corners. Whether it’s adding a few dollars in value to the stolen items reported on an insurance claim form or dropping outlier data points from a figure to make a paper sound more interesting, dishonesty is part of the human condition. Here, we explore how people rationalize dishonesty, the implications for scientific research, and what can be done to foster a culture of research integrity.

The role of motivated reasoning and conflicts of interest

Ethics research, including our own work, has shown that dishonesty is not about good versus bad people; rather, it is primarily about conflicts of interest and motivated reasoning (1). Dishonest actions penetrate the most mundane of situations and are committed by ordinary people who have moral standards and think of themselves as honest, exemplary members of society. Yet, when facing a conflict-of-interest situation in which they are tempted to give in to selfish motives by crossing the boundaries of what they usually consider morally acceptable, people can find “perfectly valid” reasons for these actions. Such motivated reasoning allows people to stretch moral boundaries without feeling bad about themselves or even registering the immorality of their conduct.

How do we study dishonesty?

Throughout the years, we have gathered a vast amount of evidence for this deeply human condition. We study dishonesty with various conflict-of-interest tasks (1). One such task is the matrix task, in which participants are presented with a test sheet of 20 matrices, each consisting of 12 numbers. Participants’ task is to find and circle, in each matrix, the two numbers that add up to exactly 10 (e.g., 1.53 and 8.47). For each correctly solved matrix, we pay $0.50. After five minutes, participants count the number of correctly solved matrices on their test sheet and write down their performance on a collection slip. In our baseline control condition, participants submit both the test sheet and the collection slip. The experimenter then checks each participant’s performance and pays them accordingly. In our treatment condition, we instruct participants to shred their test sheet and submit only the collection slip to the experimenter. In this latter condition, participants face a conflict of interest at the end of the task: they know that they can either be honest or overstate their performance to earn more money. Several versions of this experiment, with thousands of participants in various countries, have consistently found that, despite random assignment to these conditions, participants “solve” more matrices in the conflict-of-interest condition, which is evidence of dishonesty. Interestingly, however, despite theoretically being able to claim to have solved all 20 matrices and get away with it, people rarely cheat by the maximum amount. On average, they cheat by only 2 to 3 matrices. Finally, these results are never driven by a few bad apples who cheat by the maximum possible amount while everyone else is honest. Instead, we find that almost everyone cheats, but only by a limited amount.
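For readers who want to see the task’s structure, the following sketch generates one such test sheet. It is our illustration, not the authors’ original materials; the constants (20 matrices, 12 numbers each, a target sum of exactly 10) come from the description above, while the number range and two-decimal rounding are assumptions.

```python
import random

def make_matrix(n_numbers=12, target=10.00):
    """Build one matrix: n_numbers values containing exactly one pair
    that sums to the target (e.g., 1.53 and 8.47)."""
    a = round(random.uniform(0.01, 9.99), 2)
    numbers = [a, round(target - a, 2)]  # the single valid pair
    while len(numbers) < n_numbers:
        x = round(random.uniform(0.01, 9.99), 2)
        # Accept x only if it creates no second pair summing to the target.
        if all(round(x + y, 2) != target for y in numbers):
            numbers.append(x)
    random.shuffle(numbers)
    return numbers

test_sheet = [make_matrix() for _ in range(20)]  # one 20-matrix test sheet
```

In the shredder condition, the self-reported count on the collection slip is the only surviving record of performance, which is exactly what creates the conflict of interest.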

The idea behind this limited amount of dishonesty is that motivated reasoning works only within certain boundaries. In other words, if people cheat too much, it becomes too hard to rationalize away the misconduct and still feel good about their own morality. Thus, it is important to understand what makes it easier or harder to engage in motivated reasoning. We found that when we give participants tokens for each correctly solved matrix, which are then exchanged for $0.50 each, cheating significantly increases. Why? Because the consequences of people’s actions are now less direct, and this indirectness facilitates motivated reasoning.

The good news is that we have evidence that moral reminders, in the form of, for example, the Ten Commandments, or commitments such as signing an honor code, impede motivated reasoning and thus dishonesty (2). The key aspect of such reminders is their timing. For example, people often are asked to confirm the veracity of self-reports, such as income tax or insurance claims, at the end of a form, after they have already faced the conflict of interest and likely engaged in some amount of dishonesty and motivated reasoning. We have shown in our work, for example, that a small rearrangement of the honor code on an insurance form, moving it from the end to the beginning, significantly decreased dishonest self-reports by policy holders (see What we know about dishonesty).

Implications for biomedical research

In research and in the practice of medicine, there are many opportunities for conflicts of interest and motivated reasoning. These opportunities are not due to bad people; they are simply a by-product of how the academic research system and the rewards for certain kinds of research outcomes have been constructed (just as there were opportunities due to the structure of the financial system and bonuses for certain kinds of investment gains). No system design is perfect, and selfish motives are an evolutionary fact. This is why society has developed a system of moral values and principles of conduct that are taught to children from the moment they are born.

For example, the academic research system rewards statistically significant research findings with prestigious publications, grants, and promotions. Statistically nonsignificant research findings, on the other hand, are almost entirely disregarded, despite the fact that we sometimes learn more from them. Consequently, the system sets up a conflict of interest when, after thousands of dollars of research funding and hundreds of hours of work, one faces null effects (3). Tampering with data and misreporting experimental procedures and results seem like severe and rare reactions, but even the tendency to underreport negative and overreport positive data, which may appear less severe and therefore more acceptable, is a troublesome practice with potentially harmful consequences for the biomedical research community.

The temptation toward bias in research reporting can extend beyond underreporting negative and overreporting positive data to the selective reporting of results that conform to the hypothesis being explored. Researchers naturally become invested in their own proposed model, and there are many opportunities to disregard data that do not fit or to selectively show results that validate the model, even if they do not represent a typical outcome. Particularly in preclinical research, in which sample sizes are often small, it may be tempting to focus only on findings that corroborate the hypothesis. In addition, preclinical studies are not “preregistered” the way that clinical studies are, which means researchers do not have to declare their primary endpoints before the study begins. This allows them to be much more fluid about how the study is designed and what outcomes are reported. With limited research dollars, pressure to produce positive results, and increasingly in-depth studies, the opportunity exists to rationalize a biased presentation of data, whether it is a Western blot, flow cytometry data, microscopy, or the reported severity of a given phenotype (4).
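To make the endpoint-flexibility point concrete, here is a minimal simulation sketch of our own construction (it is not from the article, and all parameters, such as 5 endpoints and 8 subjects per arm, are illustrative assumptions). Even when no true effect exists anywhere, a small study that measures several undeclared outcomes and reports whichever one reaches significance will find something “positive” far more often than the nominal 5% error rate suggests.

```python
import math
import random

def approx_two_sided_p(x, y):
    """Approximate two-sided p-value for a difference in means.

    Uses a normal approximation to the Welch statistic for brevity;
    a real t-test at n = 8 per arm would give slightly larger p-values.
    """
    mx, my = sum(x) / len(x), sum(y) / len(y)
    vx = sum((v - mx) ** 2 for v in x) / (len(x) - 1)
    vy = sum((v - my) ** 2 for v in y) / (len(y) - 1)
    z = (mx - my) / math.sqrt(vx / len(x) + vy / len(y))
    return math.erfc(abs(z) / math.sqrt(2))

random.seed(1)
N_STUDIES, N_PER_ARM, N_ENDPOINTS = 2000, 8, 5  # all illustrative choices

studies_with_a_positive = 0
for _ in range(N_STUDIES):
    # No true effect: both arms are drawn from the same distribution.
    p_values = []
    for _ in range(N_ENDPOINTS):
        treated = [random.gauss(0, 1) for _ in range(N_PER_ARM)]
        control = [random.gauss(0, 1) for _ in range(N_PER_ARM)]
        p_values.append(approx_two_sided_p(treated, control))
    # Undeclared endpoints: report whichever outcome "worked".
    if min(p_values) < 0.05:
        studies_with_a_positive += 1

print(f"Null studies with a reportable result: {studies_with_a_positive / N_STUDIES:.0%}")
# Expected: roughly 1 - 0.95**5, about 23% instead of 5% (a bit more here,
# since the normal approximation is liberal at this sample size).
```

Preregistering a primary endpoint removes exactly this degree of freedom: only the declared outcome counts, and the error rate returns to its nominal level.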

Yet another opportunity for a conflict-of-interest situation is created through group-based work. The more people who are part of a decision-making process and involved in a research paper or the care of a patient, the easier it is to invoke diffusion of responsibility in one’s motivated reasoning and give in to selfish motives in conflict-of-interest situations (5). Multiauthor papers are now standard in biomedical research, with papers often including work from many different labs. In such situations, some authors may not have scrutinized all of the raw data, providing a potential opportunity for an author to provide a desirable result without much accountability.

Other examples of conflicts of interest in clinical research include the nonblind selection of patients for clinical tests and protocols, or the receipt of grants, gifts, and invitations (part of the standard promotional arsenal of medical and device companies), consulting payments, and investment income from those companies. It is hard for anyone in such situations to suppress their underlying motives or feelings of reciprocity and be entirely unbiased (for a more nuanced discussion, see ref. 6). While it is true that, for many of these types of issues, the International Committee of Medical Journal Editors has created standards to which the community must adhere in order to have their work published (e.g., all clinical studies must be preregistered on an internationally recognized platform, such as clinicaltrials.gov, and authors must report any type of industry funding received), these safeguards often are not implemented at the right time: the moment when researchers face a conflict of interest and must decide which path to take. This mismatch in timing reduces their effectiveness. Furthermore, safeguards that are meant to encourage honesty and accountability are largely missing for preclinical research.

What can we do?

If we adopt the standard view that we encounter in policy and law, we would think that policing and punishment are the answers. But they are not the sole answers; otherwise, we would not have any of the problems presented here. The standard view assumes that we are rational human beings for whom the decision of whether to be dishonest is like any other decision: we weigh the benefits of being dishonest (e.g., one more top-tier publication) against its costs (i.e., the likelihood of being caught and the severity of the punishment one would face if caught). According to this view, there is no morality. The world is simple. As long as the trade-off favors the benefits, we will be dishonest. Thus, dishonesty is easy to fix: all one needs to do is amp up the costs (or reduce the benefits) and make everyone aware of them. If, however, we understand that dishonesty is part of the human condition, that it is about the system that creates conflicts of interest and about humans’ capacity for motivated reasoning, not about the people themselves, what can we do?
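For concreteness, the cost-benefit rule that this standard view implies can be written in a few lines. This is our own sketch of the textbook rational-choice model, not a model from the article, and the numbers below are made up for illustration.

```python
def rational_actor_cheats(benefit, p_caught, penalty):
    """The 'standard view' decision rule: cheat exactly when the gain
    exceeds the expected punishment (probability of capture x severity)."""
    return benefit > p_caught * penalty

# One more top-tier publication vs. a small chance of severe sanctions:
rational_actor_cheats(benefit=1.0, p_caught=0.01, penalty=50.0)  # True; cheating "pays"
```

On this account, raising p_caught or penalty is the entire toolkit. The matrix-task results above show the account is incomplete: people cheat far less than this rule predicts even when the probability of being caught is effectively zero.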

There is no one-size-fits-all solution, but if we take dishonesty to be a fundamental part of the human condition, there are several additional tools that have remained largely untapped: education, moral reminders, and system-design changes, to name but a few.

Education is a pillar of utmost importance. From the start, students need to be trained in proper conduct and taught about potential conflicts of interest, as well as the potentially perverse effects of conflict-of-interest disclosures (7, 8). Furthermore, they need to be given direction on how to deal with conflicts of interest and with situations in which they witness others engaging, knowingly or unknowingly, in misconduct. In this context, it is important to set very clear and specific, rather than general, rules of conduct, at least for the most critical situations. For example, “Doctors should not accept gifts from patients” is much less subject to motivated reasoning than “Doctors should not engage in conflicts of interest” (9). In this way, we learn early on that there are certain rules we need to obey, and over time these rules become conventions that are commonly adhered to. While not formalized, they become the fabric that builds and maintains respect, cooperation, and trust. Along similar lines, we may want to reconsider recommendations for how researchers and institutions train students and postdocs. The NIH requires basic ethics training, but it is even more important to create a standard practice of principal investigators asking their students how often they ran an experiment, insisting on always seeing the complete set of raw data and individual data points rather than data already processed into figures, and inquiring about any outliers or excluded observations.

In addition, moral reminders should be applied at the right time, that is, in the specific contexts in which we may experience conflicts of interest. Even the strongest of us experience weak moments in which the benefits to be reaped from dishonest actions are just too tempting. In these situations, moral reminders may help us refocus and keep us on the right path. For example, having to read aloud and sign an honor code just before indicating the procedure performed on a patient for health insurance claim purposes may deter some doctors from faulty reporting (2). The challenge of such reminders is to capture people’s sustained attention at the right time.

The most substantive improvements would come from system-design changes that modify how institutions reward researchers. In designing such changes, we should try to eliminate conflict-of-interest situations, once identified, as far as we can. For example, if we know that it is easier to get funding for statistically significant findings, which tempts researchers to misreport, we could change the rules for funding. In general, we need to start rewarding people for doing things the right way, even if this culminates in statistically nonsignificant results or prolonged research projects. We need a culture in which we celebrate and take pride in following the proper process and protocol, not in delivering certain outcomes. That is, in an ideal world, we would redesign institutional rewards, such as tenure, and the criteria for grants and publications (to which tenure is currently inherently linked) to consider the scientific process and conduct of researchers. Similarly, if we know that, financial conflicts aside, researchers’ desire for their treatment to succeed often clouds judgment, we may want to create a system in which scientists who develop novel treatments are not the ones testing them. As has been pointed out, since the skills required to develop novel treatments tend to differ from the skills needed to run clinical trials in humans, separating these roles should not threaten innovation (6). In addition, journals could require that not only clinical but also preclinical trials be preregistered online (10) and continue to establish higher standards of data reporting (i.e., more transparency and detail) for publication (11).

Summary

Dishonesty is part of the human condition, and we need to think realistically about how we want to deal with such a world. We can pretend that we are entirely rational human beings, thereby implicitly shutting our eyes to the deeper problems. Or we can accept our capacity for motivated reasoning and the fact that the world is not simply black and white. We can accept that we need to actively look for ways to change the environment in which our research community operates, so that we have an easier time staying on the right path rather than sliding down a slippery slope.

About the authors

Nina Mazar is an Associate Professor of Marketing at Rotman School of Management, University of Toronto; a founding member of the Behavioral Economics in Action research cluster at Rotman (BEAR); and a fellow of the Science Leadership Program in Canada. She investigates human behavior, how it deviates from standard economic assumptions, and its implications for policy.

Dan Ariely is a James B. Duke Professor of Psychology and Behavioral Economics at Duke University. His research focuses on how people make decisions and why we frequently do not act in our own best interest. Dan is also the author of the New York Times bestsellers Predictably Irrational, The Upside of Irrationality, and The Honest Truth About Dishonesty and a producer of the documentary (Dis)Honesty — The Truth About Lies.

Footnotes

Conflict of interest: Dan Ariely has received funding from Eli Lilly and Company and GlaxoSmithKline.

Reference information: J Clin Invest. 2015;125(11):3993–3996. doi:10.1172/JCI84722.

References
  1. Mazar N, Amir O, Ariely D. The dishonesty of honest people: a theory of self-concept maintenance. J Marketing Res. 2008;45(6):633–644.
  2. Shu LL, Mazar N, Gino F, Ariely D, Bazerman MH. Signing at the beginning makes ethics salient and decreases dishonest self-reports in comparison to signing at the end. Proc Natl Acad Sci U S A. 2012;109(38):15197–15200.
  3. John LK, Loewenstein G, Prelec D. Measuring the prevalence of questionable research practices with incentives for truth telling. Psychol Sci. 2012;23(5):524–532.
  4. Chamberlin TC. The method of multiple working hypotheses. Science. 1965;148(3671):754–759.
  5. Mazar N, Aggarwal A. Greasing the palm: can collectivism promote bribery? Psychol Sci. 2011;22(7):843–848.
  6. Rosenbaum L. Conflicts of interest: part 1: Reconnecting the dots — reinterpreting industry-physician relations. N Engl J Med. 2015;372(19):1860–1864.
  7. Sah S, Loewenstein G, Cain DM. The burden of disclosure: increased compliance with distrusted advice. J Pers Soc Psychol. 2013;104(2):289–304.
  8. Loewenstein G, Sah S, Cain DM. The unintended consequences of conflict of interest disclosure. JAMA. 2012;307(7):669–670.
  9. Mulder L, Jordan J, Rink F. The effects of specific and general rules on ethical decisions. Organ Behav Hum Decis Process. 2015;126:115–129.
  10. Jansen of Lorkeers SJ, Doevendans PA, Chamuleau SAJ. All preclinical trials should be registered in advance in an online registry. Eur J Clin Invest. 2014;44(9):891–892.
  11. Jackson S. The importance of being transparent. J Clin Invest. 2015;125(2):459.