Study highlights how small analytical decisions can shape outcomes and why transparency matters in modern research.
An East Texas A&M University alumnus contributed to a large-scale international research study published in Nature, one of the world's leading scientific journals, examining how different analytical approaches can lead to varying scientific results.
Dr. Kaleb Mathieu, who earned his Ph.D. in Experimental Psychology from East Texas A&M in May 2025, was one of nearly 500 researchers involved in the study. Notably, Mathieu began working on the project during the early years of his doctoral program at ETAMU, applying the research and statistical training he developed at the university. He now works as a learning engineer at Carnegie Learning, where he continues his research.
The study, led by Balázs Aczél and Barnabás Szászi, explored the “analytical robustness” of scientific findings—essentially asking whether different researchers analyzing the same data would reach the same conclusions.

A global effort to test scientific findings
The project was part of the SCORE (Systematizing Confidence in Open Research and Evidence) initiative through the Center for Open Science. Researchers examined 100 previously published studies across psychology, economics and political science.
Each study was assigned to multiple independent analysts, who were given the same dataset and research question but no specific instructions on how to conduct their analysis.
Mathieu served as a “reanalyst,” meaning he independently examined data from one of the selected studies and reported his findings.
“We were given a claim from a research paper and the original data and told to analyze it however we thought was appropriate,” Mathieu said. “That was the whole point—there are many valid ways to analyze the same data.”
Across the project, at least five analysts reexamined each study, allowing researchers to compare how different analytical choices affected outcomes.
When small decisions make a big difference
One of the key findings of the study was that while most researchers arrived at similar overall conclusions, the details often varied.
“Most of the studies were qualitatively the same,” Mathieu said. “But there were a lot of differences in the specifics—things like the strength of the results or the exact statistical values.”
These differences stem from what researchers call “analytic variability”—the many small decisions made throughout the research process.
From how data is cleaned to which statistical model is used, each step can influence the final result.
“All those small decisions can add up over time,” Mathieu said. “Even something as simple as how you handle missing data or what type of analysis you choose can change the outcome.”
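The effect Mathieu describes can be illustrated with a small sketch. The numbers below are invented for demonstration and have nothing to do with the study's data; the point is only that three defensible ways of handling missing values can yield different summary estimates from the same dataset.

```python
from statistics import mean

# Hypothetical toy dataset with missing observations (None).
# Values are illustrative only, not from any real study.
scores = [4.0, 5.0, 6.0, None, 7.0, None, 3.0]

# Choice A: drop missing values before averaging.
observed = [x for x in scores if x is not None]
estimate_a = mean(observed)

# Choice B: impute missing values with the mean of the observed data.
imputed = [x if x is not None else mean(observed) for x in scores]
estimate_b = mean(imputed)

# Choice C: treat missing values as zero (a stricter assumption).
zero_filled = [x if x is not None else 0.0 for x in scores]
estimate_c = mean(zero_filled)

print(estimate_a, estimate_b, estimate_c)
```

Here choices A and B happen to agree (mean imputation preserves the mean), while choice C pulls the estimate down noticeably. In a real analysis, each of these choices could be defended, yet they lead to different reported results.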
Perhaps most striking, a small percentage of reanalyses—about 2%—produced conclusions opposite to those of the original study.
“That was the biggest surprise,” Mathieu said. “You wouldn't expect a different analysis to lead to the opposite finding, but it shows how much those decisions can matter.”
Not a crisis of confidence, but a call for awareness
Despite these differences, Mathieu emphasized that the findings should not be interpreted as a reason to distrust scientific research.
“It doesn't discredit prior research,” he said. “Instead, it shows that a single analysis doesn't capture all the uncertainty that exists in the research process.”
In most cases, the general takeaway of a study remained consistent, even when the details varied. However, the research highlights the importance of acknowledging the hidden layers of uncertainty behind published findings.
“There's a lot of variability that isn't always considered,” Mathieu said. “We need to be more aware of that when interpreting results.”
The study also connects to broader conversations in the scientific community, including the “replication crisis,” in which some well-known findings could not be reproduced in later studies.
As a result, many journals and researchers are adopting new practices such as preregistration—publicly outlining research methods before conducting a study—and conducting robustness checks to test results under different conditions.
Preparing students for real-world research
Mathieu's involvement in the study reflects the kind of hands-on research experience emphasized within East Texas A&M's Department of Psychology and Special Education.
The department offers a range of programs, including the Experimental Psychology Ph.D., which focuses heavily on research design, statistical analysis and critical thinking. Students are trained not only to conduct research, but also to evaluate it from multiple perspectives.
Mathieu credits his time at East Texas A&M with helping prepare him for the project.
“The program gave me the foundation in research methods and statistics,” he said. “But it also encouraged me to think beyond just one way of doing things.”
That mindset proved essential in a study designed to explore how different approaches can lead to different outcomes.
From the classroom to global collaboration
The scale of the project—nearly 500 researchers working across disciplines and institutions—presented unique challenges, but Mathieu said the collaboration was well organized.
“It was run very well,” he said. “Everything was clearly documented, and there were systems in place to make sure all the analyses were reviewed and checked.”
Participants submitted their analyses and findings, which were then compared and evaluated collectively. The result was one of the most comprehensive looks to date at how analytical decisions influence scientific conclusions.
In his work as a learning engineer at Carnegie Learning, Mathieu's experience on the project continues to shape how he approaches data and analysis.
Advice for future researchers
For students interested in psychology or research, Mathieu offered a key piece of advice: stay open-minded.
“Don't feel like you're locked into one way of doing things,” he said. “There are multiple valid approaches, and being open to learning new methods is really important.”
He also encouraged students not to be discouraged by the complexity revealed in studies like this one.
“The point isn't that research is unreliable,” he said. “It's that it's more nuanced than it might seem at first.”
A broader impact
The Nature study underscores an important lesson for both researchers and the public: scientific findings are not just about data; they are also shaped by the choices researchers make along the way.
For East Texas A&M students, Mathieu's contribution serves as an example of how classroom learning can lead to meaningful participation in global research efforts.
As scientific methods continue to evolve, studies like this one aim to improve transparency and strengthen confidence in research—not by simplifying it, but by embracing its complexity.


