Scientific research from rich countries is much more widely cited than comparable work from poorer nations, and the bias has only deepened over the last 35 years, according to a global analysis of nearly 20 million papers spanning 150 fields.
While the study, recently published in Nature Human Behaviour, is hardly the first to identify unfairness in citation practices, it is among the largest geographic analyses to do so across multiple fields. “On the surface, science seems more international than ever before. But unfortunately, despite that … the attention and recognition in science is skewed,” says lead author Charles Gomez, a sociologist at Queens College, City University of New York. “It’s worrying that despite more work done than ever before in human history, certain voices are being excluded.”
To detect such bias, the authors turned to social network analysis. First, they created two networks from millions of papers stored in the Microsoft Academic Graph metadata repository. One network organized keywords, extracted from abstracts, by country using a text analysis algorithm. The second tracked each country's cumulative citation count in each field by year.
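In outline, this reduces to two aggregations over paper records. The sketch below is a minimal illustration of that idea, not the authors' actual pipeline; the record fields (country, field, year, abstract_keywords, citation_count) are hypothetical stand-ins for Microsoft Academic Graph metadata.

```python
from collections import defaultdict

def build_networks(papers):
    """Aggregate paper records into (1) keyword profiles per country and
    field and (2) citation totals per country, field, and year."""
    keyword_profiles = defaultdict(lambda: defaultdict(int))
    citations = defaultdict(int)
    for p in papers:
        # Network 1: which topics each country publishes on in each field.
        for kw in p["abstract_keywords"]:
            keyword_profiles[(p["country"], p["field"])][kw] += 1
        # Network 2: citation totals per country, field, and year; a true
        # cumulative series would be a running sum of these over years.
        citations[(p["country"], p["field"], p["year"])] += p["citation_count"]
    return keyword_profiles, citations
```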
In principle, countries producing similar research should have similar citation numbers. But a comparison of the two networks revealed widespread bias: many developed and developing countries studied similar things but had wide discrepancies in their citation counts. Take the field of aerospace engineering, for example, Gomez says. Hypothetically, researchers in the United States, Canada, and the Philippines might all publish similar papers on computer-aided flight, and all three countries should then have similar citation numbers. But in practice, only work from the US and Canada would be highly cited.
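One plausible way to make that comparison concrete, assuming the keyword profiles and citation tallies from the sketch above, is to pair a topical-similarity score with a citation ratio. The function names here are illustrative, not taken from the paper.

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two sparse keyword-count dicts."""
    dot = sum(v * b.get(k, 0) for k, v in a.items())
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def recognition_gap(profiles, citations, field, year, c1, c2):
    """High topical similarity paired with a lopsided citation ratio
    flags a possible recognition gap between countries c1 and c2."""
    similarity = cosine(profiles[(c1, field)], profiles[(c2, field)])
    # Add-one smoothing keeps the ratio defined when a count is zero.
    ratio = (citations[(c1, field, year)] + 1) / (citations[(c2, field, year)] + 1)
    return similarity, ratio
```

In Gomez's aerospace hypothetical, a US-Philippines pair would show high topical similarity but a heavily lopsided citation ratio, which is the signature of under-recognition this kind of comparison is meant to surface.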
One possible retort to such results: Perhaps work from poorer countries is simply lower quality, and therefore should be cited less. To address this possibility, Gomez and his collaborators limited their networks to a set of journals that have been publishing continuously since the early 1980s. Ostensibly, papers published in the same journals over the same few decades shouldn't differ widely in quality.
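As a rough illustration of that control, assuming a mapping from journal names to the years in which each journal published (a hypothetical structure, not the study's actual data), the filter might look like this:

```python
def continuously_published(journal_years, start=1980, end=2015):
    """Keep journals active in every year of the window, a rough proxy
    for venues whose quality bar has stayed comparable over time.
    journal_years: dict mapping journal name -> set of publication years.
    The window endpoints are illustrative; the study's exact cutoffs
    may differ."""
    window = set(range(start, end + 1))
    return {name for name, years in journal_years.items() if window <= years}
```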
A comparison of the two social networks in the study (left) showed that some countries are over-recognized (purple, right) while others are under-recognized (yellow/orange, right). Image credit: Charles Gomez.
These findings are strongly suggestive, but not definitive, says sociologist Misha Teplitskiy at the University of Michigan School of Information in Ann Arbor. While he considers the study a “nice contribution” to the field, in particular because of its large scale and its methodology, he notes that limiting the set of journals doesn't prove that all the papers are of similar quality. To do so, he says, would require experiments or other compelling quality measures—for example, changing a researcher's affiliation from one country to another on the same paper and then comparing citation numbers. The Microsoft Academic Graph database is also primarily composed of English-language literature, he says, which could have a confounding effect on which countries get cited. Moving forward, Teplitskiy would like to see both smaller-scale experiments that allow cleaner quality control and big-picture analyses of the entire literature. “I see this as a brick in a fairly large set of bricks in a wall, trying to establish biases in recognition,” he notes.
Earlier research has attempted to identify citation bias, but taking into account the actual content of papers through text-based analysis is new and promising, says sociologist of science Mathias Wullum Nielsen at the University of Copenhagen in Denmark. Combining citation networks with text analysis is a fresh approach in the field sometimes called the “science of science,” he says. And while text-based analyses are limited at this stage, he adds, they may eventually evolve to reveal more detailed differences between papers.
It’s still not clear exactly why citations are biased internationally. But Gomez already sees some possible ways to remedy the bias. One of the primary salves might simply be encouraging scientists to be more mindful when compiling their reference sections in new studies. Journal editors, too, could be more mindful when they seek out, or perhaps require, the same classic citations again and again. “Despite science being much more international and global by many measures,” he says, “science is still far from equitable.”
Other recent papers recommended by Journal Club panelists:
Temporary nature-based carbon removal can lower peak warming in a well-below 2 °C scenario
Ant phylogenomics reveals a natural selection hotspot preceding the origin of complex eusociality