Search the Internet for any research article published in 2011, and you have a 50–50 chance of downloading it for free. This claim — made in a report1 produced for the European Commission — suggests that many more research papers are openly available online than was previously thought. The finding, released on 21 August, is heartening news for advocates of open access. But some experts are raising their eyebrows at the high numbers.
There has been a steady move over the past few years towards getting research papers that are funded by government money into the public domain, and the best estimates2, 3 for the proportion of papers free online run at around 30%. But these are underestimates, argues Éric Archambault, the founder and president of Science-Metrix, a consultancy in Montreal, Canada, that conducted the analysis for the European Commission.
The firm initially asked a team led by Stevan Harnad, an open-access campaigner and cognitive scientist at the University of Quebec in Montreal, to check a random sample of 20,000 papers published in 2008 (from the Scopus database of papers run by Elsevier). It used a program designed by Yassine Gargouri, a computer scientist at the same university, to find free articles. The team found that 32% of the papers that it downloaded in December 2012 were freely available. But when Archambault’s group checked 500 of these papers manually using Google and other search engines and repositories, the figure rose to 48%.
On the basis of this initial test, Science-Metrix applied its own automated software, or ‘harvester’, to 320,000 papers downloaded from 2004 to 2011; the tool searches publishers’ websites, institutional archives, repositories such as arXiv and PubMed Central, and sites such as the academic networking site ResearchGate and the search engine CiteSeerX.
It found that an average of 43% of articles published during 2008–11 were available online for free, with the results varying by country and discipline (see ‘Freedom online’). But the true figure is probably higher, because the harvester does not pick up every free paper. When this incompleteness is adjusted for, the proportion of free articles from 2011 rises to about 50%, says Archambault.
The report “confirms my optimism”, says Peter Suber, director of the Office for Scholarly Communication at Harvard University in Cambridge, Massachusetts, and a proponent of open access to research. He thinks that it reflects the experiences of working scientists today. “When researchers hit a paywall online, they turn to Google to search for free copies — and, increasingly, they are finding them,” he says.
The rise of open-access journals is part of the explanation: the share of papers published in these journals rose from 4% in 2004 to 12% by 2011, the report found — agreeing with figures published last year by Bo-Christer Björk, who studies information systems at the Hanken School of Economics in Helsinki.
But the number of peer-reviewed manuscripts made free by other means has also increased, the report says. That includes those eventually made free — often a year after publication, and sometimes on a temporary promotional basis — by publishers that charge for subscription. But it also includes manuscripts that researchers themselves archive online on repositories and personal websites. Some of the articles, although free to read, may not meet formal definitions of open access because, for example, they do not include details on whether readers can freely reuse the material.
The report does not try to distinguish between types of manuscript, nor where and how they were posted, says Archambault. “The situation is so complex that it’s very hard to measure.”
Björk says that the latest measurements seem to have been carefully done, although he adds that because he does not have details of the robotic harvester’s code, he cannot evaluate its method. “Experts on the subject would probably agree that the open-access share of papers, measured around a year and a half after publication, is currently at least 30%,” he says. “Anything above that is dependent on ways of measuring, with this new study representing the highest estimate.”
The report, which was not peer reviewed, calls the 50% figure for 2011 a “tipping point”, a rhetorical flourish that Suber is not sure is justified. “The real tipping point is not a number, but whether scientists make open access a habit,” he says.
Harnad thinks that the next step should be to obtain more accurate measures of when papers become free. “It’s hardly a triumph if articles are only accessible after a one-year embargo,” he says. Greater measurement accuracy is tricky to achieve, he adds, because Google routinely blocks all robotic harvesters. He believes that research on the growth of open access should be given special concessions.
The proportion of free online papers is likely to increase in the next few years. The European Commission says that, from 2014, the results of all research funded by the European Union must be open access. And in February, the US White House announced that government-funded research should be made free to read within 12 months of publication (see Nature 494, 414–415; 2013). Federal agencies are due to submit their plans for achieving this to the US Office of Science and Technology Policy by 22 August.
DISGUISED as employees of a gas company, a team of policemen burst into a flat in Beijing on September 1st. Two suspects inside panicked and tossed a plastic bag full of money out of a 15th-floor window. Red hundred-yuan notes worth as much as $50,000 fluttered to the pavement below.
Money raining down on pedestrians was not as bizarre, however, as the racket behind it. China is known for its pirated DVDs and fake designer gear, but these criminals were producing something more intellectual: fake scholarly articles which they sold to academics, and counterfeit versions of existing medical journals in which they sold publication slots.
As China tries to take its seat at the top table of global academia, the criminal underworld has seized on a feature in its research system: the fact that research grants and promotions are awarded on the basis of the number of articles published, not on the quality of the original research. This has fostered an industry of plagiarism, invented research and fake journals that Wuhan University estimated in 2009 was worth $150m, a fivefold increase on just two years earlier.
Chinese scientists are still rewarded for doing good research, and the number of high-quality researchers is increasing. Scientists all round the world also commit fraud. But the Chinese evaluation system is particularly susceptible to it.
By volume the output of Chinese science is impressive. Mainland Chinese researchers have published a steadily increasing share of scientific papers in journals included in the prestigious Science Citation Index (SCI—maintained by Thomson Reuters, a publisher). The number grew from a negligible share in 2001 to 9.5% in 2011, second in the world to America, according to a report published by the Institute of Scientific and Technical Information of China. From 2002 to 2012, more than 1m Chinese papers were published in SCI journals; they ranked sixth for the number of times cited by others. Nature, a science journal, reported that in 2012 the number of papers from China in the journal’s 18 affiliated research publications rose by 35% from 2011. The journal said this “adds to the growing body of evidence that China is fast becoming a global leader in scientific publishing and scientific research”.
In 2010, however, Nature had also noted rising concerns about fraud in Chinese research, reporting that in one Chinese government survey, a third of more than 6,000 scientific researchers at six leading institutions admitted to plagiarism, falsification or fabrication. The details of the survey have not been publicly released, making it difficult to compare the results fairly with Western surveys, which have also found that one-third of scientists admit to dishonesty under the broadest definition, but that a far smaller percentage (2% on average) admit to having fabricated or falsified research results.
In 2012 Proceedings of the National Academy of Sciences, an American journal, published a study of retractions broken down by nation of origin. In it a team of authors wrote that in medical journal articles in PubMed, an American database maintained by the National Institutes of Health, there were more retractions due to plagiarism from China and India together than from America (which produced the most papers by far, and so the most cheating overall). The study also found that papers from China led the world in retractions due to duplication—the same papers being published in multiple journals. On retractions due to fraud, China ranked fourth, behind America, Germany and Japan.
“Stupid Chinese Idea”
Chinese scientists have urged their comrades to live up to the nation’s great history. “Academic corruption is gradually eroding the marvellous and well-established culture that our ancestors left for us 5,000 years ago,” wrote Lin Songqing of the Chinese Academy of Sciences, in an article this year in Learned Publishing, a British-based journal.
In the 1980s, when China was only beginning to reinvest in science, amassing publishing credits seemed a good way to use non-political criteria for evaluating researchers. But today the statistics-driven standards for promotion (even when they are not handed out merely on the basis of personal connections) are as problematic as in the rest of the bureaucracy. Xiong Bingqi of the 21st Century Education Research Institute calls it the “GDPism of education”. Local government officials stand out with good statistics, says Mr Xiong. “It is the same with universities.”
The most valuable statistic a scientist can tally up is SCI journal credits, especially in journals with higher “impact factors”—ones that are cited more frequently in other scholars’ papers. SCI credits and impact factors are used to judge candidates for doctorates, promotions, research grants and pay bonuses. Some ambitious professors amass SCI credits at an astounding pace. Mr Lin writes that a professor at Ningbo university, in south-east China, published 82 such papers in a three-year span. A hint of the relative weakness of these papers is found in the fact that China ranks just 14th in average citations per SCI paper, suggesting that many Chinese papers are rarely quoted by other scholars.
The quality of research is not always an issue for those evaluating promotions and grants. Some administrators are unqualified to evaluate research, Chinese scientists say, either because they are bureaucrats or because they were promoted using the same criteria themselves. In addition, the administrators’ institutions are evaluated on their publication rankings, so university presidents and department heads place a priority on publishing, especially for SCI credits. This dynamic has led some in science circles to joke that SCI stands for “Stupid Chinese Idea”.
The warped incentive system has created some big embarrassments. In 2009 Acta Crystallographica Section E, a British journal on crystallography, was forced to retract 70 papers co-authored by two researchers at Jinggangshan university in southern China, because they had fabricated evidence described in the papers. After the retractions the Lancet, a British journal, published a broadside urging China to take more action to prevent fraud. But many cases are covered up when detected to protect the institutions involved.
The pirated medical-journal racket broken up in Beijing shows that there is a well-developed market for publication beyond the authentic SCI journals. The cost of placing an article in one of the counterfeit journals was up to $650, police said. Purchasing a fake article cost up to $250. Police said the racket had earned several million yuan ($500,000 or more) since 2009. Customers were typically medical researchers angling for promotion.
Some government officials want to buy their way to academic stardom as well: at his trial this month for corruption, Zhang Shuguang, a former railway-ministry official, admitted to having spent nearly half of $7.8m in bribes that he had collected trying to get himself elected to the Chinese Academy of Sciences. Chinese reports speculated that he spent the money buying votes and hiring teams of writers to produce books. Widely considered to be a man of limited academic achievement, Mr Zhang ultimately fell just one vote short of election. Less than two years later, he was in custody.