In 2015, molecular oncologist Jennifer Byrne was surprised to discover, during a scan of the academic literature, that five papers had been written about a gene she had originally identified but never found particularly interesting.
“Looking at these papers, I thought they were really similar, they had some mistakes in them and they had some stuff that didn’t make sense at all,” she said. As she dug deeper, it dawned on her that the papers might have been produced by a third party working for profit.
“Part of me still feels awful thinking about it because it’s such an unpleasant thing when you’ve spent years in a laboratory and taking two to 10 years to publish stuff, and making stuff up is so easy,” Professor Byrne said. “That’s what scares the life out of me.”
The more she investigated, the clearer it became that a cottage industry in academic fraud was infecting the literature. In 2017, she uncovered 48 similarly suspicious papers and brought them to the attention of the journals, resulting in several retractions, but she said the response from the publishing industry was varied.
“A lot of journals don’t really want to know,” she said. “They don’t really want to go and rifle through hundreds of papers in their archives that are generated by paper mills.”
More recently, she and a French collaborator developed a software tool that identified 712 papers, out of more than 11,700 screened, containing wrongly identified sequences that suggest they were produced by a paper mill. Her research is due to be published in Life Science Alliance.
Even if the research was published in low-impact journals, it still had the potential to derail legitimate cancer research, and anybody who tried to build on it would be wasting time and grant money, she said. She has also suggested that journals could flag errors while articles were under investigation, so people did not continue to rely on their findings during that time.
Publishers and researchers have reported an extraordinary proliferation in junk science over the last decade, which has infiltrated even the most esteemed journals. Many papers bear the hallmarks of paper mill production: submission by authors at Chinese hospitals, and similar templates or structures. Paper mills operate several models, including selling data (which may be fake), supplying entire manuscripts or selling authorship slots on manuscripts that have already been accepted for publication.
The Sydney Morning Herald has learned of suicides among graduate students in China who heard that their research might be questioned by authorities. Many universities have made publication a condition of students earning their master’s degrees or doctorates, and it is an open secret that the students fudge the data. The universities reap money from the research grants they earn. The teachers get their names on the papers as contributing authors, which helps them to seek promotions.
International biotechnology consultant Glenn Begley, who has been campaigning for more meaningful links between academia and industry, said research fraud was a story of perverse incentives. He wants researchers to be banned from producing more than two or three papers per year, to ensure the focus remains on quality rather than quantity.
“The real incentive is for researchers to get their papers published and it doesn’t have to be right so long as it’s published,” Dr Begley said. He recently told the vice-chancellor of a leading Australian university of his frustration with the narrative that Australia was “punching above its weight” in terms of research outcomes. “It’s outrageous,” Dr Begley told the vice-chancellor. “It’s not true.”
“Yes,” the vice-chancellor replied. “I use that phrase with politicians all the time. They love it.”
According to one publishing industry insider, editors are operating with an element of wishful thinking. This major publishing house employee, whose contract prevented him from speaking publicly, said when his journal started receiving a torrent of submissions from Chinese researchers around 2014, the staff assumed that their efforts to tap into the Chinese market had borne fruit. They later realised that many of the papers were fraudulent and acted, but he was aware of other editors who turned a blind eye.
“Obviously there’s so much money in China and the journals have their shareholders to answer to, and they are very careful not to tread on Chinese toes because of the political sensitivity,” he said. “There’s a lot more they could do to sort the good from the bad because there is good science going on in China, but it’s all getting a bad name because of what some Chinese people have worked out — that there’s a market here for a business.”
Last month, SAGE retracted 212 articles from its journals that showed clear evidence of peer review or submission manipulation, and issued expressions of concern for a further 318 papers. The Royal Society of Chemistry announced last year that 68 papers had been retracted from its journal RSC Advances because of “systematic production of falsified research”.
In a sign of the upswing in cases, German clinical researchers reported last week that, in their analysis of osteosarcoma papers, just five were retracted before the millennium and 95 thereafter, 83 of them from a single, unnamed country in Asia. University of Münster Professor Stefan Bielack, who published the study in Cancer Horizons, said some open access journals charged academics US$1500 to US$2000 to publish their work, so they were more interested in publishing lots of papers than in their scientific validity.
“There is a systematic problem and in some countries people might have the wrong incentives,” Professor Bielack said. “I think the journals have a major role. They all need to be more rigorous.”
The problem is not confined to China, but it has accompanied a dramatic growth in research output from that country, with the number of papers more than tripling over the last decade.
In 2017, responding to a fake peer review scandal that resulted in the retraction of 107 papers from a Springer Nature journal, the Chinese government cracked down and created penalties for research fraud. Universities stopped making research output a condition of graduation or the number of articles a condition of promotion.
But those familiar with the industry say the publication culture has prevailed because universities still compete for research funding and rankings, and China’s output has grown especially sharply over the past two years. The Chinese government’s investigation of the 107 papers found only 11 per cent were produced by paper mills, with the remainder produced within universities.
Until last year, the University of NSW offered its academics a $500 bonus if they were the lead author of a paper in a prestige publication and $10,000 if they were the corresponding author of a paper published in Nature or Science. The system, which was designed to reward quality over quantity, was discontinued due to financial constraints.
But others have questioned whether the quality of a paper can be measured by the journal in which it is published, and an open access movement has sprung up in opposition to the scientific publishing industry, arguing that research paid for by taxpayers should be freely available to all.
Alecia Carter, an Australian biological anthropologist at University College London, said the emphasis on getting published in a high-impact journal rewarded sensational results over integrity, positive results over negative results and novel findings over building the evidence base. Researchers might inflate effect sizes or omit conflicting evidence because it muddied the overall story they were trying to tell.
“We as scientists know all these things that are wrong with the way the system is set up, but we still play the game,” Dr Carter said. “We’re all chasing the same thing.”
Dr Carter boycotts luxury journals, publishes as much as possible in open access journals and reports negative results, though this has come at a cost to her career. She was once asked at a job interview why she would bother reporting results that were not interesting.
“I said, ‘If it’s interesting enough to do the research then we should publish the results’.”
She did not get the job.