Medical knowledge changes rapidly. This makes us all the more vulnerable to published results that do not meet the required standards
"If I have seen further, it is by standing on the shoulders of giants," wrote Sir Isaac Newton to his colleague Robert Hooke in 1676. He was not the first to use this metaphor about the development of new knowledge, but he is perhaps the most quoted (1). Newton's point was, of course, that new insight does not evolve in a vacuum; it is always built upon existing knowledge. Researchers are individuals, but research is collective. The acknowledgement that one's own contribution is always built upon the work of others is the background for the tradition of citing previous work when scientific results are published. Citation shows which sources the work is based on and acknowledges the contribution of other researchers. To be cited is to be taken seriously. Through citations, one's own article continues to live and to influence the development of knowledge.
Being cited is therefore regarded as a good, though far from perfect, indicator of the importance, interest and relevance of one's contribution. One who is often cited becomes influential. This is the background for a journal's impact factor, which Thomson Scientific has developed into an industry: a meticulously constructed formula calculates how often an average article is cited in a certain set of (English-language) journals (2).
For the same reason, Google Scholar (Google's search engine for scientific publications) shows where an article is cited, and by whom. This is meant to give important additional information about the article's impact (3). Google Scholar today uses "Stand on the shoulders of giants" as the slogan for its search engine.
A high impact factor also confers prestige on the journals that publish the articles, because it means that the new knowledge they publish is carried further by others. This has a self-reinforcing effect: the best and most ambitious researchers wish to publish in journals with a high impact factor to increase their chances of being read and noticed, and the journals wish to attract the most exciting and innovative results, because these will be used and cited by other researchers. The journals can thus maintain, and perhaps increase, their impact factor. Researchers compete to be published in the best journals, and the best journals compete to publish the most interesting research first. At its best, this exchange improves the quality of publications. But the medal has a dark reverse side, which shows itself when published information does not meet the required standards or is simply wrong.
The stem cell research of the South Korean Woo Suk Hwang, for example, was met with great expectations. Within about a year he published two apparently groundbreaking articles in the journal Science (4, 5). The first showed that it was possible to produce patient-specific stem cells from oocytes, i.e. cells that continue to divide in the laboratory and are therefore capable of creating new tissue. In the second article he reported that he had established stem cell lines from people with 11 different serious diseases. This was what many had hoped for: that it would be possible to produce cells that could be treated in the laboratory and used as replacements for sick cells in seriously ill people. It turned out that Hwang had fabricated the data, and Science has retracted both articles as well as the editorials written to accompany his research (6).
This matter has much wider consequences than those for the single researcher and his co-authors. It is said that medical opinions change every ten years, perhaps more often. We use this as confirmation that medicine develops quickly, in a good way: new knowledge is created, captured and disseminated rapidly. But this means that the publication of wrong information has large consequences. Other researchers' work may be based on false claims or theories, or on pure fabrication. This in turn has consequences for doctors, and for patients who are given false hopes. According to Google Scholar, Hwang's article in Science from 2004 has been cited 181 times, and about half of these citing articles have themselves already been cited further, 1-20 times each (3). Through the multiplying effect of citations, not only research but also mistakes are spread quickly and widely. Information from Hwang's articles is already deeply woven into the knowledge base of stem cell research.
Our Journal has also discussed Hwang's research, in the column Medical News (7). Now that his articles have been retracted, the summary we gave and our comments on the importance of the findings are no longer valid. Jens Bjørheim comments, in a web version of this issue of the Journal, that stem cell research may be set back, perhaps by many years, because not only Hwang's research but also research based on his results must be reassessed (8).
In addition to leading researchers down the wrong track, research fraud steals resources and time from other research and reduces the public's faith in medical research and medical practice. It does not help to stand on the shoulders of giants if the giants are standing in mud. When they fall, they bring many others down with them.