More transparency in reviewing is called for
EDITOR-The recent calls for boycotting Israeli science assume that Israeli authors are as likely to publish as others.1 In fact, nations are differentially represented in scientific journals.
The total number of publications from a given country might be expected to correlate positively with the number from that country in any given journal. Data on publication trends during 1973-2002 show significantly positive correlations for 34 of 40 countries for the Proceedings of the National Academy of Sciences and Science. The six exceptions (15%) can be explained by cold war politics, small samples, and similar factors. Publication trends in Nature were similar for most countries, implying similar professional criteria in all three journals (figure).
However, during 1973-2002, 19 countries (47.5%), including Israel, were underrepresented in Nature compared with Science and the Proceedings of the National Academy of Sciences.2 Editorial bias is an unlikely cause: it would not remain consistent over 30 years. This implies that hidden prejudices affect the publication chances of scientists from some countries in some journals. The recent boycotts therefore appear all the more to be attempts to justify old biases and bring them into the open.
Information on rejection rates could prove or disprove these suspicions. Such rates are available online for the BMJ (bmj.com/advice/ms_breakdown.shtml), whose publication trends resemble those of the Proceedings of the National Academy of Sciences. The BMJ's country based rejection rates correlate consistently between years and with the number of medical doctors per head, but not with other socioeconomic indices (gross domestic product per head, longevity, urbanisation, population density). This indicates that the BMJ's country associated rejection rates echo scientific quality, which in turn reflects differences between medical education systems.
The potential conclusions are not optimistic: country associated prejudices in some journals suggest that other prejudices (institutional, etc) probably affect professional decisions more than is believed, and some now openly admit to using prejudices in professional decisions. Editorial boards have a responsibility to detect and fight such biases. Although non-scientific criteria might be acceptable publication criteria, they must be objective, published, and applied across the board.
Hervé Seligmann, research associate
Center for Computational Biology and Visualization, Department of Biological Sciences, Louisiana State University, Baton Rouge, LA 70803, USA email@example.com
Competing interests: HS speaks Hebrew at home.
British Medical Journal, rapid response.
Organizing publicly one of many decennial silent boycotts: how old is the iceberg? A call for some transparency in reviewing processes
Department of Biological Sciences, Louisiana State University, Baton Rouge, LA 70803
Running title: Hidden academic boycotts
I analyzed the relationships between the number of publications with at least one author from each of 40 countries in three journals (Science, PNAS, and Nature) and the total yearly number of publications for that country from 1973-2002. Publications in Science and PNAS increase proportionally with a country's total yearly number of publications, apart from six (15%) explainable exceptions (cold war: Czechs, Hungary, Russia; apartheid boycott: South Africa; small sample effects: Kenya, Pakistan). For Nature, this was not the case for 19 countries (47.5%), including Israel, a country for which an academic boycott was proposed in 2002. Publication patterns in the BMJ resembled those in PNAS.
Abbreviations: PNAS, Proceedings of the National Academy of Sciences of the United States of America; BMJ, British Medical Journal; n.s., not significant.
Recent publications in Nature (1-6) and other scientific journals (e.g. the BMJ (7)) debate organizing a boycott of Israeli academics, a boycott that an editorial in Science firmly condemned (8). Figure 1 plots the number of publications in Nature from 1973-2002 with at least one author's address in Israel and, as controls, those for Denmark and Switzerland (selected for their similar population sizes). Figure 2 displays the corresponding patterns for Science. The total yearly number of publications from these countries tripled from 1973 to 2002 (Table 1, columns 2-3). Under the null hypothesis, one expects a positive correlation between the total yearly number of publications and the number in any given journal. This is the case for all three countries in Science, but for one country in Nature, Israel, it is not. Table 1 displays the correlation coefficients between yearly total publications and publications in the three journals (Science, PNAS, and Nature) for 40 countries. The correlation coefficients are positive and statistically significant at p < 0.05 (one tailed test) for Science and PNAS for most countries, with a few expected exceptions (cold war: Czechs, Hungary, Russia; apartheid boycott: South Africa; small sample effects: Kenya, Pakistan). Publication patterns in Nature for 19 of the 40 countries (bold emphasis in Table 1) differ from this trivial expectation. The causes arguably vary among countries, but Nature systematically differs from Science and PNAS. For Science, the patterns confirm the recent editorial statement (8). For Nature, editorial bias cannot explain multi-decennial patterns. In the absence of direct information on submission, pre-review, and post-review rejection rates, these analyses remain suggestive and open to interpretation.
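The per-country test described above (a Pearson correlation between a country's yearly publication totals and its counts in one journal, judged significant at p < 0.05, one tailed) can be sketched as follows. The series are invented for illustration, and the critical t value is for the illustrative 10-year sample, not for the 30-year series underlying Table 1.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def t_statistic(r, n):
    """t statistic for testing r > 0, with n - 2 degrees of freedom."""
    return r * math.sqrt(n - 2) / math.sqrt(1 - r ** 2)

# Hypothetical 10-year series: a country's total publications and its
# publications in one journal.
totals  = [1200, 1350, 1400, 1600, 1750, 1900, 2100, 2300, 2500, 2700]
journal = [12, 14, 13, 17, 18, 20, 22, 25, 24, 28]

r = pearson_r(totals, journal)
t = t_statistic(r, len(totals))
# Critical t for df = 8 at p = 0.05 (one tailed) is about 1.86.
print(round(r, 3), round(t, 2), t > 1.86)
```

A journal whose counts track the totals proportionally yields a strongly positive r; the underrepresentation flagged for Nature corresponds to country series failing this test.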
The analysis of papers submitted to the BMJ (available online at http://bmj.com/advice/ms_breakdown.shtml) breaks down submission and acceptance rates for 1998-2000 for several of the countries included in Table 1. The correlation coefficients between a country's total number of publications per year and the number published in the BMJ during these three years correlate significantly with the corresponding coefficients for PNAS (r = 0.43, p < 0.025, one tailed test), more weakly with those for Science (r = 0.28, n.s.), and only very weakly with those for Nature (r = 0.12, n.s.). The BMJ and PNAS thus have similar publication patterns; this holds to a lesser extent for Science, while Nature's patterns differ.
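This journal-to-journal comparison is a correlation of correlations: each journal yields one coefficient per country, and the two coefficient vectors are themselves correlated across countries. A minimal sketch, with invented coefficient vectors (not taken from Table 1):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

# Hypothetical per-country coefficients (totals vs. journal counts) for
# two journals, over the same eight countries.
r_bmj  = [0.81, 0.65, 0.90, 0.40, 0.72, 0.55, 0.88, 0.30]
r_pnas = [0.78, 0.60, 0.85, 0.52, 0.70, 0.58, 0.91, 0.35]

# A high value means the two journals favor and disfavor the same countries.
print(round(pearson_r(r_bmj, r_pnas), 2))
```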
Regression analysis of the yearly BMJ data predicts, for each country, the number of accepted manuscripts (the dependent variable) from the number of manuscripts submitted to the BMJ (the independent variable). The numbers between zero and three following each country's name in the first column of Table 1 indicate the number of years in which more manuscripts were accepted than predicted. Overall, discrepancies from prediction were similar across the three years. In total, eight countries (Finland, France, Germany, India, Italy, Japan, Netherlands, Sweden) have a lower than predicted number of accepted manuscripts in all three years, 13 countries are systematically above the prediction (Canada, China, Hong Kong, Hungary, New Zealand, Pakistan, Saudi Arabia, Singapore, South Africa, Switzerland, UK, US, former Yugoslavia), and seven countries vary (above prediction in two years: Belgium, Denmark, Ireland, Norway; above prediction in one year: Australia, Israel, Spain). These seven countries lie closer to the regression line than the others, suggesting that their qualitative variation between years is mainly random.
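The per-year classification above amounts to fitting a least-squares line of acceptances on submissions and noting which countries fall above it. A sketch for a single year, with hypothetical countries "A" to "D" and invented counts:

```python
def fit_line(x, y):
    """Slope and intercept of the least-squares line y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) /
         sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

# Hypothetical submission and acceptance counts for one year.
submitted = {"A": 120, "B": 300, "C": 45, "D": 210}
accepted  = {"A": 20,  "B": 30,  "C": 9,  "D": 22}

xs = list(submitted.values())
ys = [accepted[c] for c in submitted]
slope, intercept = fit_line(xs, ys)

# Countries whose actual acceptances exceed the regression prediction.
above = [c for c in submitted
         if accepted[c] > slope * submitted[c] + intercept]
print(sorted(above))
```

Repeating this for each of the three years and counting how often a country lands above the line gives the zero-to-three tallies reported in Table 1.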
These groupings of BMJ acceptance versus submission numbers do not seem to follow socioeconomic, geographic, or political patterns. The difference between actual and expected publication numbers does not correlate significantly with a country's average income per head, population density, longevity, or level of urbanization. However, the number of physicians per head (9) correlates negatively with the number of accepted manuscripts after correcting for the number submitted (r = -0.47, p = 0.005). This interesting result is beyond the scope of this study; it suggests that publication biases in the BMJ result from countries' differing policies towards medical research and medical care, and it is plausibly indicative of causes affecting the quality of submitted manuscripts.

Causes of publication patterns unrelated to scientific quality would be unjustifiable, especially when no clear, consensual discriminatory policy has been made public. The example of the BMJ suggests partial solutions. Journals should make submission and acceptance data, broken down by country, publicly available, so that authors can consider this information when deciding where to send their manuscripts for review. This simple measure could dispel suspicions of bias that analyses similar to mine might raise. Editorial boards should analyze these data, together with further information available to the journal, to detect existing biases and suggest their causes. Journals could thereby detect biases in their publication patterns and try to prevent unscientific factors associated with nationality from affecting publication rates. An international institute designed to gather such data from journals, develop analytical tools, and publish yearly data, results, and conclusions might also help. Such policies should also be considered in other domains of intellectual activity, such as the social sciences, humanities, and arts.
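The index screen described above (acceptance residuals checked against several socioeconomic indices, with only physicians per head correlating) can be sketched as follows. The numbers are invented to mimic the reported pattern, not drawn from the BMJ or WHO data.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

# Hypothetical per-country values: acceptance residual (actual minus
# predicted acceptances) and two candidate explanatory indices.
residual   = [3.2, -1.5, 0.8, -2.9, 1.1, -0.4, 2.0, -3.1]
physicians = [1.9, 3.4, 2.2, 4.1, 2.0, 2.8, 1.7, 4.5]  # per 1000 people
gdp        = [24, 31, 18, 28, 35, 22, 27, 30]          # thousands per head

for name, index in [("physicians", physicians), ("gdp", gdp)]:
    print(name, round(pearson_r(residual, index), 2))
```

With these invented numbers, the residuals correlate strongly and negatively with physicians per head but only weakly with income per head, mirroring the pattern the analysis reports.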
1. Anonymous. Don't boycott Israel's scientists. Nature 2002; 417: 1.
2. Rose S, Rose H. Boycott of Israel? It worked for South Africa. Nature 2002; 417: 221.
3. Mangel M. Nature 2002; 417: 222.
4. Abbes A, Balabane M, Farjoun E, Harris M, Rouquier R, Schapira P. Palestinian scientists are already restricted. Nature 2002; 417: 221.
5. Tzfati Y. Collaboration can work if inequality is recognized. Nature 2002; 417: 689.
6. Fink G. Did an academic boycott help to end apartheid? Nature 2002; 417: 690.
7. Chalmers I. Academic boycott of Israel – Threshold for academic boycotts of apartheid South Africa and apartheid Israel needs clarification. BMJ 2003; 326: 714.
8. Kennedy D. When science and politics don’t mix. Science 2002; 296: 1765.
9. World Health Organization. WHO estimates of health personnel, 1998.
Columns 2-3 present the total number of publications for the first and last years covered by the analyses, columns 4-6 the mean number of publications per year in the respective journals, and columns 7-9 the Pearson correlation coefficients between the yearly number of publications in a journal and the total number of publications for the respective countries. The more positive the correlation coefficient, the stronger the indication of "proportionality". All data were compiled from the expanded Science Citation Index (Web of Knowledge, http://isi1.isiknowledge.com).