We are proud to launch our Guest Post Section with this first contribution by Peter Weingart. The Professor Emeritus in Sociology of Science and former Director of the Institute of Science and Technology Studies (IWT), Bielefeld, is a member of the Scientific Advisory Council of the Center for Interdisciplinary Studies (ZiF), Bielefeld, and Editor-in-Chief of Minerva. He has also been very active in promoting the role of Science Studies beyond academic borders.
In his talk at the 10th International Bielefeld Conference on “Shaping Future INFO-Structures” he discusses “Openness in Science” in the context of current changes in the scholarly publishing system. From moral codes and reputational hierarchies in science to the inherent monetary necessities in the context of the production of scientific knowledge, Peter Weingart retraces the role of openness in science and the constraints that come along with this implicit ethos first formulated by Robert King Merton back in 1942.

Openness in Science

– Talk at 10th International Bielefeld Conference, 2012

1. The norms of scientific communication

It is generally recognized that communication among scientists and scholars is governed by a particular scientific ethos, i.e. by a set of rules that are supposed to establish trust in and guarantee the reliability of the knowledge created in the process. This ethos was given its most succinct and influential formulation by the American sociologist of science Robert K. Merton (1957 [1942]), who defined it in terms of four basic norms of science and thereby laid the groundwork for the sociology of science.

The scientific ethos refers to those patterns of behavior among scientists and implicit norms that can be traced back in history to the establishment of the academies in England and France in the 17th century. Regardless of the many changes in detail, there is a remarkable continuity over a period of more than three centuries. Merton’s formulation of the norms of science can be understood as an analytical condensation of the behavioral patterns that have evolved over the length of this period into a set of institutional imperatives. The ethos, in his words, “is that affectively toned complex of values and norms which is held to be binding on the man of science” which, although not codified, can be inferred from, above all, the “moral indignation directed toward contraventions of the ethos” (Merton 1957, 551/2). In the definition of the institutional imperatives all elements of the social behavior can be identified that emerged in the 17th century and were subsequently further developed. Most significantly, they mark a departure from previous patterns of medieval and early renaissance science, when ‘secrecy’ played an important role.

Universalism is the principle that truth claims are “subjected to preestablished impersonal criteria” (Merton 1957, 553) irrespective of the social attributes of their protagonists, e.g. nationality, race, class, or religion.

Communism (also termed ‘communality’) refers to the norm that the findings of science “are a product of social collaboration and are assigned to the community” (Merton 1957, 556). Property rights are kept at a minimum, and the scientist’s claim to intellectual property is constrained to recognition and esteem by the members of his community.

Disinterestedness is a fundamental institutional element that has its basis in the public and testable nature of science. It contributes to the integrity of scientists: accountability to their peers, backed by sanctions, keeps them from yielding to the temptation of using improper means to their own advantage.

Organized scepticism is “both a methodologic and an institutional mandate”, i.e. the scrutiny of claims and beliefs on the basis of empirical and logical criteria (Merton 1957, 560).

These norms or values constitute, as an integral ensemble, a system of communication that is uniquely geared to produce knowledge that may be considered “true” in the sense of being reliable, but by no means final. Universalism guarantees, at least formally, general social accessibility and at the same time prevents interference of any criteria (political, religious, ethnic) in the communication of knowledge other than those accepted as belonging to science itself. Communism subjects all knowledge to general and open communication, thereby channelling the proprietary interests of the scientist into gaining recognition by obtaining priority of discovery. It creates the social cohesion of the scientific community. Disinterestedness constitutes the self-reference of scientific communication and distinguishes science from the professions in that it has no clientele. The clients of science are scientists themselves; fraud or quackery may lead to temporary advantages at best. Organized scepticism is the flip side of that norm, as it stipulates the impersonal scrutiny of any truth claim as a general principle of scientific communication. It is institutionalized in the peer review system of journals and funding agencies.

Merton’s list of norms condenses the institutional patterns that have emerged over this time period into an analytical scheme which serves to explain the unique status of science as a set of methods to produce certified knowledge, to accumulate the knowledge that accrues from their application, and as a set of cultural values and mores that governs these activities.[1]

Most important in the context of this talk is the fact that these norms imply that ‘openness’ of any communication in science is the prerequisite for the attainment of ‘true knowledge’. The central assumption is that scientific knowledge is certified only if it has passed the scrutiny of whoever is competent to test truth claims. Only if such truth claims are accepted as such by the collective of scientists can they be considered – for the time being – as secure and reliable knowledge. This is indicated by the common definition of malpractice or fraud in science and the kinds of sanctions connected to them. To plagiarize or to fabricate data are considered the chief acts of fraudulent behavior within science. They violate trust, which comes in tandem with openness: if scientists openly exchange their insights, they can never scrutinize all truth claims but have to focus on the most important, most contentious ones, and consequently have to trust all others that are not in the focus of attention. Openness makes scientific knowledge production particularly vulnerable to fraud. That is why misconduct is sanctioned with public shame.

There is a second reason for the sanctioning of misconduct which points to the functional importance of openness. The social structure of science, i.e. the reputational hierarchy that is based on the communal assessment of achievement, depends on the open exchange of ideas and findings. Any distortion of that process implies the attribution of ‘false’, undeserved reputation and, more importantly, the misdirection of attention, since reputation is a marker for important research and orients the community’s attention for scrutiny. It performs this function toward the outside as well: the public, politicians, or entrepreneurs who are unable to judge the members of the community of experts and their achievements themselves must rely on their reputation.

The communication system based on openness and the attribution of reputation based on communal assessment implies a crucial motivational arrangement: reputation within the community is the only meaningful remuneration. Since reputation is attributed only for adding new knowledge (in the broad sense, i.e. including new methods and materials), ‘priority’ is the precious good to be sought, and it is the basis for intense competition (mostly individual, but increasingly collective and even institutional).

In order to understand the importance of each of the ‘norms’ and their coherence as a system, one just has to perform the thought experiment of eliminating one of them or ‘turning it around’. Leaving aside Merton’s particular concern about the impact of totalitarian governments on science, the most obvious threat to the operation of the open system of communication comes from two sources: secrecy mandated by the military funding and use of research, and secrecy mandated by its industrial funding and use. The crucial question is whether military and industrial (or rather commercial) contexts of research, with their constraints on free communication, have the detrimental effects on the production and communication of new knowledge predicted by Merton’s functionalist account: especially an increasing reluctance among scientists to share information with colleagues, a deterioration of trust, the erosion of constructive criticism (organized scepticism), and an increasing incidence of fraudulent behaviour.

2. Constraints of openness and causes of secrecy

The last three decades have seen an increase of corporate involvement in universities. This was supported by governments’ shift of science policy towards ‘innovation’, thus actively promoting the opening of universities to industrial research interests. Universities have responded to the lure of funds from private enterprises all the more willingly as public funding declined at the same time. Military research, in contrast, remains in the shadow of public attention, partly for lack of information.

This development has raised widespread concern about the effects of this greater involvement of private interests in the universities, most notably on the openness of communication. While it is generally accepted that openness is a vital feature of scientific knowledge production, qualifications arise as soon as different contexts of research are discussed. The inalienability of openness is linked to basic academic research, and universities, as the central institution of that type of research, are considered to be threatened most by encroachments on openness.

One source of secrecy and an obstacle to open communication is inherent to the system itself. Although always present, it becomes more acute as the economic environment changes. The motive for all scientists to participate in competitive research is to acquire reputation by achieving priority with contributing new knowledge. To gain competitive advantage over colleagues, scientists often withhold information about their recent research findings or disclose incomplete information. This strategic behaviour has probably become more frequent as competition has intensified, research at least in some fields has become more economically relevant, and the delay between basic research and its applications has been reduced – if there is such a distinction at all. The gravest danger would be a general change of the ‘culture of open communication’, as Eric Holtzman warned as early as the mid-1980s: “…the belief that others are now hiding more information than they used to, whether justified or not, threatens to damage traditions of sharing, that will spill over into many research areas in which competition is still essentially in terms of receiving credit for ideas and obtaining grants…rather than for monetary profit” (Holtzman 1985).

The corrupting effect of a creeping commercialization of science has become most visible in medical and pharmaceutical research. Here, it has been found that both individual scientists and universities have succumbed to the temptations of money or the threats of the withdrawal of funds. Scientists have been found to give biased reviews of clinical trials where their own monetary interests are involved. Bias (which is the violation of disinterestedness) has become a pressing issue for scholarly journals, primarily in biomedicine and related areas where industry interests are tied to substantial amounts of money (Krimsky 2003, ch. 9; Angell 2004). Since some scandals have surfaced, the journals have reacted by disclosing reviewers’ commercial interests.

Universities have, in some cases, acted similarly. David Kern, an associate professor of public health at Brown University, lost his job because of a critical report to an international conference. Fearing litigation and the elimination of financial support, the university treated a confidentiality agreement with a local company as a research contract (Kern, A Recent Case Study, www.aaas.org/spp/secrecy/Presents/Kern.htm, visited April 6, 2012).

This development is also supported by a gradual blurring of the boundary between publicly and privately funded research, characterized by a massive expansion of property rights in the form of patents and licenses (Mirowski, Sent 2007, 657f). In the vast literature on the repercussions of the intensified engagement with various instruments of securing IPRs, the following negative effects are mentioned. Patenting does not so much delay publication significantly, but rather “it can encourage a climate of secrecy that does limit the free flow of ideas and information that are vital for successful science…it may also affect the direction of publicly funded research encouraging short-term, applied research that has merit but is usually better done in industry…” (Royal Society 2003, V). Giving priority to patents rather than publications may also have an adverse effect on young scientists, as they cannot accumulate as much intellectual capital and may end up less productive.

While these concerns apply to fields like physics, where basic and applied research can be distinguished, this is not the case in biotechnology. The biomedical fields are the prime examples of another relatively new form of commercialization: the extension of IPRs to research instruments and data. This is being done in the form of so-called Material Transfer Agreements (MTAs). These are contracts between public or private research institutions about the use of particular instruments, assays, data etc. The defenders of these contracts claim that, similar to patents, they actually broaden the potential use and even help to fund the development of new instruments. The critics, on the other hand, point to the ever more complex contracts, their costs, and the indirect effects on the willingness of researchers to cooperate (Mirowski 2008). Although economists and researchers contradict each other in their interpretations of the impacts of MTAs and patents – neo-liberal economists, of course, cannot see any harm – the diagnosis is worrisome. Perhaps even more problematic than disturbances and delays is the orientation of the communication process to the profit motive. If the communication of knowledge is subjected to economic calculations, this will have negative long-term effects on mutual trust among scientists and on the reliability of the knowledge they share.

Another effect of the general ‘economization’ is the concentration process taking place in the publishing world, affecting scientific journals in a particular way. The ‘science-technology-medicine’ (STM) publishing houses have a de facto oligopoly; the demand side, i.e. scientific institutions, has hardly any alternative. Consequently, costs imposed on academic libraries have skyrocketed and led to a limitation of access for individual scholars at certain institutions and even for entire countries. Similar arguments apply to databases. New legislation regulating the rights of databases that was introduced in Europe in the late 1990s “has been driven by media and commercial interests and is potentially very damaging to scientific research. It rewards the creator of the database rather than the creator of the data, though in science the latter is the more costly contribution. Unlike copyright, database rights effectively protect the data themselves, which cannot be extracted and re-used except under restricted fair dealing arrangements” (Royal Society 2003, V).

3. Attitudes towards limitations of communication

The most important source of constraints on communication in science, in general terms, is ‘economization’. It appears in the form of the protection of IPRs and affects science on different levels: as contractual relations between corporations and individual scientists, departments, or entire universities; as the copyright protection of articles combined with the oligopolistic pricing of journals and databases; and as the protection of instruments, data and assays in MTAs. There is very little systematic knowledge about the actual effects on the attitudes of scientists, on the attitudes of universities, about the effects on research agendas, and most importantly, on the reliability and the longer-term credibility of scientific knowledge. Much is on the level of impressions and single case descriptions. The few reviews that do exist can only be generalized with caution, as conditions differ from field to field.

Probably the most comprehensive review of surveys between 1985 and 2003 focusing on financial ties of academic and clinical researchers concludes: Most investigators report that some financial ties and industry collaboration are appropriate but only under certain conditions. Access to resources is seen as the primary benefit of industry relationships. Risks of industry relationships are related to limits on the free exchange of information and threats to the integrity of research.

[1] It has often been argued that scientists do not behave in accordance with the norms and that these are subject to historical change. However, the existence of norms is not necessarily reflected in actual behavior. Merton’s theoretical construction of the norms is a complex combination of different elements: 1) Social-psychological patterns of attitudes that are expressed in internalized (but not necessarily explicit) reactions to violations of the norms, in the awareness that one’s own actions or those of others are breaching a code; 2) Social-structural patterns of sanctions, i.e. mechanisms institutionalized in science that sanction, positively or negatively, certain behaviors like plagiarism (negatively) or the open exchange of information (positively).



Angell, Marcia (2004) The Truth About the Drug Companies: How They Deceive Us and What to Do About It. New York: Random House.

Holtzman, Eric (1985) Commentary: Biology faces life – Pressures on communications and careers, Science, Technology & Human Values, 10(2), 64-72.

Krimsky, Sheldon (2003) Science in the Private Interest. Lanham: Rowman & Littlefield.

Merton, Robert K. (1957) ‘Science and Democratic Social Structure’, in: Merton, R. K., Social Theory and Social Structure, rev. ed., Glencoe: Free Press, 550-561.

Mirowski, Philip (2008) Livin’ with the MTA. Minerva, 46.

Mirowski, Philip/Sent, Esther-Mirjam (2007) The Commercialization of Science and the Response of STS. In: E. Hackett et al. (Eds.), The Handbook of Science and Technology Studies, Thousand Oaks/London: Sage, 635-689.

Royal Society, The (2003) Keeping Science Open: The Effects of Intellectual Property Policy on the Conduct of Science. London: The Royal Society.


– kindly provided by Peter Weingart