Up until 1960, higher education was reserved for the talented few. Those attending universities belonged to a privileged elite, mentored and educated by an equally select corps of university professors and fellows. All this changed gradually as education reforms opened up universities to much larger parts of the population in the Western world of the United States and Europe (Goldin and Lawrence 1999). The mass university was born in the early 1960s, and in the decades that have followed, more and more students have been admitted to university programs, and forms of training and education that previously lay outside the university are now part of the higher education system (Whitley 2000, p. xvi). Today it is not unusual that around 50% of an age cohort in, for example, Sweden studies at a university or university college (Bennich-Björkman 2007). In the 1940s and 1950s, it was just a few percent.

This is a tremendous change in the span of just 50 years. Depending on the specific historical traditions and preconditions of the university systems throughout Europe and in the United States, the challenges of the mass university have been handled differently. Far from being coherent, institutions of higher education and research have been shown to be surprisingly divergent and continue to be so (Clark 1995). Nevertheless, what seems to have followed universally, as a consequence of the growth in the number of doctoral degrees and the increasing emphasis on knowledge as a driver of national competitiveness, is a shift from “elite to mass” also within research. I argue polemically in this chapter that there has been a move into what can be called the era of the academic research industry. What used to be small scale and based mostly on individuals is today increasingly streamlined and large scale, at times embedded in large webs of collaborative networks involving researchers who have never even met in person. In parts of the European university systems, there is a rapidly growing tendency to centralize control over external funding processes in the hands of university managers, thus turning universities themselves into more corporate-like entities.

Even though I am not particularly fond of this development, mainly because it destroys some of the necessary preconditions for creative work in favor of mechanization, I believe it is essential to try to understand the forces behind the reality we experience by taking a step back and, like an anthropologist, looking upon the developments from the outside. Below, I will point to some indicators supporting the argument that a research industry is growing, and reflect upon possible causes and consequences of that development.

Is There Really an Academic Research Industry?

Growth in Numbers and Outputs

Let me start by looking at the number of researchers globally. Most figures point to an increase; the United Nations Educational, Scientific and Cultural Organization (UNESCO) states that “R & D expenditure and the number of researchers worldwide have grown significantly between 1996 and 2007,” and between 2002 and 2007, the number of full-time equivalent researchers increased from 5.8 million to 7.1 million. A substantial part of this absolute increase rests on the growth of China as a research and development nation (UIS Fact Sheet 2009, p. 12). Researchers are defined broadly, as “professionals engaged in the conception or creation of new knowledge, products, processes, methods, and systems and also in the management of the projects concerned” (p. 1). It is hard to find separate figures on the number of researchers in the humanities and social sciences, but even so, the figures available clearly show substantial growth in the number of professionals engaged in academic research overall, most likely including the humanities and social sciences. If we believe UNESCO, the increase in the number of researchers has been particularly substantial over the last 10–15 years.

What about the number of scientific journals? It is sometimes stated that the number of journals has exploded over the last decades. This, however, does not match the actual development. Even though there clearly is growth, it follows an established pattern of an approximate doubling of the number of scientific journals every twenty years. From 1800 until today, an average annual growth rate of 3.46% has persisted. The exception is the period 1945–1975 (after the Second World War), when growth rates were higher as a consequence of extensive economic growth and subsequent investments in research and development (Mabe 2003). In 2001, the number of “active, refereed academic/scholarly serials” was approximately 14,700. As far as can be concluded, natural science, medicine, humanities, and social science journals are all included in this count.
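
As a rough check, the reported average growth rate and the twenty-year doubling pattern are consistent with each other (the figures are from the text above; the calculation is merely illustrative): a constant annual growth rate of 3.46% implies a doubling time of

$$ t_{\text{doubling}} = \frac{\ln 2}{\ln(1.0346)} \approx \frac{0.693}{0.034} \approx 20.4 \ \text{years}. $$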

What Mabe finds in trying to explain this steady, continuous growth in journals is that it is basically author driven, caused primarily by the successive increase in the number of researchers (shown above). “The connection between growth and the number of journal titles and growth in number of researchers is unmistakable” (Mabe 2003, p. 195). Growth could also accelerate further as a result of certain publishing strategies by researchers, for example, “recycling” already published results, arguments, and texts in slightly new forms and outlets, and “breaking down” results and research work into least publishable units (LPU). Such publication strategies, which are hard to document systematically but most certainly exist if we listen to how colleagues speak about their work, would then be employed to meet the demands of “publish or perish.” Such demands themselves become more salient as a mechanism of stratification when the number of researchers increases.

What about the number of “papers,” that is, articles, in journals? Analysis based on Thomson’s Science Citation Index (SCI) indicates that there has been a rise in the production of papers, in particular with reference to the developing world (the big exception being Africa).Footnote 1 Again, unfortunately, I have not been able to find any parallel analysis concerned with the humanities and social sciences, but must rely on the assumption that these research fields tend to move in a similar direction.

This growing “supply” of research is a structurally rather than intentionally generated process, not necessarily welcomed or intended in the first place but nevertheless affecting the content and profession of research, pushing for a more conscious response and strategy on the part of individual researchers and university establishments. Let me now focus on two major tendencies in particular: the pressure to increase productivity and the changing characteristics of the reward systems.

Productivity

The academic research industry in the Western world is funded by a combination of business and governmental resources, with smaller contributions from private foundations and NGOs (see UNESCO statistics for an overview). Today it demonstrates some classic signs of industrialization: an emphasis on growing productivity (defined in the classic sense as more output in less time) is one crucial aspect, and a move from individually based activities to “corporate”-based ones (that is, at the university level) is another.

In every mass industry, increasing productivity is the major instrument by which individual corporations survive and prosper on the market, and productivity is the result of higher efficiency. Increasing efficiency was thus the key to early twentieth-century Taylorism, the rational or “scientific management” principles developed to meet the growing demands on industrial production (Braverman 1974). A striving for efficiency in order to increase productivity is also visible, even salient, in today’s growing research industry. Time and space management have come increasingly into focus, for individual research workers as well as for editors of scientific and scholarly journals, and developments within information processing support this drive for productivity.

Is a growth in productivity, understood as more research papers and books in less time, really possible? I would say yes, given that papers are allowed, even demanded, to be short and streamlined in their structure, and to present results based on material and data that are not time-consuming to retrieve or analyze. That is also precisely the development we see within the social sciences, which I know best.

“But I have already exceeded the 6000-word limit!” Desperate to stick to the constantly shrinking word limits of academic journals while at the same time trying to satisfy critical reviewers by including additional data, more sophisticated analyses, and elaborate methodological notes, many researchers today struggle with incompatible demands to be both very short and ever more comprehensive. Space management is growing into an art form in itself, focused on how to reduce the character count by, for example, using numerals instead of spelled-out numbers, cutting out references, and shortening sentences.

In a similar way, presenting at an academic conference today is often primarily an exercise in time management. “You have two minutes left,” signals the appointed chair with an embarrassed smile after 8 minutes, well aware that this is not what someone who has spent 4 years finding out why state building failed in Zimbabwe but not in Botswana longs to hear. The incompatible demands of providing enough original insight, substantial evidence, and path-breaking conclusions while sticking to the 10 minutes granted, so that the other six panelists and the discussant get their share, are constantly present at the international events that academic conferences constitute. This can also reduce intellectual stimulation and the possibility of dialogue to a minimum.

Nevertheless, as is the case with norms, many subject themselves to these inherent rules of academic publishing and presentation voluntarily and without complaint, feeling a sense of genuine failure when a paper exceeds the word limit or when they run 3 minutes over as a paper giver at a conference. Short is beautiful; to be lengthy demonstrates a lack of both focus and politeness.

Why has publication within the social sciences (and to a certain extent the humanities) developed in such a way, and why are many researchers less and less inclined, as an audience, to read longer papers or even books? A generous interpretation is that these developments articulate a welcome awareness that “big is not beautiful” and that writing concisely is difficult and demands skill. The tendency could thus basically reflect an improvement resulting from intra-scientific concerns. A less generous understanding instead focuses on how the format (including length, style, and structure) has grown in importance for reasons external to the content of research, such as the time constraints of readers and the growing number of researchers aiming to get published. There is simply no time to listen to lengthy arguments. The question then becomes how that affects the type of problems and analytical questions researchers will take on in the first place.

The type of research problems identified is indirectly affected by these publication patterns. Instead of asking questions that demand multifaceted analyses, there is a tendency to break questions down into very small, and thus manageable, “units.” The choice of certain types of research material is also indirectly favored by the need to be concise and short. In the humanities and social sciences, this is mostly, although not always, research based on statistics rather than on qualitative material from archives, interviews, focus groups, or content and text analyses. Tables, figures, and formulas are less “word consuming” than lengthy text analyses (although tables certainly demand space).

However, even if papers can become shorter and the research work less time-consuming, the Achilles heel of academic research work is precisely the reading, the demand for cumulativity, and of course creativity. Research builds on previous research, to which it adds a piece of novel, or at least relevant, knowledge. While the writing of research papers could become more efficient over time through routinization and streamlining, reading, reflecting on, and digesting the previous contributions of others still take time.Footnote 2 Even if you can train yourself to read faster and pick out the core points more efficiently, there is a limit to this, and furthermore, the rapidity with which new results appear means that simply keeping up demands a lot of time. In order to avoid lowering productivity because of such an “absolute” time constraint, there are at least three solutions in the academic research industry of today (extensively used and often combined): specialization, technological assistance, and human assistance.

Specialization

Specialization allows you as a research worker to make an initial investment in mastering a particular subfield and then continuously add to this knowledge without losing track of the contributions of others. The problem is that many real-world problems demand broad-based knowledge of more than one field and that too much specialization thus risks making you less creative, in the sense of less able to produce something novel. Nevertheless, as, for example, Hasselberg points out, the role of the specialist within the social sciences has become more salient:

The researcher as specialist is an interesting species. It is a person who is specialized in a number of fields that the rest of society hardly knows exist, or knows little about. The specialist often renounces claims of context and perspective. For her, the absolute limit is put up by the research front, also motivating the question asked. (Hasselberg 2009, p. 128, my translation)

Technological Assistance

As a response to the growth of the research industry and the increasing focus on productivity, technological (particularly software) tools of assistance have appeared more and more frequently. In the last 10 years, information resources, note-taking tools, and reference systems, all crucial to handling the demand for cumulativity in the academic research industry, have developed rapidly. Wikipedia (Jan. 2010) lists 30 software products for reference handling: 22 of them were launched after 2000 and four before 1990 (including one of the most successful, Thomson’s EndNote). Through these reference systems, an individual researcher is able to build his or her own digital “library” of references, importing them from various databases as well as entering them manually. Since keyword searches of databases for articles (and books) are the usual entry point, the library often comes to consist of a large number of references related to the topic, of which, however, few have actually been read. But given the ease with which it is possible today, in this and other ways, to stay superficially updated on progress in research, the demands mentioned earlier on demonstrating comprehensiveness grow. Since the publish-or-perish logic reduces the time to read and digest what you have read in favor of writing up your own pieces, there is probably a general awareness that references today are not (all) real but “imported”:

You don’t have to run EndNote on your desktop to use EndNote Web. You can export citations to EndNote Web and download them later to EndNote on your desktop, or create bibliographies directly from EndNote Web.Footnote 3 (The citation is from the New York University Library.)

While helping research workers manage their footnotes and references, reference systems also help cover up a severe condition in the academic research industry: the growing lack of time for reading and reflecting. EndNote (and its equivalents) efficiently helps to identify and locate works in databases, articles, and books, so that, in appearance, the research worker continues to fulfill the norm of cumulativity. There is just one flaw, particularly damaging for the social sciences and humanities, which still refer to books: EndNote, for example, does not handle page references (without some manual fixing). Hence, EndNote parentheses include author names and year of publication, which is usually what is needed for references in articles. But for quotations, and most certainly for books, such a reference is basically useless and, moreover, often reveals that the book has not been read but only identified through keywords. Such reflections may seem minor compared to the decisive facilitation and advantages that the new technologies bring. However, I believe we need to reflect upon what kind of “signals” technological innovations send out and the subsequent behavior they provoke. Research is not an exception.
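
To make the point concrete, the following is a minimal, purely hypothetical sketch in Python (not EndNote’s actual data model or interface, and all names are invented for illustration) of how a reference manager can generate author-year parentheses automatically from imported records, while the page number, which belongs to the individual act of quotation rather than to the imported record, has to be added by hand.

```python
# Hypothetical sketch only: illustrates the behavior described above, not EndNote's API.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Reference:
    authors: str           # e.g., "Mabe"
    year: int              # e.g., 2003
    title: str
    read: bool = False     # imported via a keyword search, not necessarily read


def format_citation(ref: Reference, page: Optional[int] = None) -> str:
    """Return an author-year parenthesis; a page number must be supplied manually."""
    if page is None:
        return f"({ref.authors} {ref.year})"
    return f"({ref.authors} {ref.year}, p. {page})"


# Usage: a record imported from a database search.
imported = Reference(authors="Mabe", year=2003, title="Growth of journals")

print(format_citation(imported))             # (Mabe 2003): enough for many article references
print(format_citation(imported, page=195))   # (Mabe 2003, p. 195): requires manual input
```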

Human Assistance

Research assistants, manual labor that does the actual reading and data collection, are a third option available to the research worker for handling not only the cumulativity demand but time pressure in general. Research assistants are costly, so such a solution depends on generous funding. Furthermore, research assistants are more easily used for mechanistic tasks that rely less on discretion, such as counting certain words or phrases in texts or coding events into predecided categories. Tasks that rest more on tacit knowledge, judgment, and independence, for example, long in-depth interviews, focus group research, and archival research of a more advanced kind, are less suited to such assistance because they involve continuous decision-making and on-the-spot analysis.

Research Work as Art

However, in relation to productivity, there are certain particularities that separate the research industry from classic industries of manual labor and place it much closer to spheres such as art, literature, and the performing arts. Productivity gains are much harder to achieve in these spheres than in classic industries. What Baumol and Bowen once pointed out as the “economic dilemma” of the performing arts, that rehearsing and performing a Shakespeare play or Mozart’s Requiem still takes about the same amount of time as it did 100 years ago, applies also, at least to a certain extent, to academic research (Baumol and Bowen 1968, p. 374). Why is that?

Creative work, that is, work involving the creation or discovery of something not previously known, is time-consuming and must, if it is organized in an optimal way, allow for experimentation, trial and error, and failure. By necessity, it embeds uncertainty and mistakes. Put differently, creative work cannot be mechanized, because there is no way to bypass the experimentation that usually produces failure. When productivity (more output in less time) is overemphasized in research, there is therefore a substantial risk that the creative element (the trial and error) in research work is downplayed in favor of what can be learned and mastered: craftsmanship skills, mastering (or managing) the literature, and the conduct of “safe” experiments.

How Does the Research Industry Affect Reward Systems?

Traditionally in academia, every individual worked for herself, improving her own position primarily through respect and recognition. Hence, when we speak about a tendency in the academic research industry to increase productivity, it still rests largely (but not entirely) on individual achievements, that is, individual researchers trying to provide more “output” in less time.

Rewarding Novelty

What, then, is the equivalent of profit for the individual competitors in the research market? As Robert Merton once pointed out, this is a “competition” based not on material gains (in the first place at least) but on immaterial rewards generated by the group of peers: fundamental respect, scholarly recognition, and an impact on future research (Merton 1973; Mulkay and Williams 1971). Peer admiration is what the researcher wants to earn, and for that, she may have to work an entire lifetime.

What about these “immaterial” rewards generated by peers? As a consequence of research industrialization, the criteria for peer recognition and admiration are in the process of changing as well. These rewards have traditionally been tied to originality (or creativity), not productivity. That is not to deny that creativity and productivity (in the sense of publishing a lot) could, and sometimes do, go together, in that creative researchers are also productive. But the rewards are tied to contributing substantially to a research field, and the most admired contribution is the genuinely novel and creative one. To push the point: theoretically at least, productivity in itself should not be interesting at all (although empirically it often is, to a certain extent).

Rewarding Productivity and Investments

As the academic research industry has grown, the administrative infrastructure has grown as well, while the basic “unit” in the field has started to shift from the individual researcher (or a concrete and identifiable research group) to collectives such as departments and entire universities. University administrations, funding agencies, state authorities, and international organizations today work full time with research-related questions, not least with finding systems and methods by which to allocate resources between individual researchers and between collectivities such as departments, institutes, universities, and even national university systems. The administrative infrastructure cannot automatically base its reward systems on substantial contributions to scientific fields and originality; for such assessments, it depends on the continuous help of researchers (help that is indeed used). However, in order to enlarge its institutional autonomy and independence, the administrative infrastructure needs to develop parallel criteria for assessments and allocations that it can use autonomously, without having to depend on the scientific community.Footnote 4 Research productivity lies close at hand (since it is easily assessed), and so does rewarding the capability to attract research investments, that is, research funding (Whitley 2000, xviii). While traditional peer admiration rests on research contributions and their contents, productivity and investments have thus become alternative, and I would say rival, criteria, not only more and more extensively used by the administrative infrastructure but also, increasingly, invading the assessments made by the peers themselves. In a recent assessment of a candidate up for “docentur” (“associate professor”) in Sweden, for example, the reviewer pointed out the candidate’s ability to attract “research funding” as a merit in itself. This is not exceptional. Productivity and investment criteria thus also invade the perceptions of peers, at the risk of drawing attention away from the content of the contributions. The foremost instrument for measuring productivity and “impact,” bibliometrics (the measurement of publications and citations), has grown tremendously in importance in the social sciences and also in the humanities, represented by the Social Science Citation Index (SSCI) and the Arts & Humanities Citation Index (AHCI). If, however, creativity and productivity can, as indicated above, go together, one could ask whether productivity might not be a good enough proxy for creativity.

The problem is that the growing number of researchers and journals, in combination with technological developments that promote the writing of more papers, has made high productivity rather easy to achieve. Thus, today we can probably say that there are many highly productive researchers whose contributions are not particularly novel or original. That is not to say that they lack creative potential, but the pressure for productivity (growing as a result of the number of researchers and the reward systems driven by the administrative infrastructure) discourages them from devoting enough time and energy to uncertain but potentially original research endeavors. As shown in recent studies on how the construction of research funding affects creativity, innovation is promoted by long-term perspectives, the possibility of early failure, and timely and intensive feedback (Azoulay et al. 2011; Aghion et al. 2005).

Traditionally, academic research has also generated internal rewards: the joys of intellectual challenge, the immensely satisfying feeling of “flow” that comes with deep concentration, and the thrill of novel and innovative findings and solutions.Footnote 5 This has little to do with the question of what is being rewarded in the academic research industry today, but it is nevertheless important for understanding some of this industry’s psychological effects. The growth of the academic research industry, with its emphasis on increasing productivity, gradually perverts more and more of these inner, psychological rewards. Inner rewards have been shown to be strongly nourished by academic freedom that encourages curiosity-driven research (now heavily questioned). First and foremost, however, more output in less time reduces the time available for deep concentration and for the trial-and-error processes that are an inherent part of creative work.

Hence, partly as a consequence of the administrative infrastructure’s striving for autonomy, productivity and the ability to attract investment have become alternative, even competing, criteria for assessing merit in academia. What used to be a very strong emphasis on novelty is today rivaled by more mechanistic assessment tools. Moreover, the kind of inner rewards that come from the thrill, passion, and excitement of discovery, challenge, and new interpretation are also affected by the development of the academic research industry.

Some Final Words

If you throw a frog into boiling water, it will quickly jump out. But if you put a frog in a pan of warm water and raise the temperature very slowly, the gradual warming will make the frog doze happily. The frog will eventually cook to death due to its failure to sense the gradual increase in water temperature. The message of the tale is that, because its environment changes so gradually, the frog is never stimulated to take bold action to save its life (Gino and Bazerman 2009, p. 717).

It is easy, tempting, and sometimes unavoidable to adapt to changes when you are embedded in the affected structures. After a while, usually rather quickly, you no longer notice the larger pattern that these changes are part of. What then happens is that you gradually stop thinking actively about the ideas behind a certain development and about whether you really support, or even like, its long-term consequences. Instead, the situation becomes one of trying to cope and survive (Zimbardo 2007). As part of the system, you grow accustomed to the new practices, even if they do not benefit the organization. Adaptation then gradually leads to acceptance, because rationalization sets in: it has been shown to be hard to live with the kind of cognitive dissonance in which behavior points in one direction and beliefs in another. Hence, persons working within a system start to believe in its governing principles, and gradually, without many actively noticing it, there is a shift from one set of norms to another. The mental processes sustaining such institutional changes have become known in social psychology as the “slippery-slope” syndrome, a gradual, incremental slide into a state or situation that was once believed to be detestable or highly disliked. In a recent experimental study, researchers showed that “when unethical behavior of others develops gradually, over time, instead of occurring abruptly, people are more likely to accept this behavior” (Gino and Bazerman 2009, p. 717).

As a consequence of the “slippery-slope” effect, academic researchers are, I believe, accepting practices and norms evolving within the research industry that actually contribute to destroying, or crucially damaging, the preconditions for original research, innovation, and discovery. The main argument in this chapter has been that there is a growing tendency to focus on productivity and efficiency that bears resemblance to early twentieth-century processes of industrialization. To write and “produce” more in less time has become a value in itself, even though it is rhetorically accompanied by statements that research needs to be “cutting edge” and of outstanding quality. But because research work is much like the performing arts, the advantages of scale are simply not there. The equation of more, and better, research in less time is hard to solve because of the inherent logic of creative research work: the time-consuming activities of experimentation and failure. The peril of the industrialization of academic research lies in the fact that these insights, about the need for risk-taking and the acceptance of genuine uncertainty, are buried in ever more elaborate efforts at time and space management. And in the fact that we, as researchers, adapt and doze off, not even noticing that the water slowly starts to boil.