Research impact now!
September 21, 2017

In my previous post, I laid out some history of research assessment and measurement, all so that in this post I could explore research impact assessment, a major topic of discussion at the 2017 Science & Technology Indicators (STI) conference in Paris. Here, I’ll summarize the major lines of discussion I encountered at STI, use the history from the last post to diagnose the underlying challenges, and perhaps even hint at some avenues for resolving these tensions.

Impact: a narrow definition

One of the most sustained discussions of research impact at STI 2017 came in the opening keynote address by Ismael Rafols. (His talk intertwined impact with the responsible use of indicators for governance, a topic that I’ll be addressing separately in another post.) In brief, Rafols’s most salient point was that indicators of societal impact, quantitative indicators especially, would hopelessly narrow the scope of what we consider to be relevant “impacts” and of who is counted among the relevant “society.”

Our obsessive focus on journal publications as the measure of research performance (and the equally obsessive focus on statistical significance as the ticket to journal publication) has created perverse incentives that are damaging the science system from within. The concern Rafols raised is analogous: a narrow measurement of societal impact will create perverse incentives of its own and only damage the system further.

This problem can be parsed as one of definitions (though it can be parsed otherwise as well). Journals use too narrow a definition of scientific robustness when they rely too heavily on p-values in assessing a paper’s findings. Research managers use too narrow a definition of scientific production and quality when they rely too heavily on publication output in peer-reviewed journals and on citation counts. And research managers are now adopting an equally narrow conception of societal impact in calling for quantitative indicators to track it, so as to integrate impact into the same management machinery.

Society: us and them

But it is not only with regard to impact that we run into problems of definition. Society is also troublingly nebulous as a concept. For the most part, we define society only as “outside of academia”; in doing so, we define society by what it is not rather than by what it is. Such definitions reinforce some pretty unhelpful ideas and behaviours. Science is defined as rational, so society is defined (even if implicitly) as irrational. Science is defined by having a method, so everyone else is implied to have none. Attitudes like these are already common among academics and need no encouragement to be sustained. They also make it easier to understand why “everyone else” might find academics arrogant and elitist.

It’s not surprising, either, that we have these two very narrow definitions, or that they reinforce one another. During the 19th and 20th centuries, academics assessed their work primarily for its effects on other academics, thickening the boundary between the academic world and the society of which it is an internal component. Departments vaunted their merits by trotting out their placement records: lists of all the institutions where their graduates had gone on to find work. But only academic institutions were included; non-academic jobs were not “placements” and were brushed under the rug, excused as exceptional deviations rather than embraced as equally legitimate vocations. We’ve spent decades focusing on papers (as proxies for ideas) and citations (as proxies for dissemination or influence).

That history is a lot of baggage to overcome if we expect the research community to spontaneously strike up a direct relationship with non-academics. For a long time, non-academics were on the fringe of the research landscape, and it has been too convenient to lump them together into a single category: Other. Unless we acknowledge this challenge and face it head-on, we will perpetuate it: by assuming that groups within society are homogeneous and that impacts are too, by operationalizing that belief in indicators, and by pushing our researchers to optimize their work along the only dimension an indicator can see. “Impact is not a scalar,” as Rafols so succinctly put it. This bit of history, caricatured though it is, suggests that such a definition of societal impact is a relic from within the academic sphere, not a corrective mechanism to bring researchers into closer contact with their fellow citizens. (“Impact” also suggests that the exchange is one-directional, but I won’t open that up here.)

From definitions to indicators

The work to develop research impact indicators draws heavily on the bibliometric tradition, in whose ontology the research paper is the central object. Papers have become synonymous with ideas, acting as their proxies. But journal articles are not a category of thing that people care much about outside of academia, so we set ourselves a difficult task in working from bibliometrics toward impact indicators. No one invited a paper to be interviewed about the Brexit vote or the last US election; they invited people. Journal articles are all text, yet any advertiser worth their salt will tell you that images sell far better than text. Companies don’t hire people; people hire people. And still we look for evidence of non-academics taking up research papers on Twitter, Wikipedia and elsewhere, only to find that they just aren’t showing much interest.

But not showing interest in journal articles and not showing interest in research are separate things. People are a far more important conduit for impact in society than papers are, and the relationships between people are reciprocal and iterative, which makes them very badly suited to counting and quantitative measurement. We also know that uptake through people is substantial: more PhD recipients now work outside academia than inside it. Why, then, do we hammer away at dissecting the tiny blip of papers taken up on social media, while mostly ignoring the mass migration of doctorate-holding researchers out into society, and the connections to academia they maintain after finding employment elsewhere? For one thing, we haven’t collected data on non-academic job placement, because until now we haven’t needed it (though that’s changing); we marketed academic departments on the basis of their standing within the academic landscape alone.

Filling out my applications for doctoral funding some years ago, I found the request for an impact statement incredibly frustrating. As a doctoral candidate, with absolutely no infrastructure around me to create any connection whatsoever with someone outside of academia (and a culture that actively discouraged such contact), was I really expected to outline what impact my research would have on society at large? Flippantly, I thought I should write: “I will conduct this research, becoming a smart and capable person in the process, and then leave academia to have an impact.”

“Indicators in the wild”

If we’re going to have substantial impact, we need to figure out which communities we want to collaborate with, outline the challenges those groups face, and identify how we can help. Rafols calls this work “indicators in the wild.” In the process of engaging these groups, we’ll learn what kinds of things are salient to them, and that can inform measurement strategies. Journal articles won’t be among those things; I suspect that people and institutions are closer to the target, but we need to test those hypotheses. Different communities have different needs and ontologies, and the longer we perpetuate a narrow definition of societal impact (and call for a single indicator to measure it), the longer we’ll perpetuate the notion that society is Other, the very isolationist mindset that got academia to this point in the first place.

Sadly, I did not submit my flippant impact statement, but I wish I had, because it has turned out to be true. I finished my PhD, started working in research governance, and my daily work in this field is enriched by my doctoral research.

All views expressed are those of the individual author and are not necessarily those of Science-Metrix or 1science.

About the author

Brooke Struck

Brooke Struck works as a policy analyst at Science-Metrix in Montreal, where he puts his background in philosophy of science to good use in helping policy types and technical types to understand each other a little better every day. He also takes gleeful pleasure in unearthing our shared but buried assumptions, and generally gadfly-ing everyone in his proximity. He is interested in policy for science as well as science for policy (i.e., evidence-based decision-making), and is progressively integrating himself into the development of new bibliometric indicators at Science-Metrix to address emerging policy priorities. Before working at Science-Metrix, Brooke worked for the Canadian Federal Government. He holds a PhD in philosophy from the University of Guelph and a BA with honours in philosophy from McGill University.

Related items

/ You may want to check these items as well

Data mining: revisiting our definition

In our ongoing blog series on data mining for poli...

Read more

Data access: Vast possibilities and inequities

In our ongoing series on data mining to inform pol...

Read more

Data mining: Open access policies and outcomes

During our data mining project for the European Co...

Read more
