Posts Tagged: indicator development

Bibliometrics
Rationalizing the extremes: introducing the citation distribution index
May 10, 2018
The distribution of citations across the scientific literature is in many respects similar to the distribution of wealth in the Western world: a handful of articles receive most of the citations, while most articles receive few or no citations at all. The distribution of citations is indeed highly skewed and not well represented by its average (the so-called "Bill Gates effect" in the wealth distribution analogy). In fact, when the average is computed for entities that publish a small number of articles, a few highly cited articles can be enough to propel these entities into research superstardom. In this post, we'll look at an alternative metric Science-Metrix has developed to address the limitations of averages, as well as some of the other metrics we explored to get to that point.
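The post doesn't give the index's formula here, but the "Bill Gates effect" it addresses is easy to see with a toy example. The citation counts below are invented: ten papers from a hypothetical small entity, one of which is a blockbuster. The mean is dragged far above what a typical paper in the set achieves, while the median stays put.

```python
# Hypothetical citation counts for ten articles from one small entity;
# a single highly cited paper dominates the total.
citations = [0, 0, 1, 1, 2, 2, 3, 4, 5, 250]

# Arithmetic mean: pulled upward by the one blockbuster.
mean = sum(citations) / len(citations)

# Median: the "typical" article, robust to the extreme value.
s = sorted(citations)
n = len(s)
median = (s[n // 2 - 1] + s[n // 2]) / 2  # n is even in this example

print(f"mean = {mean}, median = {median}")  # mean = 26.8, median = 2.0
```

Nine of the ten papers sit at or below 5 citations, yet the average credits the entity with nearly 27 citations per paper — exactly the distortion a distribution-aware indicator tries to avoid.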
Bibliometrics
Positional analysis: from boring tables to sweet visuals
March 28, 2018
At Science-Metrix we are obviously very focused on data—sweet, sweet data! We are also very aware that bibliometric data or pages and pages of analysis can be overwhelming and that more intuitive data presentation can help our clients to better understand their study results, which in turn helps them to take action on the findings we return to them. One graphic presentation we find particularly helpful is the positional analysis chart. Positional analysis is a way to visually depict two, three or even more indicators for a given set of entities instead of using a standard (and boring) table. Click through to the post to see how it works.
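The post explains the chart itself; as a rough illustration of the data preparation behind one, here is a minimal sketch that maps three indicators onto chart coordinates (x, y, and bubble size) for a set of entities. All entity names and indicator values are invented, and min–max scaling is just one reasonable normalization choice, not necessarily the one used in the post.

```python
def normalize(values):
    """Min-max scale indicator values to [0, 1] so unlike units can share a chart."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.5] * len(values)  # all entities tied: park them mid-axis
    return [(v - lo) / (hi - lo) for v in values]

# Hypothetical indicators for three entities.
entities = ["Country A", "Country B", "Country C"]
output = [1200, 300, 800]   # number of papers
impact = [0.9, 1.4, 1.1]    # citation impact score
growth = [5.0, 12.0, 2.0]   # output growth, %

# One point per entity: x = impact, y = growth, bubble size = output —
# three indicators read at a glance instead of from a table.
points = list(zip(entities, normalize(impact), normalize(growth), normalize(output)))
for name, x, y, size in points:
    print(f"{name}: x={x:.2f}, y={y:.2f}, size={size:.2f}")
```

Feeding `points` to any scatter-plot library gives the bubble chart; a fourth indicator could be encoded as colour in the same way.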
Science policy
Impact assessment stories: decisions, decisions
March 21, 2018
It appears that the research & innovation policy community is not the only one struggling with demonstrations of societal benefit. The federal Liberal government recently unveiled two online initiatives to increase government transparency, sharing information about government activities and outcomes. The challenges that these two platforms face amply demonstrate the difficulty of impact assessment. Those challenges are the same ones that face the science policy community, and this post explains how the shortcomings of these online platforms might help to elucidate some potential solutions.
Bibliometrics
Bibliometric fun facts
January 24, 2018
Bibliometrics is a complex and nuanced field, and at times, we’ll admit, it’s even a little arcane. At Science-Metrix we take great pride in explaining indicators to clients: what they mean, how they are calculated, their strengths and weaknesses, and what they can and cannot do. As with so many things, examples provide a good entry point. Inspired by this, today I begin a new occasional series that heads back to basics to explain some common indicators and how we use them in house. In this first post, I’ll explore some “fun facts” that can be derived from a bibliometric database.
Bibliometrics
Prestige and inequality in the ivory tower
December 18, 2017
It’s no secret that we have an inequality problem within the hallowed walls of the academy. Much focus has been dedicated to problems of inequality—of status, of wage, of job security, of resulting social mobility, and beyond—mainly between tenured faculty and the growing precariat of contract teaching labour. The centrality of published research is often identified as a key culprit in entrenching this inequality, and in this post I’ll explore the inequality of citations. The analytical approach is borrowed from Thomas Piketty’s Capital in the Twenty-First Century, his landmark work on economic inequality.
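One Piketty-style way to quantify this kind of inequality is a top-share statistic: what fraction of all citations goes to the top slice of papers. The sketch below is an illustration of that general approach, not the post's actual analysis, and the citation counts are invented.

```python
def top_share(citations, fraction=0.10):
    """Share of all citations received by the top `fraction` of papers,
    a Piketty-style concentration measure (e.g., 'top 10% share')."""
    ranked = sorted(citations, reverse=True)
    k = max(1, int(len(ranked) * fraction))  # at least one paper in the top slice
    total = sum(ranked)
    return sum(ranked[:k]) / total if total else 0.0

# Hypothetical corpus of 20 papers with a heavily skewed distribution.
cites = [120, 40, 15, 10, 8, 6, 5, 4, 3, 3, 2, 2, 1, 1, 1, 0, 0, 0, 0, 0]

print(f"Top 10% of papers hold {top_share(cites):.1%} of citations")
```

Here the top two papers (10% of the corpus) collect roughly 72% of all citations — the citation analogue of wealth concentrating at the top of the distribution.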
Bibliometrics
Team diversity widget: how do you measure up?
December 6, 2017
Collaboration and disciplinary diversity are hot topics in the science policy and research communities. At Science-Metrix, we've been working on multi-/inter-/trans-disciplinarity issues for a while now, and we figured that some of you might find it useful to have a tool you can use to take a quick measurement of the multidisciplinarity of your team. As part of our 15th anniversary celebrations, we've created a free widget that we’re sharing for just such a purpose. Any team can be measured—your research team in academia, a product team in your company, or even your Wednesday-night hockey team. In this post, we’ll explain what the disciplinarity widget does, how to interpret the measurements, and how you can use it yourself. We hope you enjoy the widget—a little birthday gift from us to you!
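The post doesn't describe the widget's internals here, so as a purely illustrative stand-in, here is one common way to score disciplinary diversity: normalized Shannon entropy over the team members' disciplines. The function and the example team are assumptions for illustration, not the widget's actual method.

```python
import math
from collections import Counter

def diversity_score(disciplines):
    """Normalized Shannon entropy of a team's discipline mix.
    0.0 = everyone shares one discipline; 1.0 = members spread evenly
    across all disciplines present."""
    counts = Counter(disciplines)
    if len(counts) < 2:
        return 0.0  # a single discipline carries no diversity
    n = sum(counts.values())
    entropy = -sum((c / n) * math.log(c / n) for c in counts.values())
    return entropy / math.log(len(counts))  # scale by the maximum possible entropy

# Hypothetical four-person team: two biologists, one computer scientist,
# one statistician.
team = ["biology", "biology", "computer science", "statistics"]
print(f"diversity = {diversity_score(team):.2f}")
```

A score near 1 means no single discipline dominates; note this simple version counts disciplines as unrelated categories, whereas richer interdisciplinarity measures also weight how cognitively distant the disciplines are from one another.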
Bibliometrics Science policy
Metrics: state of the alt
November 8, 2017
Discussions of research impact were for a long time limited to citation tracking, which estimates how much one piece of work influences subsequent research. However, with calls for research to have impact on the broader society—breaking out of the closed circle of research feeding yet more research—there’s a lot of interest in seeing how we might trace that impact pathway as it breaks through the membrane insulating the world of research. Altmetrics has held the promise of tracking just such traces. At STI 2017, several leading researchers on the topic gave valuable updates on the state of the art, and their estimation is that we should be seriously cooling it with all the hype. This post sums up the points that stuck out to me from their various presentations, and tries to outline my takeaway of what we should be learning from altmetrics.
Bibliometrics Science policy
The death of indicators
November 1, 2017
In last week’s post, I presented some of the major points of Rémi Barré’s keynote speech at STI 2017. In brief, he divides the evolution of S&T indicators into three phases. The first phase is one of indicators promising to elucidate (and thereby improve) the inner workings of science. The second phase is one of them being co-opted into the neoliberal turn, exposing scientific research to competitive pressures that Dr. Barré identifies as pushing science into multiple crises. The third phase is a casting off of the complicity with neoliberalism, using indicators to start opening up discussions about science & technology rather than shutting them down. In this post I’ll expand on this third phase.
Bibliometrics Science policy
Indicating a neoliberal tendency
October 25, 2017
Continuing on from my previous discussion of impact, the second keynote speech at the 2017 Science & Technology Indicators (STI) conference in Paris was given by Rémi Barré of IFRIS, who echoed many of the points raised by Ismael Rafols. Barré’s call to action—riffing on a very traditional theme—was, “Les indicateurs sont morts! Vive les indicateurs!” Indicators are dead! Long live indicators! The call was provocative, and his talk highlighted some interesting ways in which the struggles we face in research & innovation management are symptoms of a broad and powerful trend in the political sphere: neoliberalism.
Bibliometrics Higher education Science policy
Research impact now!
September 21, 2017
In my previous post, I laid out some history of research assessment and measurement, all so that in this post I could explore research impact assessment, which was a major topic of discussion at the 2017 Science & Technology Indicators (STI) conference in Paris. In this post, I’ll summarize the major lines of discussion I encountered at STI, use the history from the last post as a basis for diagnosing the underlying challenges, and perhaps even hint at some avenues to resolve these tensions.