Posts Tagged: indicator development

Bibliometrics
Team diversity widget: how do you measure up?
December 6, 2017
Collaboration and disciplinary diversity are hot topics in the science policy and research communities. At Science-Metrix, we've been working on multi-/inter-/trans-disciplinarity issues for a while now, and we figured that some of you might find it useful to have a tool you can use to take a quick measurement of the multidisciplinarity of your team. As part of our 15th anniversary celebrations, we've created a free widget that we’re sharing for just such a purpose. Any team can be measured—your research team in academia, a product team in your company, or even your Wednesday-night hockey team. In this post, we’ll explain what the disciplinarity widget does, how to interpret the measurements, and how you can use it yourself. We hope you enjoy the widget—a little birthday gift from us to you!
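The teaser doesn't spell out the math behind the widget, but as a rough illustration of what such a measurement can look like, one standard way to quantify disciplinary diversity in the bibliometrics literature is a normalized Shannon entropy over the shares of each discipline on a team. Below is a minimal Python sketch of that idea; the `diversity_index` function, the discipline labels and the normalization choice are illustrative assumptions for this sketch, not the widget's actual method.

```python
from collections import Counter
from math import log

def diversity_index(disciplines):
    """Normalized Shannon entropy of a team's discipline shares.

    Returns 0.0 when every member shares one discipline and 1.0 when
    members are spread evenly across the distinct disciplines present.
    (Illustrative only; not necessarily what the Science-Metrix
    widget computes under the hood.)
    """
    counts = Counter(disciplines)      # members per discipline
    if len(counts) < 2:
        return 0.0                     # a one-discipline team has no diversity
    n = len(disciplines)
    shares = [c / n for c in counts.values()]
    entropy = -sum(p * log(p) for p in shares)
    return entropy / log(len(counts))  # divide by the maximum possible entropy

# Example: a five-person team drawn from three disciplines
team = ["economics", "biology", "biology", "computer science", "economics"]
print(round(diversity_index(team), 2))  # 0.96, i.e., close to evenly spread
```

A team concentrated in one discipline scores near 0, while a team spread evenly across several scores near 1. Richer measures such as Rao-Stirling diversity also weight how far apart the disciplines are; plain entropy keeps the sketch short.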
Bibliometrics, Science policy
Metrics: state of the alt
November 8, 2017
Discussions of research impact were for a long time limited to citation tracking, which estimates how much one piece of work influences subsequent explorations. However, with calls for research to have an impact on broader society—breaking out of the closed circle of research feeding yet more research—there’s a lot of interest in seeing how we might trace that impact pathway as it breaks through the membrane insulating the world of research. Altmetrics has held the promise of tracking just such traces. At STI 2017, several leading researchers on the topic gave valuable updates on the state of the art, and their assessment is that we should be seriously cooling it with all the hype. This post sums up the points that stuck out to me from their various presentations, and outlines my takeaways on what we should be learning from altmetrics.
Bibliometrics, Science policy
The death of indicators
November 1, 2017
In last week’s post, I presented some of the major points of Rémi Barré’s keynote speech at STI 2017. In brief, he divides the evolution of S&T indicators into three phases. In the first, indicators promised to elucidate (and thereby improve) the inner workings of science. In the second, they were co-opted into the neoliberal turn, exposing scientific research to competitive pressures that Dr. Barré identifies as pushing science into multiple crises. The third phase casts off that complicity, using indicators to open up discussions about science & technology rather than shut them down. In this post I’ll expand on this third phase.
Bibliometrics, Science policy
Indicating a neoliberal tendency
October 25, 2017
Continuing on from my previous discussion of impact, this post turns to the second keynote speech at the 2017 Science & Technology Indicators (STI) conference in Paris, given by Rémi Barré of IFRIS, who echoed many of the points raised by Ismael Rafols. Barré’s call to action—riffing on a very traditional theme—was, “Les indicateurs sont morts! Vive les indicateurs!” Indicators are dead! Long live indicators! The call was provocative, and his talk highlighted some interesting ways in which the struggles we face in research & innovation management are symptoms of a broad and powerful trend in the political sphere: neoliberalism.
Bibliometrics, Higher education, Science policy
Research impact now!
September 21, 2017
In my previous post, I laid out some history of research assessment and measurement, all to set the stage for exploring research impact assessment, which was a major topic of discussion at the 2017 Science & Technology Indicators (STI) conference in Paris. In this post, I’ll summarize the major lines of discussion I encountered at STI, use that history as a basis for diagnosing the challenges underlying them, and perhaps even hint at some avenues for resolving these tensions.
Bibliometrics, Higher education, Science policy
A short history of research impact
September 14, 2017
During the 2017 Science & Technology Indicators (STI) conference in Paris, a number of discussions touched on impact assessment, a topic of growing interest within the research community. That researchers are increasingly aware of impact, impact pathways and impact assessment comes as no great shock, given that the research policy community is increasingly focusing on impact as the basis for funding decisions. The discussions at STI raised some substantive concerns about where the conversation on research impact is headed. In this post, I’ll lay out some relevant history (as I understand it) that contextualizes that conversation. In the next installment, I’ll summarize the points from STI 2017 that struck me as the most insightful (and provocative), drawing on the history laid out here to explore what I think these comments reflect about the underlying research system.
Data mining, Science policy
Data mining: Cross-boundary collaboration and innovation
September 6, 2017
In our data mining project for the European Commission, two of the six case studies treated levers for promoting innovation, and we’ll start to tease those apart here. In brief, collaboration across disciplinary and sectoral boundaries is believed to promote innovation, while innovation in turn is believed to support broader economic and social prosperity. Even […]
Data mining, Science policy
Data mining: The value of a scoping phase
August 16, 2017
In previous posts in our data mining series, we laid out our initial technical framework for guiding data mining projects, then supplemented that with plug-ins to facilitate its use for R&I policy research specifically. These plug-ins helped to overcome the challenge of applying a generic framework to a specific thematic area. However, there was another […]
Data mining, Science policy
Data mining: Technical framework plug-ins for the R&I context
August 9, 2017
In my previous post, I outlined the initial technical framework developed by Science-Metrix in the course of the data mining project for the European Commission documented in this blog series. This initial data mining framework—strongly inspired by existing frameworks—provided a solid foundation on which to build. However, to support data mining in a policy context […]
Data mining, Science policy
Data mining: The root of a technical framework
August 2, 2017
Continuing on in our series of posts on data mining for policymaking, this post presents the initial technical framework developed by Science-Metrix to guide the conduct of data mining projects in a government context (with some shout-outs to other contexts as well). This seven-step framework formed the basis of our case studies, and effectively lays […]