2017: the best of ScienceMetrics.org
January 17, 2018

Over the past year, the ScienceMetrics.org blog has grown considerably. We really appreciate our growing group of readers and all the interesting discussions that the blog has sparked. In today’s post, we’ll take a quick look back at 2017, and give you a year-in-review from our side of the “Publish” button.

Capturing your attention

For ScienceMetrics.org, 2017 was the Year of the Series, as we experimented with providing long-form content broken down into series of related posts. Our first experiment was the 3-part series we ran on different forms of capture, which met with a lot of interest. In case you haven’t seen those posts yet, the notion of capture applies in contexts of collaboration where the participants are meant to enter on an equal footing, but one group ends up having a “home-field advantage.” This problem crops up when we collaborate across boundaries between disciplines or between sectors. We also ran a post on the problem of capture at the science–policy nexus, where the discourse of scientists can sometimes come off sounding like a fear of losing funding or influence, when in fact what we should worry about losing is the benefit of science to society.

Data mining for policy

Seeing the strong response to this series, we decided to undertake our biggest endeavour of the year, and in fact our biggest endeavour ever: a series on data mining for policy. The hype around big data and artificial intelligence (for everything) continues to build, but the reality of implementing these techniques to inform operational decision-making is more complex than the simple gloss they often receive. Having just finished a landmark project for the European Commission examining this topic, we felt there were many valuable findings to share, and that a blog series breaking down the extensive reports from the Commission project would be a more accessible way for readers to engage with that content.

In addition to the lead-off post for the series, some of the more popular installments covered our technical framework to guide data mining for policy and the need for a scoping phase to manage risk. Our case studies also garnered a lot of interest, focusing on interdisciplinary and intersectoral collaboration to drive innovation, innovation to drive growth and prosperity, and the effectiveness of institutional policies to increase levels of open access to research.

Science & policy

Towards the end of the year, we also ran two short series on conferences, summarizing and responding to the keynote addresses delivered there. The series on Science & Technology Indicators (STI) 2017 in Paris looked at a simplified history of research impact as well as the current imperative for research to deliver societal impact. One main obstacle stood out: these imperatives seek to engage researchers with the broader society around them, but defining “broader impact” as simply anything outside academia re-emphasizes the very divide that the imperative seeks to bridge. Until we can define what impact is, rather than only what it isn’t, we won’t make significant headway in engaging society on its terms. This is an example of sectoral capture, as discussed above. The rest of the STI series looked at the use of indicators for governance, how that usage has supported a move towards neoliberalism and what we might do about it, as well as a summary piece on the state of altmetrics for measuring science.

The other short conference series covered the Canadian Science Policy Conference 2017 in Ottawa. The lineup there was quite impressive, with the federal Minister of Science, the Governor General and the newly appointed federal Chief Science Advisor all giving keynote speeches. The Governor General’s address stirred up quite a bit of controversy in the mainstream media, something of a rarity for a science policy address. We looked at different ways one might interpret her provocative speech and concluded that they all boil down to a deficit model of science advice, which the science policy community seems to agree is passé.

The Science Minister gave an update on progress towards fulfilling her mandate. Interestingly, she’s probably the only federal minister whose mandate is essentially complete, only halfway through the government’s term. Nevertheless, challenges remain, especially around the integration of different types of research and knowledge. There is also a wide funding gap, pointed out by the Naylor report released earlier in 2017, and while the messaging and rhetoric are positive, the dollars have yet to materialize.

Another major element of the federal government’s science landscape is evidence-based decision-making, a file delegated to the new Chief Science Advisor, Dr. Mona Nemer. Science must evolve in order to address increasingly complex policy topics, and the decision-making machinery of government must also evolve in order to integrate that knowledge. Our post on the topic explores some of the twists and turns that these twin paths of evolution must navigate.

We will continue to follow all of these issues—and more!—as ScienceMetrics.org continues to grow. Thanks to everyone who’s taken an interest in our blog: those who have encouraged us, those who have disagreed vigorously, and those who just drop by to enjoy a good read. We hope to see you again soon. Best wishes to everyone for 2018!


About the author

Brooke Struck

Brooke Struck is the Senior Policy Officer at Science-Metrix in Montreal, where he puts his background in philosophy of science to good use in helping policy types and technical types to understand each other a little better every day. He also takes gleeful pleasure in unearthing our shared but buried assumptions, and generally gadfly-ing everyone in his proximity. He is interested in policy for science as well as science for policy (i.e., evidence-based decision-making), and is progressively integrating himself into the development of new bibliometric indicators at Science-Metrix to address emerging policy priorities. Before working at Science-Metrix, Brooke worked for the Canadian Federal Government. He holds a PhD in philosophy from the University of Guelph and a BA with honours in philosophy from McGill University.

Related items

You may want to check out these items as well

Rationalizing the extremes: introducing the citation distribution index

1findr: discovery for the world of research

Positional analysis: from boring tables to sweet visuals

