Bibliometrics
Team diversity widget: how do you measure up?
December 6, 2017

Collaboration and disciplinary diversity are hot topics in the science policy and research communities. At Science-Metrix, we’ve been working on multi-/inter-/trans-disciplinarity issues for a while now, and we figured that some of you might find it useful to have a tool you can use to take a quick measurement of the multidisciplinarity of your team. As part of our 15th anniversary celebrations, we’ve created a free widget that we’re sharing for just such a purpose. Any team can be measured—your research team in academia, a product team in your company, or even your Wednesday-night hockey team. In this post, we’ll explain what the disciplinarity widget does, how to interpret the measurements, and how you can use it yourself. We hope you enjoy the widget—a little birthday gift from us to you!

(more…)
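The excerpt doesn't spell out how the widget computes its score, but a common way to quantify a team's disciplinary mix is the Shannon diversity index over the members' disciplines. The sketch below is our own illustration of that idea, not Science-Metrix's actual method.

```python
from collections import Counter
from math import log

def shannon_diversity(disciplines):
    """Shannon entropy of the discipline mix: 0 for a single-discipline
    team, and higher the more evenly members spread across disciplines."""
    counts = Counter(disciplines)
    n = sum(counts.values())
    return -sum((c / n) * log(c / n) for c in counts.values())

# A four-person team split evenly across two disciplines:
shannon_diversity(["physics", "physics", "sociology", "sociology"])  # ln(2) ≈ 0.693
```

A single-discipline hockey team would score 0; adding a goalie from a different field raises the score, which matches the intuition that diversity grows with both the number of disciplines and the evenness of the split.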

Science policy
Canadian Science: mandate update from Minister Duncan
November 29, 2017

Kirsty Duncan (the Canadian federal Minister of Science) gave a keynote address at the 9th annual Canadian Science Policy Conference in early November, during which she outlined the main priorities of her role and what she has accomplished since being named to the post two years ago. In our ongoing coverage of the keynote speeches from CSPC, this post will summarize her talk and highlight some critical questions. (more…)

Science policy
The new face of the science–policy interface
November 21, 2017

The new Chief Science Advisor position is the top job at the science–policy interface in Canada. While attending the 9th Canadian Science Policy Conference in Ottawa earlier this month, the other conference-goers and I were lucky to get a glimpse of how Dr. Mona Nemer—newly named to the job—understands evidence-based decision-making. In this week’s post, I’ll give a summary of her remarks at the CSPC and distill the main views on evidence-based decision-making that they seem to reflect. (more…)

Science policy
Is non-science non-sense?
November 15, 2017

At the beginning of November, I attended the Canadian Science Policy Conference, where one of the headline guest speakers was the new Governor General, the Right Honourable Julie Payette, herself a former astronaut. The Canadian science and science policy communities responded with predictable enthusiasm to the appointment of such a scientifically minded person to this emblematic role. Her Excellency’s speech really played to the home-town crowd, too, emphasizing that science is increasingly embraced in policymaking here in Canada, and calling for science to now be increasingly embraced in society at large as well, even to the point that science would become a matter of cocktail conversation. There was considerable controversy, though, about how Payette described the beliefs of those who have not yet been converted to our brand of discipleship, those beliefs that do not pass scientific muster. In today’s post, I’ll point out what I see as an underlying tension in her position and what a resolution might require. (more…)

Bibliometrics, Science policy
Metrics: state of the alt
November 8, 2017

Discussions of research impact were long limited to citation tracking, which estimates how much one piece of work influences subsequent explorations. However, with calls for research to have an impact on broader society—breaking out of the closed circle of research feeding yet more research—there’s a lot of interest in seeing how we might trace that impact pathway as it breaks through the membrane insulating the world of research. Altmetrics has held the promise of tracking just such traces. At STI 2017, several leading researchers on the topic gave valuable updates on the state of the art, and their estimation is that we should be seriously cooling it with all the hype. This post sums up the points that stuck out to me from their various presentations, and tries to outline my takeaway of what we should be learning from altmetrics. (more…)

Bibliometrics, Science policy
The death of indicators
November 1, 2017

In last week’s post, I presented some of the major points of Rémi Barré’s keynote speech at STI 2017. In brief, he divides the evolution of S&T indicators into three phases. The first phase is one of indicators promising to elucidate (and thereby improve) the inner workings of science. The second phase is one of them being co-opted into the neoliberal turn, exposing scientific research to competitive pressures that Dr. Barré identifies as pushing science into multiple crises. The third phase is a casting off of the complicity with neoliberalism, using indicators to start opening up discussions about science & technology rather than shutting them down. In this post I’ll expand on this third phase. (more…)

Bibliometrics, Science policy
Indicating a neoliberal tendency
October 25, 2017

Continuing on from my previous discussion of impact, the second keynote speech at the 2017 Science & Technology Indicators (STI) conference in Paris was given by Rémi Barré of IFRIS, who echoed many of the points raised by Ismael Rafols. Barré’s call to action—riffing on a very traditional theme—was, “Les indicateurs sont morts! Vive les indicateurs!” Indicators are dead! Long live indicators! The call was provocative, and his talk highlighted some interesting ways in which the struggles we face in research & innovation management are symptoms of a broad and powerful trend in the political sphere: neoliberalism. (more…)

Data mining
Using data readiness levels to address challenges in data mining projects
October 11, 2017

In a blog post from earlier this year, Neil Lawrence describes some challenges to data mining projects that are familiar to many working in the domain—our team definitely included! These challenges include the availability and quality of the data for the project. Data scientists are often handed detailed expectations about budgets and timelines but very little information at the outset about what data they will have to work with, making it difficult to determine whether a project’s outline is realistic. To begin addressing this problem, Lawrence lays out a general taxonomy of “data readiness levels,” which provides useful language to help us identify and ultimately overcome the challenges that currently hinder many data science projects. (more…)
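Lawrence's taxonomy groups data into three bands, and the vocabulary is easy to encode. The band descriptions below paraphrase his post; the `ready_for_detailed_planning` rule is a hypothetical illustration of our own, showing how the language might be used when scoping a project.

```python
from enum import Enum

class DataReadiness(Enum):
    """A rough encoding of Neil Lawrence's data readiness bands
    (descriptions paraphrased; see his original post for detail)."""
    BAND_C = "C"  # hearsay data: existence, access, and format unverified
    BAND_B = "B"  # data loaded and inspectable, but validity unassessed
    BAND_A = "A"  # data vetted and appropriate for a specific task

def ready_for_detailed_planning(band: DataReadiness) -> bool:
    """Hypothetical rule: detailed budgets and timelines are only
    realistic once data have at least reached Band B, i.e. have
    actually been loaded and looked at."""
    return band in (DataReadiness.BAND_B, DataReadiness.BAND_A)
```

Framed this way, the mismatch Lawrence identifies is that projects are often budgeted while the data sit at Band C, where no one can yet say what cleaning or access work the plan must absorb.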

Data mining, Science policy
Data mining: revisiting our definition
October 4, 2017

In our ongoing blog series on data mining for policy, we’ve been trying to synthesize a lot of information into short, bite-sized chunks for our audience. Invariably, well-intentioned as such efforts are, something valuable ends up on the cutting room floor. In this case, we were a bit too hasty in providing the definition of data mining itself, which prompted a follow-up question from one of our readers. Our initial definition was put together through a literature review and our earliest experiences with data mining, but revisiting that definition more recently has enabled us to uncover some further nuances that we hadn’t yet appreciated. (more…)

Data mining, Science policy
Data access: vast possibilities and inequities
September 27, 2017

In our ongoing series on data mining to inform policy, we are giving the topic of data access its own post because of its implications for the success or failure of our case studies. The simple reality is that you can’t mine data that don’t exist (or that may as well not exist because they are practically impossible to access). Access is particularly important because it underpins all the rest of the work in a data mining project. Let’s tease this topic out a little, shall we?

(more…)