Evaluation
Maximizing the use of evaluation findings
February 21, 2018

In a 2006 survey of 1,140 American Evaluation Association members, 68% reported that their evaluation results were not used. This suggests a serious need for evaluation results to make it off the bookshelf and into the hands of intended audiences. This week’s blog post looks at how we, as evaluators, can help maximize the use of the evaluations we produce.

Evaluation
Program evaluation basics
February 14, 2018

The ScienceMetrics.org blog has so far focused on our scientometric, data mining and science policy activities, but we also have a long history of conducting evaluations of S&T-related programs and initiatives. In my opinion, the most fun to be had on the job is when we team up to combine our quantitative and qualitative analysis skills on a project. To kick off a series of posts from the evaluation side of Science-Metrix, in this post I’ll present an introductory Q&A on program evaluation, and next week I’ll discuss how to maximize the use of evaluation findings. Read on for the what, why, when and how of program evaluation.

Science policy
Science advice in Canada: reborn again?
February 7, 2018

Science advice is apparently having a “moment” right now in Canada. Quebec has had a chief scientist since 2011, but both the federal government and the Ontario provincial government named chief science advisors in the second half of 2017. For the first time, at the end of January 2018, the three chief scientists appeared publicly together, on a panel organized by the Institute for Science, Society and Policy (ISSP) at the University of Ottawa. This attracted no small number of science policy nerds. The event was billed as a new renaissance for science advice, which raises the question of when exactly its prior incarnations occurred. In today’s post, we’ll present a summary of the discussions, along with some critical reflections.

Bibliometrics, Web of Science
Bibliometric fun facts
January 24, 2018

Bibliometrics is a complex and nuanced field, and at times, we’ll admit, it’s even a little arcane. At Science-Metrix we take great pride in explaining indicators to clients: what they mean, how they are calculated, their strengths and weaknesses, and what they can and cannot do. As with so many things, examples provide a good entry point. With that in mind, today I begin a new occasional series that heads back to basics to explain some common indicators and how we use them in house. In this first post, I’ll explore some “fun facts” that can be derived from a bibliometric database.

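To give a flavour of what such “fun facts” can look like, here is a minimal sketch in Python. The table below is invented and far simpler than a real bibliometric database; it only illustrates the general idea of pulling simple facts out of publication records.

```python
# Toy illustration only: this tiny table is invented, not the schema
# of an actual bibliometric database.
import pandas as pd

pubs = pd.DataFrame({
    "year":      [2015, 2015, 2016, 2016, 2017],
    "n_authors": [3, 7, 2, 11, 5],
    "country":   ["CA", "US", "CA", "FR", "US"],
})

# Fun fact: average team size per publication year
print(pubs.groupby("year")["n_authors"].mean())

# Fun fact: the country appearing most often in the records
print(pubs["country"].value_counts().idxmax())
```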

Bibliometrics, Data mining, Open access, Science policy
2017: the best of ScienceMetrics.org
January 17, 2018

Over the past year, the ScienceMetrics.org blog has grown considerably. We really appreciate our growing group of readers and all the interesting discussions that the blog has sparked. In today’s post, we’ll take a quick look back at 2017, and give you a year-in-review from our side of the “Publish” button.

Bibliometrics
Prestige and inequality in the ivory tower
December 18, 2017

It’s no secret that we have an inequality problem within the hallowed walls of the academy. Much attention has been devoted to problems of inequality—of status, of wage, of job security, of resulting social mobility, and beyond—mainly between tenured faculty and the growing precariat of contract teaching labour. The outsized importance of published research is often identified as a central culprit in entrenching this inequality, and in this post I’ll explore the inequality of citations via a citation distribution analysis. The analytical approach is borrowed from Thomas Piketty’s Capital in the Twenty-First Century, his landmark work on economic inequality.
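The post works through the full distribution analysis; as a minimal sketch of the underlying idea (the citation counts below are invented, and the real analysis is considerably more careful), a Piketty-style question amounts to asking what share of all citations accrues to the most-cited papers:

```python
# Sketch of a Piketty-style concentration measure on citation counts.
# The citation counts here are invented for illustration.
import numpy as np

citations = np.array([0, 0, 1, 1, 2, 3, 5, 8, 40, 140])

def top_share(cites, fraction=0.10):
    """Share of all citations captured by the top `fraction` of papers."""
    ranked = np.sort(cites)[::-1]                  # most-cited first
    k = max(1, int(round(fraction * len(ranked))))
    return ranked[:k].sum() / cites.sum()

print(f"Top 10% of papers hold {top_share(citations):.0%} of citations")
```

The same ranked array feeds directly into a Lorenz curve or a Gini coefficient, the other staples of this kind of inequality analysis.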

Bibliometrics, Science policy
Nobel laureates and the economic impact of research: a case study
December 15, 2017

In the course of another project, I recently ran some data on the publications of 37 laureates of the Nobel prizes in Medicine, Physics and Chemistry. The results raised eyebrows in the office: they showed that those laureates, recognized for the tremendous contribution their discoveries have made to humanity, have over the course of their careers produced knowledge that has been taken up in innovation—as measured by patent citations—more widely than the work of the average US or world scientist. While this was a “quick and dirty” case study, the results exemplify the great potential of the prizewinners’ work for producing economic returns to society.
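For readers curious about the shape of such a comparison, here is a minimal sketch with invented numbers. The actual case study uses real patent-citation data and a properly constructed baseline that controls for field and publication year, neither of which is reproduced here.

```python
# Simplified illustration of the comparison described above:
# patent citations per paper for one group, relative to a baseline.
# All numbers are invented for the sake of the example.

laureate_patent_cites = [4, 0, 12, 7, 2, 9]   # patent citations per paper
baseline_rate = 1.5                            # assumed reference-group average

laureate_rate = sum(laureate_patent_cites) / len(laureate_patent_cites)
print(f"Laureates: {laureate_rate:.2f} patent citations per paper "
      f"({laureate_rate / baseline_rate:.1f}x the assumed baseline)")
```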

Bibliometrics
Team diversity widget: how do you measure up?
December 6, 2017

Collaboration and disciplinary diversity are hot topics in the science policy and research communities. At Science-Metrix, we’ve been working on multi-/inter-/trans-disciplinarity issues for a while now, and we figured that some of you might find it useful to have a tool for quickly measuring the multidisciplinarity of your team. As part of our 15th anniversary celebrations, we’ve created a free widget that we’re sharing for just such a purpose. Any team can be measured—your research team in academia, a product team in your company, or even your Wednesday-night hockey team. In this post, we’ll explain what the disciplinarity widget does, how to interpret the measurements, and how you can use it yourself. We hope you enjoy the widget—a little birthday gift from us to you!

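The post itself explains what the widget actually computes; as an illustrative stand-in only, one common way to quantify a team’s disciplinary mix is the Shannon diversity of its members’ disciplines, sketched below in Python. The widget may well use a different measure.

```python
# One possible disciplinarity score: Shannon diversity over the
# disciplines represented on a team. This is an illustrative stand-in,
# not necessarily the measure the Science-Metrix widget implements.
import math
from collections import Counter

def shannon_diversity(disciplines):
    """Shannon entropy of the discipline mix; 0 = monodisciplinary."""
    counts = Counter(disciplines)
    n = len(disciplines)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

team = ["physics", "physics", "biology", "economics", "sociology"]
print(f"Diversity score: {shannon_diversity(team):.2f}")
```

Shannon diversity rewards both the number of disciplines present and how evenly the team is spread across them, which matches the intuition behind most multidisciplinarity measures.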

Science policy
Canadian science: mandate update from Minister Duncan
November 29, 2017

Kirsty Duncan (the Canadian federal Minister of Science) gave a keynote address at the 9th annual Canadian Science Policy Conference in early November, during which she outlined the main priorities of her role and what she’s accomplished since being named to the post two years ago. In our ongoing coverage of the keynote speeches from CSPC, this post will summarize her talk and highlight some critical questions.

Science policy
The new face of the science–policy interface
November 21, 2017

The new Chief Science Advisor position is the top job at the science–policy interface in Canada. While attending the 9th Canadian Science Policy Conference in Ottawa earlier this month, the other conference-goers and I were lucky to get a glimpse of how Dr. Mona Nemer—newly named to the job—understands evidence-based decision-making. In this week’s post, I’ll give a summary of her remarks at the CSPC and distill the main views on evidence-based decision-making that they seem to reflect.