Posts Tagged: methodology

Bibliometrics
Rationalizing the extremes: introducing the citation distribution index
May 10, 2018
The distribution of citations across the scientific literature is in many respects similar to the distribution of wealth in the Western world: a handful of articles receive most of the citations, while most articles receive few or no citations at all. The distribution of citations is thus highly skewed and not well represented by its average (the so-called “Bill Gates effect” in the wealth distribution analogy). When the average is computed for entities that publish only a small number of articles, a few highly cited articles can be enough to propel those entities into research superstardom. In this post, we’ll look at an alternative metric Science-Metrix has developed to address the limitations of averages, as well as some of the other metrics we explored along the way.
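To make the “Bill Gates effect” concrete, here is a minimal Python sketch using invented citation counts (not data from the post, and not Science-Metrix’s actual citation distribution index) showing how a single blockbuster article can drag the average far away from what a typical article in a small portfolio receives:

```python
# Illustrative only: invented citation counts for a small publication
# portfolio; not Science-Metrix's actual citation distribution index.
from statistics import mean, median

citations = [0, 0, 0, 1, 1, 2, 3, 250]  # one blockbuster paper among eight

print(f"mean citations:   {mean(citations):.1f}")    # 32.1, inflated by the single outlier
print(f"median citations: {median(citations):.1f}")  # 1.0, closer to the "typical" article
```

The mean and the median tell very different stories here, which is exactly the limitation of averages the post sets out to address.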
Bibliometrics
Positional analysis: from boring tables to sweet visuals
March 28, 2018
At Science-Metrix we are obviously very focused on data—sweet, sweet data! We are also very aware that pages and pages of bibliometric data and analysis can be overwhelming, and that more intuitive data presentation helps our clients better understand their study results, which in turn helps them act on the findings we return to them. One graphic presentation we find particularly helpful is the positional analysis chart. Positional analysis is a way to visually depict two, three or even more indicators for a given set of entities instead of using a standard (and boring) table. Click through to the post to see how it works.
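As a rough illustration of the idea (not the actual chart format used in the post), a positional analysis chart can be sketched in a few lines of Python with matplotlib: two indicators on the axes, a third encoded as bubble size, and the entity names as labels. All entities and values below are made up.

```python
# A minimal sketch of a positional analysis chart: two indicators on the
# axes and a third encoded as bubble size. Entities and values are
# invented for illustration; this is not a Science-Metrix chart template.
import matplotlib.pyplot as plt

entities = ["Country A", "Country B", "Country C", "Country D"]
specialization = [0.8, 1.2, 1.0, 1.5]   # e.g., specialization index (1 = world level)
impact = [0.9, 1.4, 1.1, 0.7]           # e.g., relative citation impact
output = [1200, 300, 800, 150]          # e.g., number of publications (bubble size)

fig, ax = plt.subplots()
ax.scatter(specialization, impact, s=[n / 2 for n in output], alpha=0.5)
for x, y, label in zip(specialization, impact, entities):
    ax.annotate(label, (x, y))

ax.axhline(1.0, linestyle="--", linewidth=0.5)  # world-level reference lines
ax.axvline(1.0, linestyle="--", linewidth=0.5)
ax.set_xlabel("Specialization index")
ax.set_ylabel("Citation impact")
plt.show()
```

Each entity becomes a single point, so three indicators that would otherwise fill a table can be read at a glance.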
Evaluation
Contribution analysis: How did the program make a difference – or did it?
March 6, 2018
Contribution analysis is an evaluation method developed by John Mayne in the early 2000s to enable evaluators to produce rigorous impact analyses for programs that cannot be evaluated using an experimental or quasi-experimental design. While the Science-Metrix Evaluation team has been conducting contribution analyses informally for a while now, the workshop given by Thomas Delahais (of Quadrant Conseil) at the SQEP annual conference in October 2017 inspired us to use the technique in a more rigorous, systematic and comprehensive way. The method struck us as perfectly suited to the kinds of evaluations we, and probably many of you, carry out—that is, evaluations of complex, multifaceted programs for which counterfactuals cannot easily be used. In this post, I’ll give a brief synopsis of the method and provide suggestions for further reading.
Evaluation
Maximizing the use of evaluation findings
February 21, 2018
In a 2006 survey of 1,140 American Evaluation Association members, 68% reported that their evaluation results were not used. This suggests a serious need for evaluation results to make it off the bookshelf and into the hands of intended audiences. This week’s blog post looks at how we, as evaluators, can help maximize the use of the evaluations we produce.
Evaluation
Program evaluation basics
February 14, 2018
The ScienceMetrics.org blog has so far focused on our scientometric, data mining and science policy activities, but we also have a long history of conducting evaluations of S&T-related programs and initiatives. In my opinion, the most fun to be had on the job is when we team up to combine our quantitative and qualitative analysis skills on a project. To kick off a series of posts from the evaluation side of Science-Metrix, in this post I’ll present an introductory Q&A on program evaluation, and next week I’ll discuss how to maximize the use of evaluation findings. Read on for the what, why, when and how of program evaluation.
Bibliometrics, Data mining, Open access, Science policy
2017: the best of ScienceMetrics.org
January 17, 2018
Over the past year, the ScienceMetrics.org blog has grown considerably. We really appreciate our growing group of readers and all the interesting discussions that the blog has sparked. In today’s post, we’ll take a quick look back at 2017, and give you a year-in-review from our side of the “Publish” button.
Bibliometrics
Prestige and inequality in the ivory tower
December 18, 2017
It’s no secret that we have an inequality problem within the hallowed walls of the academy. Much attention has been devoted to problems of inequality—of status, of wages, of job security, of social mobility, and beyond—mainly between tenured faculty and the growing precariat of contract teaching labour. The centrality of published research is often fingered as a culprit in entrenching this inequality, and in this post I’ll explore the inequality of citations themselves. The analytical approach is borrowed from Thomas Piketty’s Capital in the Twenty-First Century, his landmark work on economic inequality.
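For a sense of what a Piketty-style reading of citation data can look like, here is a hypothetical Python sketch that computes the share of all citations captured by the most-cited papers. The citation counts are invented, and the post’s actual methodology may differ.

```python
# Hypothetical illustration of a Piketty-style concentration measure applied
# to citations; the counts are invented and the post's method may differ.
def top_share(citations, fraction):
    """Share of total citations captured by the top `fraction` of papers."""
    ranked = sorted(citations, reverse=True)
    k = max(1, int(len(ranked) * fraction))
    return sum(ranked[:k]) / sum(ranked)

sample = [0, 0, 0, 0, 1, 1, 2, 2, 3, 5, 8, 12, 40, 75, 310]
print(f"top 10% of papers hold {top_share(sample, 0.10):.0%} of all citations")
print(f"top 50% of papers hold {top_share(sample, 0.50):.0%} of all citations")
```

In this toy sample the top decile of papers accounts for roughly two thirds of all citations, the kind of concentration figure the post examines for real publication data.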
Bibliometrics, Science policy
Metrics: state of the alt
November 8, 2017
Discussions of research impact were for a long time limited to citation tracking, which estimates how much one piece of work influences subsequent explorations. However, with calls for research to have an impact on broader society—breaking out of the closed circle of research feeding yet more research—there’s a lot of interest in seeing how we might trace that impact pathway as it breaks through the membrane insulating the world of research. Altmetrics has held the promise of tracking just such traces. At STI 2017, several leading researchers on the topic gave valuable updates on the state of the art, and their assessment is that we should be seriously cooling it with all the hype. This post sums up the points that stuck out to me from their various presentations, and tries to outline my takeaway of what we should be learning from altmetrics.
Higher education, Science policy
Policy: whose problem is it anyway?
March 14, 2017
In January, Sir Peter Gluckman—Chief Science Advisor to the PM of New Zealand, and global point man for science advice to government—gave the inaugural address at the Canadian Science Policy Centre lecture series. The discussion covered a lot of important points of difficulty for science and governance—and science in governance—that are emerging in the 21st […]
Science policy
Capturing imaginations, not wallets and podiums
February 28, 2017
The notion of capture—when one group in a partnership is allowed “home-field advantage”—is helpful in understanding some hurdles to successful collaboration across disciplinary and sectoral boundaries. Last week, I outlined how sectoral capture undermines the very notion of transdisciplinary research. In this week’s installment of the capture series, I’ll talk about how sectoral capture is […]