Budgets, timelines and expectations: The influence of project novelty
August 30, 2017

In the previous post in our ongoing series on data mining for policy, I touched on the importance of organizational context for successful management of these projects. In today’s installment, we dig into novelty and the pressure it exerts on the budgets, timelines and expectations of the project. Novelty is an interesting beast. Alongside the excitement and exhilaration of the new comes the fear caused by venturing down unknown roads, as well as the frustration that comes from hitting a few dead ends you can’t avoid because you don’t already know the way. Below the fold, I’ll discuss how to manage novelty effectively in data mining projects for policy.

From a project management standpoint, unchecked novelty can be a frightening element that risks derailing plans and sending budgets up in smoke. Project expectations should be adjusted according to the level of novelty, which can be assessed by asking the following questions: Have similar projects of this size/scope/nature been carried out in my company/context (or one that is sufficiently similar)? If so, how successful have these been? What were the key success/failure factors, and do these same enablers/constraints exist today?

As with several points raised in the previous post, organizational context is a key factor here. One of the challenges of working in a consulting firm is that we often draw methodological inspiration from peer-reviewed research papers that give no sense of the time (in person-days) or the budget that was required to carry out the work. Such constraints would make it very challenging for us to ever implement the type of study described, for example, in this interview with Einav, as documented by Taylor, Schroeder & Meyer (p. 7):

So we kind of came to [eBay auction data] not having a particular idea of what exactly we want to do. We just wanted to formulate reasonable questions that could kind of leverage the idea that you have the Big Data rather than some sort of a smallish portion of it. So initially we were basically for six months just playing with the data, trying to understand, you know, what we could do with it and what could be interesting.

One of the most important aspects to consider when dealing with a novel data source and/or analysis method—and another key point that arose from our first expert workshop—is to know whether failure is an option in the context at hand. As is the case with individual people, some clients and project settings are less tolerant of risk than others. In some cases, what truly matters is addressing the operational need at hand; in other cases, a client may be hoping for a solution but will be happy to see progress on the development of a new analysis technique, or a feasibility assessment of a novel data source, even if the full solution is still some way off.


We can define failure as the inability to deliver a solution that meets the user’s expectations (noting that solutions can partially meet expectations, and that expectations can shift). In traditional project management contexts, the failure of a project can usually be chalked up to a failure of planning or a failure of execution. When working in a situation of high uncertainty—which data mining certainly is—that relationship no longer holds. For instance, if something that could not reasonably have been foreseen derails your project, then your planning cannot reasonably be termed a failure. You acted sensibly on the information available at the time. Planning and execution were sound, and yet the project may fail.

When a project looks like it is heading towards failure, it exerts pressure on the nexus of timelines and budgets for delivery and the expectations of the solution delivered. It’s important to understand your client’s tolerance for failure (and the drivers of their level of tolerance). If failure is not an option, then a different part of the system must bend—namely, the budget, timeline or expectations—until “success” is attainable. You can spend more time trying to solve the problem (assuming it’s solvable, which is not always the case); you can spend more money to overcome some barrier you’ve encountered (assuming you can pay your way out of it, which isn’t always possible); or you can realign your expectations (which may not be acceptable to your client or other stakeholders). Some would argue, perhaps correctly, that this is a workaround to failure, and depending on the context that may indeed be the case. The key, then, is to understand how success is defined and where there is room to adjust so that the project outcome is acceptable to all parties involved.


The inherent novelty (and therefore uncertainty and risk) within data mining projects is what led us to refine the framework to include a scoping phase, whereby data access constraints could be explored—and expectations adjusted—prior to embarking on a full-fledged study. The value of the scoping phase, in project management terms, is to reduce the uncertainty and risk surrounding a data mining project. As noted above, uncertainty complicates planning, so an initial investment in a scoping phase is an effective route to better planning. With a better understanding of the data available, the data preparation required and the analyses to be undertaken, a more informed decision can be made about how much the project will cost, how long it will take, and ultimately whether these investments are worth the solution that the project would provide.


Read the next post in this series.


Science-Metrix’s final report for this data mining project is available from the Publications Office of the European Union.

Data Mining. Knowledge and technology flows in priority domains within the private sector and between the public and private sectors. (2017). Prepared by Science-Metrix for the European Commission. ISBN 978-92-79-68029-8; DOI 10.2777/089


Note: All views expressed are those of the individual author and are not necessarily those of Science-Metrix or 1science.


About the author

Chantale Tippett

Chantale Tippett is a Senior Analyst at Science-Metrix. She has been instrumental in the project management of several large-scale contracts the company has conducted for the European Commission in recent years, including She Figures 2015, Data Gathering and Information for the 2016 European Research Area Monitoring, and Data Mining for the Development, Monitoring and Evaluation of Research & Innovation Policies. Ms. Tippett holds an MSc in Public Health Nutrition from the London School of Hygiene and Tropical Medicine and a BSc (with Distinction) in Nutrition and Food Science from the University of Alberta in Edmonton, Canada. She has also completed the Comprehensive Project Management course at McGill University in Montreal and is currently working towards obtaining the Project Management Institute’s Project Management Professional designation.

Related items

Impact assessment stories: decisions, decisions

Budget 2018: The Evidence Budget

Budget 2018: the fundamental question of research funding
