Conventional measures of the “impact” of research are not keeping pace with socially connected academics


A survey of how academics use social media to encourage people to interact with their research shows that much of the public value of their work is likely to be overlooked in official impact assessments.

The study analyzed more than 200 examples of how academics discuss their research on social media and encourage others to engage with it. Based on the usage patterns it uncovered, it suggests that the current approach to assessing the public impact of universities, enshrined in the Research Excellence Framework (REF), should be updated, because academics are far more socially connected today than they were when the model was developed.

The REF is the UK’s official system for measuring the quality of university research, and its results inform the distribution of research funding. The results of the latest round of assessments were released last week (May 12).

As part of the exercise, university departments are asked to demonstrate the impact of their work: effectively showing how it has enriched society. While the new study endorses the need for impact case studies, it questions how these will be assessed. It argues that a gap is opening up between how impact is measured in the REF and the true scope and range of academic engagement on social media platforms – some of which did not even exist when the framework was first developed.

In particular, the REF focuses on the extent to which the final results of completed research projects reach and are taken up by the public. In contrast, the study found that today’s academics are often engaged in continuous ‘feedback loops’ with organisations, community groups, political actors and other publics throughout the life of a project. These loops create opportunities to collaborate and share expertise while the research is still ongoing, often in ways that the REF is unlikely to capture.

The author of the study, Dr. Katy Jordan, of the University of Cambridge’s Faculty of Education, said: “Official language portrays impact as a top-down, outward flow from universities to a waiting public, but that’s an outdated characterization – if it was ever valid at all. Ask researchers about their most impactful social media interactions and you’ll get a much broader range of examples than the REF covers.”

“One could argue that this means that too many researchers misunderstand what impacts are; but it’s also potential evidence that times have changed. There is a lot to be said for asking universities to demonstrate their value to wider society, but it might be time to rethink how we measure this.”

The REF measures impact along two main dimensions: ‘significance’ (the meaningful difference a project makes) and ‘reach’ (the quantifiable extent of that difference). Beyond that, the definition of impact is left very open, varies between disciplines, and is often seen as ambiguous.

The study points out that the REF also offers somewhat confusing advice on public engagement, which it broadly encourages but does not, in itself, count as impact. The official guidelines state: “Public involvement in research does not count as impact. Impact is what happens when people interact with, pick up on, react to, or respond to research. Public engagement does not only happen when the research is completed.”

Jordan’s survey invited academics to provide examples of impacts they had achieved through social media. She received responses from 107 academics in 15 different countries, although most of the participants, who ranged from postgraduate researchers to established professors, were from the UK. The study analyzed 209 of the submitted examples.

Significantly, fewer than half of these concerned cases in which research results were disseminated ‘outwards’ to the public as finished products, as the REF assumes. In such cases, academics had typically used social platforms to share their findings with a larger audience, stimulate discussion with colleagues, or gather evidence of positive engagement with the research.

However, about 56 percent of the responses described impacts arising from exchanges that were not simply one-way. Specifically, participants had used social media to test research ideas, report interim results, gather information and data, or recruit research participants.

These exchanges appear to have produced impacts beyond those officially recognized. As a result, researchers were invited to give public lectures, take part in panel discussions, provide information and advice to organizations, or deliver training.

Crucially, these opportunities were not always focused on the research that prompted the initial interaction. In many cases, researchers who posted about their project were asked to share their broader expertise – often with advocacy groups or policymakers interested in learning more about their research in general. In one case, for example, a social media post prompted a senior Cabinet Office official to visit the researcher’s wider group of academic colleagues to explore how their work as a whole could influence and shape policy.

Jordan argues that social media blurs the distinction between impact and “public engagement.” When information flows into academic projects – from people, companies, and organizations submitting ideas, questions, and feedback through social platforms – it creates opportunities for external exchange, both formal and informal. This cycle of interaction appears to influence and benefit society in a variety of ways that are not tracked by the REF.

However, these more nuanced effects are difficult for reviewers to monitor or measure. “One solution could be to change the assessment so that universities are not only asked to provide evidence of research results, but to explain the research process throughout the life of a project,” Jordan said. “This is not a call for more ambiguity about what impact is, but for more open-mindedness about what researchers are achieving. In an increasingly complex, socially networked culture, this would help ensure that the broad impact of their work is not forgotten.”

– This press release was provided by the University of Cambridge
