
Impact is not just coming, it is already here. Rant and rage, it’s a fact of academic life. In my view that’s (mostly) not such a bad thing. As a researcher I want my work to reach broad audiences, though I recognise that attempting to measure the impact of research is a fraught exercise with potential perverse effects. Our physicist friends tell us that observing a phenomenon changes it, and even “quality metrics” are pretty bad at capturing the kinds of value produced by culture. Scholarly work has intrinsic merit, and its “real world” effects or applications can take decades to manifest.

Yet like it or not, impact assessment is upon us. This means that historians need to spend a bit of time getting their head around what it means and how it might shape their work. And to my mind, there’s good political and disciplinary reasons for doing so.

The audiences for scholarly work

Whether they’ve thought about it in these terms or not, academic historians have long seen their work as having an impact. Mostly the audience for this impact has been other scholars within the discipline of history. Historians have sought to shape what their colleagues think and teach and write, how they understand the past and go about trying to understand it, and this form of disciplinary impact has been measured through mechanisms such as publications, reviews, referees’ reports, professional memberships and, more latterly, grants and citations. Often historians have also sought to influence scholarship in other disciplines, and the methods for tracking this interdisciplinary impact have been similar.

But historians have also long looked beyond academic audiences to different kinds of publics. From the nineteenth-century German prophets of the nation-state and the early twentieth-century propagandists of empire and war, to those who have more recently advised on the writing of school curricula and spoken into public debates about the past and its legacies (not least our own history wars), historians have not been shy in seizing their role as secular priests and looking to influence the social and political world beyond the academy.*

What kind of impact are we talking about now?

There are three important things to remember about the new version of impact. The first is that its focus is very much on the last of my three traditional audiences – the world beyond academia. While impact on scholarly audiences is important (indeed, it’s essential), it is not sufficient in itself to meet the new criteria. The definition of research impact put out by the Australian Research Council (ARC) makes this clear. Impact is:

‘the demonstrable contribution that research makes to the economy, society, culture, national security, public policy or services, health, the environment, or quality of life, beyond contributions to academia.’

The second and third things to know about impact are encapsulated within the phrase demonstrable contribution. What is a demonstrable contribution? The definition of impact in the UK’s 2014 REF (which is a bit like Australia’s ERA) helps to concentrate the mind. There impact is defined as ‘an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia’. The focus here is again on the world beyond the university, but the importance of change (as against the Australian ‘contribution’) is more explicitly spelt out.

How did a piece of scholarly work alter something (the presumption is, for the better) out there in the world beyond campus? What was different as a consequence of the research? Answering these questions in turn requires thinking about ways such change might be enabled, evidenced and – yes – measured.

So to recap, what we are talking about here is:

  1. audiences beyond the university
  2. change in behaviour, concepts, practice
  3. ways to measure and demonstrate that change

Engagement: pathways to impact

Which brings me to the difference between impact and engagement. Simply put, it is pretty hard to have impact if you don’t have engagement, but engagement is not impact. Engagement is all the stuff you do to try and communicate your work to different kinds of audiences. The two kinds of scholarly impact I detailed above (within the discipline and into other disciplines) fall into this bucket. So too does engaging with social media (see ‘The scholar and the social media’, elsewhere on this blog), writing policy briefings, teaching, talking to the media, industry collaboration and establishing new patents. These forms of engagement are part of what is called the pathway to impact. They’re vitally important in that you don’t usually get impact without them, but they are not demonstrable change.

The ARC has published a Research Impact Pathway Table that lays out what it calls the ‘research pipeline’ and indicates where the different aspects of research would normally sit along this pathway. I have given it a few modifications (as I see them) to highlight the difference between engagement and impact:

[Table: the ARC’s Research Impact Pathway, with my modifications highlighting the difference between engagement and impact]

Researchers can do a lot to build pathways to impact (RCUK has a helpful guide for how to spell this out in a grant application) and I’ve got a whole set of brewing ideas on what this means for historians (subscribe to get a notification of that post). But as far as impact goes, the key thing to focus on is demonstrable change.

Measuring impact: metrics, #altmetrics and case studies

Which brings us to measurement. How do you demonstrate change? At one end of the spectrum there are those who advocate metrics such as citations, impact factors and commercialisation (which are acknowledged to favour the sciences), and at the other end of the spectrum are those who champion case studies (which are thought to be more inclusive). In the 2014 REF the UK went for case studies, and that looks to be the way Australia is going to go, at least initially (although it seems the UK might now be revising its approach). Somewhere in the middle (perhaps) sit #altmetrics, which attempt to give a quantitative measure of the quality and quantity of attention that a scholarly article has received, mostly through the social web (i.e. Twitter, Facebook, Academia.edu etc.).
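To make the #altmetrics idea a little more concrete, here is a minimal sketch of what pulling the attention data for a single article can look like, using the free public API that Altmetric provides. The DOI below is a placeholder, and the exact field names in the response should be checked against Altmetric’s current documentation rather than taken from me.

```python
# Minimal sketch: fetch #altmetrics for one article from the free Altmetric
# public API. The DOI is a placeholder and the response field names shown
# here are indicative only - check the current API documentation.
import json
import urllib.error
import urllib.request

DOI = "10.1000/example.doi"  # hypothetical DOI - substitute your own article
url = f"https://api.altmetric.com/v1/doi/{DOI}"

try:
    with urllib.request.urlopen(url) as response:
        data = json.load(response)
    print("Altmetric attention score:", data.get("score"))
    print("Tweets:", data.get("cited_by_tweeters_count"))
    print("News stories:", data.get("cited_by_msm_count"))
except urllib.error.HTTPError as err:
    # A 404 usually just means no online attention has been tracked for this DOI.
    print(f"No Altmetric record found for {DOI} (HTTP {err.code})")
```

Nothing here constitutes impact in the ARC’s sense, of course: it is attention data, part of the engagement picture rather than demonstrable change.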

It is worth the historian’s while to spend a bit of time working out the different tools for both bibliometrics and #altmetrics (subscribe to get notified when I post a metrics explainer), and not only because your DVCR and your appointment and promotion processes will soon require this, if they don’t already. It’s also important for historians to engage with research metrics because metrics are part of the politics of knowledge in the 21st century. They are part of the way our disciplinary practices are being governed and commercialised. Platforms for impact measurement promise big biccies to publishing giants (among others) who are busy enclosing the academic commons and turning our research into their money while we sleep. Social and political approaches to data are more important now than ever, and if historians and humanists don’t engage in these processes we cede this territory to those who come with very different agendas. (Laboratory Adelaide is a great example of critical thinking about cultural value in the world of impact.)

Planning for data collection

But I digress. Before we get lost in the politics of measurement I want to take us back to the notion of change itself, which should be a bread-and-butter concept for historians. Change is about time. It’s about processes of alteration and how something becomes different. In short, if change means change (!), you need to be able to show how your audiences (intended and unintended) thought / felt / believed / behaved / worked / delivered their product before they encountered your work, and how they [insert relevant verb here] differently as a consequence of engaging with your work. Cause(s) and effect(s) – or affect(s) (sorry, couldn’t resist the lame history joke).

Again the ARC gives us some massive pointers (which you’d hope it would, really). What they boil down to is this: plan for data collection as part of your engagement activities. This means building it into your project design. Think about what you want to track, work with any industry partners on ways you might collect data, establish a baseline at the start of your project and then assess things again at the end. Numbers can be part of this, but they are by no means the whole of it.

For example, if you are working on the history of fashion, using archival materials held in the State Library, and you are planning on publishing some scholarly articles and a monograph, also plan for engagement activities. These might include an exhibition at the Powerhouse or the State Library, a series of blog posts or photo essays, or articles for Vogue magazine. But they might also include activities that move beyond broadcast (the notion that history is done inside the academy and communicated to outside audiences who listen) and towards collaboration (which sees different kinds of audiences as participants in the project of knowledge creation).

Then think about change as part of these engagement activities. In the case of the fashion exhibition, speak with the museum about how items in the collection have previously been used, and at the end of the project talk to them again. Co-run workshops with curators and artists. Track visitor numbers. Promote the exhibition and your research via different forms of media and track visitor numbers again. Build forms of participation (e.g. dressing up and taking a selfie, or writing a postcard protesting sweatshops) into the exhibition and collate them. Use creative ways of surveying attendees about what they learnt from the exhibition, how it made them think about the work that dress does in their own lives, and how they might dress differently after their visit. (And then, if you are really keen, use your data to write a scholarly article about research measurement!)
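To give a (purely hypothetical) sense of what the collated data might look like at the end of such a project, here is a toy sketch comparing baseline and end-of-project indicators for the imagined exhibition. Every indicator name and number is invented for illustration; the point is simply the before-and-after comparison.

```python
# Toy sketch: compare baseline and end-of-project indicators for the imagined
# fashion exhibition. All indicator names and figures are invented.
baseline = {
    "monthly_visitors": 1200,       # before the exhibition opened
    "collection_item_requests": 3,  # prior uses of the archival items
    "survey_mean_awareness": 2.1,   # e.g. 1-5 self-rated awareness of dress histories
}
end_of_project = {
    "monthly_visitors": 3400,
    "collection_item_requests": 11,
    "survey_mean_awareness": 3.8,
}

for indicator, before in baseline.items():
    after = end_of_project[indicator]
    print(f"{indicator}: {before} -> {after} (change: {after - before:+g})")
```

None of this replaces the qualitative evidence (workshop reflections, survey free-text, media coverage), but it is exactly the kind of simple before-and-after record that makes a later impact case study much easier to write.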

Some of these examples might sound flippant and of little consequence (and there are plenty of other options), but my point here is that building data collection into engagement is a way of measuring impact.


Early career researchers (ECRs) are perhaps advantaged over their more established colleagues in this new world of impact and engagement, precisely because they can build such approaches into their academic practice from day one.** This, in turn, has disciplinary implications in that it has the potential to force academic historians to rethink the boundaries of our fields, to reevaluate how we locate authority and knowledge, to reconsider the kinds of sources we use, and quite possibly to write new kinds of academic histories as a consequence.

As scholars we are often occupied in the production of abstract knowledge about our subjects and for various audiences. If we let it, impact has the potential to push us into making relationships with those subjects and audiences rather than writing about or talking to them. That’s challenging, but it’s also exciting. And it might just be a way of forging a new kind of trust compact between the public and the university in a world in which the question “how do you know?” is no longer answered by deference to old knowledge institutions and their credentials.

***

* Research Councils UK has a broader definition of research impact that takes account of these different audiences. For them, research impact can mean academic impact (shifting understanding and advancing scientific method, theory and application across and within disciplines) and/or economic and societal impact (contribution to society and the economy, and benefits to individuals, organisations and/or nations).

** If you’ve ever been frustrated at journal rankings that discourage scholars from publishing in local or specialised forums, then impact is potentially a way of showing that those platforms matter because they communicate scholarly work to engaged audiences and lead to demonstrable change in specific communities.


This is part of an ad-hoc series of posts on engagement and impact for the humanities. If you have found these posts helpful or would like to hear about other topics, please let me know and by all means share them on the interwebs.
