A few takeaways which have stuck with me from the PRAXIS workshop on heritage researchers and their engagement / potential engagement with policy-making:
We shouldn’t overthink it / expertise counts.
As academics, we are used to couching arguments within a hypothesised context, which can lead to writing that is less than definite in its views. In developing evidence which may be used for policy, whilst data and argument do matter, the format and amount of back-up required for arguments is not the same as in a typical peer-reviewed paper open to close scrutiny. I am not saying that evidence should be lightweight, but equally, in our approach to engagement, we shouldn't overthink things before submitting. Likewise, academics do have expertise, and can be recognised as such due to their experience and positions – we should not be over-concerned about setting out our credentials to show that our views are valid.
The quality of input accepted as evidence isn't that high.
Within the workshop, an interesting exercise was undertaken where we worked in groups critiquing materials that had already been submitted and accepted as evidence by the House of Lords enquiry we were using as a case study. The materials ranged from simple statements of fact through to closely argued position statements. Not all pieces of evidence stuck to the brief for the call for evidence, and some pieces would not have passed the 'accept as first-year undergraduate coursework' test. The bar for what is considered suitable evidence is not the same as for a journal article or what might be acceptable in an academic setting – however, this doesn't mean it isn't used in high-level situations, and therefore, as academics, we could very clearly provide considerable benefit by contributing (given the quality of what else might be considered).
It can be challenging to think about potential impact in the arts and humanities space.
There was broad recognition that within the arts and humanities space, engagement with policy was not as widespread as in other disciplines; the benefits (for all parties) were not well recognised; the potential contribution was not realised; and the thinking process may be more difficult for researchers who are used to approaching their subject in a different way (depending on the background of the researcher). Findings or outputs from arts and humanities research might need considerable repurposing to make them suitable for input into policy, and, as one senior academic in the room neatly put it, their brain has to work in two different modes of thinking depending on whether they are using material and knowledge for research or for evidence production and submission. This is not necessarily easy, as the mindsets are different.
Provide killer facts and do some packaging of solutions.
Evidence that gets picked up, quoted and used within discussions often takes the form of a 'killer fact' or turn of phrase which suits the arguments being made in a sometimes political setting. Blinding policy-makers or committees with data might be what we want to do, to ensure people have the right facts to back up arguments, but sometimes it can be too much. Statistics and data need simple explanation for the everyday reader, who may not be an expert in the academic's specialised field. Where solutions can be provided for a problem, inference and obfuscation can sometimes hide the intended meaning (especially in the language of academic papers) – a bit of packaging of solutions is therefore worthwhile and appreciated.
Don’t create work for those reading.
Allied to much of what has been said already, and perhaps obvious – the reader has to be able to understand what is being said. Inference, nuanced arguments and theoretical frameworks, whilst important, do not necessarily help a reader who is looking for clarity. Think of the abstract rather than the full paper as the equivalent. Particularly for Parliamentary enquiries and evidence gathering, simplicity of language is really important, as the volume of evidence submitted from across the board may be substantial.
Relate to the brief.
We pick our students up on this – but looking at what had been asked for and what was submitted, in a number of cases the submission really wasn't entirely relevant. (This may have been deliberate in some cases where a different point was being put across, but it doesn't necessarily help those analysing the evidence.)
Potentially great impact from arts and humanities researchers.
The interest in the wider role of culture, society, creativity and the comparators that can be brought from different societies, viewpoints, locations, and points in time has the potential to add great depth to policy consideration and the analysis of evidence. We should get more involved in submitting evidence than we do currently.
Academic incentives don’t always relate to policy (impact is often thought of differently).
There was much reflection on the incentives for getting involved in policy work – recognising that the structures in which many of us work rate research, teaching and particular forms of impact metric in such a way that the case for investing in policy engagement may not be strong. Even with the impact agenda expanding, there was talk about the way in which impact has to be evaluated and demonstrated, meaning that policy engagement along the way, without a clearly visible outcome or change as a result of the input, can prove problematic to evidence.