These are the third and fourth entries in a 5-part series from Dialogue by Design (part of the OPM Group), showing how they consult and report on contentious subjects, in this case a recent consultation conducted on behalf of the HFEA. This post is by Remco van der Stoep.
Analysis of responses is a crucial stage of any consultation process, but all the more so when the subject of the consultation is potentially controversial. Consultations such as the one the HFEA ran on techniques to avoid mitochondrial disease are likely to attract significant numbers of responses, ranging from detailed comments on the science of the techniques to emotive comments on ethics. The duty of the analysis team is to accurately capture the essence of each of these comments, enabling decision makers and others to obtain an overview of respondents’ views.
At Dialogue by Design, when we brief analysts we make sure they are aware just how important their task is. We remind them that the data they are organising consists of people’s views, and that these people have taken the time and effort to respond to the consultation because the subject means something to them. As independent contractors, we like to think of ourselves as working not only for our client, but also on behalf of the respondents. This notion informs our approach to the analysis process, which is characterised by transparency and rigorous auditing.
Without the right tools it would be very difficult and time-consuming to analyse the sometimes intimidating volumes of data that big consultations can generate. The HFEA consultation on mitochondria replacement attracted almost 2,000 responses, many of which answered most or all of the seven consultation questions. Also, as discussed in our previous blog, respondents used different channels to respond, and quite a few sent handwritten letters or response forms. We are lucky to have systems that were purpose-built for dealing with consultation responses, and these allowed all data to be ready for analysis in one central database soon after the close of the consultation.
Tools alone, however, are no guarantee of a thorough and efficient analysis process. They facilitate the analysis, but at the heart of the process it is the analysts who structure and organise the responses. And that starts with the content of the responses. For any analysis process we design something we call a coding framework: an organised list of codes. Codes, in turn, are short phrases summarising points respondents make, such as ‘small quantity of mitochondrial DNA’ or ‘impact on family relationships’. We develop the coding framework on the basis of what respondents say, so that our codes reflect the views respondents actually express rather than what we expect them to say beforehand. The coding framework develops throughout the analysis process, shaped by respondents’ comments and coordinated by our lead analyst.
Equipped with the analysis database and the coding framework, the analysts read each individual response to each consultation question and make sure that they assign codes to every bit of the response – if one phrase contains several arguments we will typically assign multiple codes to it. This is the crucial bit of the analysis: it organises an endless volume of qualitative data into manageable little sections. This allows our writers to summarise the responses into a report (more about reporting in the next blog in this series), but also allows decision makers to access the detail of specific arguments respondents make in relation to particular themes and issues.
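To make the coding step concrete, here is a minimal sketch of how coded responses might be organised and aggregated. The codes, identifiers and data structures below are illustrative assumptions, not Dialogue by Design’s actual system; the point is simply that one phrase can carry several codes, and that coded data can then be summarised per question.

```python
from collections import Counter

# Illustrative coding framework: short phrases summarising points
# respondents make (these codes are hypothetical examples).
coding_framework = {
    "C01": "small quantity of mitochondrial DNA",
    "C02": "impact on family relationships",
}

# Each coded item is (respondent_id, question, [codes]); a single
# phrase containing several arguments receives multiple codes.
coded_responses = [
    ("R0001", "Q1", ["C01"]),
    ("R0002", "Q1", ["C01", "C02"]),
    ("R0003", "Q2", ["C02"]),
]

def codes_by_question(coded):
    """Count how often each code appears under each consultation question."""
    counts = {}
    for _respondent, question, codes in coded:
        counts.setdefault(question, Counter()).update(codes)
    return counts

summary = codes_by_question(coded_responses)
# summary["Q1"]["C01"] == 2  (two respondents raised code C01 under Q1)
```

A structure like this is what lets report writers move from thousands of free-text comments to manageable counts of themes per question, while keeping a link back to each individual response for auditing.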
For consultations as high-profile as the HFEA mitochondria replacement one, we like to make sure throughout the analysis that we are drawing out the right issues, and that our codes aren’t short on detail or clarity. We therefore provided the HFEA with a live connection to our analysis database, so that they could review how our analysts assigned codes to responses. This mechanism ensures we receive helpful and timely feedback on our work, and helps ensure that respondents’ comments are presented to decision makers in a relevant way.
Once the analysis process is completed, the next step is to write a summary report, which gives an account of the views expressed in response to the consultation.
The reporting process of a public consultation is the culmination of all the stages we have discussed previously in this series of blog posts. During report writing, the views of all respondents are condensed into a document that is often available to the public and which, along with the analysed data, represents a definitive output of the consultation process. This final output must accurately depict the full range and scope of the opinions raised in the consultation, and effectively convey nuance and meaning in the key themes and arguments identified in the analysis stage.
For this reason Dialogue by Design takes its role of safeguarding the quality and the integrity of the report very seriously. Our duty is not just to the client in providing the information they require, but to the consultation’s respondents too, specifically in demonstrating that their points of view are captured and understood.
One of the guiding principles behind the HFEA report on techniques to avoid mitochondrial disease was that it should represent the range of views held by respondents, rather than simply focusing on the most frequently discussed themes. The report therefore covers themes and viewpoints raised by hundreds of respondents as well as those raised by a handful. This is because the purpose of this consultation – and others like it – is to uncover the whole range of public views on this subject and allow those with a particular stake or interest in the proposed techniques to have their say.
The report reminds readers that the consultation was held as part of a wider engagement programme that also included public dialogue and research, and that some of these strands sought to obtain information about the mix of views among the general public. In this sense this consultation was very deliberately ‘open’ to all those who wanted to respond, in contrast to ‘sampled’ research, which seeks the views of a statistically representative population sample.
In our consultation reports we always aim to strike a balance by indicating, on the one hand, where a view is held by a greater or smaller number of respondents, while on the other hand acknowledging that quantitative information of this kind is no measure of the balance of opinion in the wider population. For this reason we don’t use percentages when reporting on findings from the analysis of consultation responses.
Considerations about reporting on quantitative information – e.g. how many respondents indicate that they agree or disagree with proposed new techniques to avoid mitochondrial disease – are especially important in consultations on controversial topics, as these often inspire organisations to mobilise their supporters to respond to the consultation in a particular way. We talk about organised responses (or campaign responses) when we see numerous consultation responses, submitted by different respondents, which all use the same words to convey a particular viewpoint or set of arguments. When reporting we make sure that what is being said in organised responses gets fair coverage, while also informing the reader of their ‘organised’ appearance.
This is not because Dialogue by Design wants to express an opinion on whether campaign responses are more or less valuable than individual responses, but rather for the sake of transparency, particularly with regard to numbers in the report. It is then up to each individual reader of the report to decide how important the quantitative information is to them. Representing a mix of responses, from very detailed considerations by statutory stakeholders to snappy arguments echoed in hundreds of organised responses, is one of the main challenges thrown up by the reporting process. We believe it is more an art than a science to make sure that the final report is accurate and fair to all participants.
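The way organised responses are spotted can be sketched simply: responses from different respondents that share the same wording are grouped together, and sufficiently large groups are flagged for transparent reporting. This is a hypothetical illustration under simple assumptions (exact wording after whitespace and case normalisation), not a description of Dialogue by Design’s actual tooling.

```python
from collections import defaultdict

def normalise(text):
    """Collapse whitespace and case so trivially different copies match."""
    return " ".join(text.lower().split())

def find_campaigns(responses, threshold=3):
    """Group responses by identical wording; return groups whose size
    meets the threshold as candidate organised (campaign) responses."""
    groups = defaultdict(list)
    for respondent_id, text in responses:
        groups[normalise(text)].append(respondent_id)
    return {text: ids for text, ids in groups.items() if len(ids) >= threshold}

# Illustrative data: three respondents using the same campaign wording,
# one writing an individual response.
responses = [
    ("R1", "We oppose the techniques on ethical grounds."),
    ("R2", "we oppose the techniques on ethical grounds."),
    ("R3", "We oppose the techniques  on ethical grounds."),
    ("R4", "I support the techniques given the small quantity of mitochondrial DNA involved."),
]
campaigns = find_campaigns(responses, threshold=3)
# one candidate campaign group containing respondents R1, R2 and R3
```

In practice identifying campaigns involves judgement as well as matching, but even a simple grouping like this shows how the same viewpoint can be reported once, with its respondent count disclosed, rather than counted as many independent arguments.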