Case Study: Open Standards Consultation (Cabinet Office)

Comments were pre-moderated and, once approved, were attributed and displayed on the site to encourage debate between respondents. A facility to like or dislike comments was also provided so that contributors could easily show support, or otherwise, for a particular point of view. Once the consultation closed, all comments were exported from the online tool to a spreadsheet to facilitate analysis of the feedback alongside that from other sources.

As site maintenance was intuitive, the policy official was able to manage content and menus, adding material such as notifications of public roundtables, links to further information and answers to frequently asked questions during the course of the consultation. This allowed us to develop the information throughout, according to the needs of the stakeholders.

Additional tools included a SoundCloud account to upload audio recordings of public roundtable discussions and an Eventbrite account to provide information, booking and event management for the roundtable sessions.

About the project:
A written consultation document was prepared, then adapted for display on the online tool. The tool reused a solution provided for another consultation but added some additional tailored functionality. As well as the online consultation, respondents could engage through face-to-face roundtable discussions (or via telephone conferences). We also accepted emails, written letters and meeting notes.

The tool was developed and delivered by our in-house digital team, who also arranged the hosting and did the initial content upload and structuring. The policy official leading the consultation was responsible for setting out the requirements. Including requirements gathering, bespoke development on the existing, reused tool and initial content upload, I estimate it took no longer than 5 or 6 working days to have the site ready, although this was spread over a period of around a month.

Additional content (such as roundtable information and frequently asked questions) and pre-moderation were taken care of by the policy adviser; this took no longer than about 2-3 hours per week until the final week of the consultation, when a flurry of activity meant 1-2 hours per day was needed.

Maintenance of the Eventbrite listings and bookings took up to an hour per day for each event (7 events were held during the course of the consultation). Twitter and blogs were used to keep people up to date on developments, in particular in relation to the roundtables. A telephone number and email address for our service desk were also published in the consultation document to give people a route to register to receive information as direct mail-outs if they preferred. Very few people took up this offer.

We received around 480 responses to the consultation through the various channels available. This was an excellent response rate, as this was a technical and rather niche consultation that in some cases asked particularly searching questions. Around a quarter of the responses came through the online channel. The feedback we received was used in an independent analysis of consultation feedback, carried out by Bournemouth University on our behalf. It was very valuable, particularly because it was structured and separated issues into specific question areas.

Reaching our target audience required proactive engagement with existing networks using various channels such as phone, face to face meetings, direct emails, blogs and tweets in addition to press releases. A blend of approaches to gather feedback was considered the best approach as we had such a diverse range of stakeholders, many of whom would not normally engage in government consultation activity (in particular small and medium businesses).

The online platform displayed each question with all of the responses to that question listed below, along with the name of the contributor. Note that this only contained responses received through the online tool. The interrogation and processing were carried out outside of the tool. A .csv file export provided the answers in a spreadsheet which could then be annotated and interrogated as part of the analysis e.g. by adding typologies of respondents and categories for response types.
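The annotation step described above can be sketched as follows. This is a minimal illustration only: the column names, the typology rule and the keyword-based categories are all assumptions for the sketch, not the tool's actual export schema (in practice the annotation was done manually in the spreadsheet during analysis).

```python
import csv
import io

# Hypothetical excerpt of the .csv export; the headers are illustrative,
# not the online tool's real schema.
export = io.StringIO(
    "question,contributor,response\n"
    'Q1,Alice Example,"Open standards lower procurement costs for SMEs."\n'
    'Q1,Bob Example,"Mandating standards may slow adoption in large departments."\n'
)

def typology(contributor: str) -> str:
    # Placeholder rule: real typologies were assigned from respondent details.
    return "individual"

def category(response: str) -> str:
    # Toy keyword rule standing in for the manual categorisation of responses.
    return "cost" if "cost" in response.lower() else "adoption"

# Read the export and append the two analysis columns to each row.
reader = csv.DictReader(export)
annotated = []
for row in reader:
    row["typology"] = typology(row["contributor"])
    row["category"] = category(row["response"])
    annotated.append(row)

for row in annotated:
    print(row["question"], row["typology"], row["category"])
```

The same annotated rows can then be written back out with `csv.DictWriter` and filtered or pivoted by typology and category as part of the analysis.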

In parallel to our official online channel, an independent developer who discovered the consultation created his own site building on the question template that we had created. His promotion of this alternative channel delivered around 20% of the responses and reached communities that we may not have been able to otherwise reach. The output was provided to us via email, also as a spreadsheet making it easy to interrogate.

Testimonials & lessons learned:
For the consultation as a whole, we received written feedback that it was exemplary. Being able to see attributed comments throughout the process also seemed to be valued as a good way of generating debate on some contentious issues.

The tool performed well throughout the process and was flexible enough to cope with additional information needs as we progressed. For a next iteration, ideally some of the processing, such as assigning typologies, could be carried out within the online tool, e.g. by self-selection or as part of the moderation process. This would save time in collating the results at the close of the consultation. We also received feedback that it would be helpful to allow people to complete partial responses and to save their answers before finally submitting, as they may not want to complete all questions in one sitting. The capability to review all answers given while inputting their own response was also requested; the site only allowed review on one screen and submission on another, although the work-around was to keep two browser windows open. I would also spend more time re-purposing some of the content so that it makes more sense to an online audience, who may not read the document and questions in a linear way.