An update (last updated on 13 November).
Firstly, there are the Timely Information pilots being funded by CLG (more information at the ESD-toolkit site). The general consensus there is that there is a need for data standards for public sector info to allow a degree of quality marking to be applied to mash-ups. Presumably these standards will be compatible with the ones Emma Mulqueeny spoke about?
Some are developing via a project the esd-toolkit is involved in with the Local e-Government Standards Body (an organisation I hadn’t heard of before). The LEGSB is working on the semantic web and standards for web services relating to re-use of data, service directories and links to the DirectGov, NHS Choices and Business Link websites etc.
An example of the sorts of ongoing discussions on the changes that are needed to support mashing of data can be seen over at the esd-toolkit site: A new look at the standards underpinning business processes and managing records.
Already, local authorities in England get annual data quality auditing as part of the Audit Commission regime, plus random checks each year on National Indicators to make sure they comply. Which is good, but as the Audit Commission’s report points out, the focus of the quality process is then on data required by central government – which might be different from what a local activist wanting to mash it together for another purpose would need.
For an example of where all this is going, have a look at the Norfolk Community Mash-Up project, which is described over at the esd-toolkit site.
But still: how many of the mashers will understand the limitations of the data? Will bad decisions be taken as a result? What happens when you mash together data of varying quality, and how can you be open about the resulting error margins? How many people understand propagation of uncertainty anyway?
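To make the uncertainty point concrete, here's a minimal sketch of one standard rule: when you add two independent measurements, their uncertainties combine in quadrature, so the error margin of the mashed-up figure is larger than either input's. The function name and the figures are invented for illustration, not taken from any of the projects above.

```python
import math

def add_with_uncertainty(a, sigma_a, b, sigma_b):
    """Sum two independent quantities.

    Independent uncertainties add in quadrature:
    sigma_total = sqrt(sigma_a^2 + sigma_b^2).
    """
    return a + b, math.sqrt(sigma_a ** 2 + sigma_b ** 2)

# Hypothetical example: combining two council datasets of
# differing quality (one well-audited, one rough estimate).
total, sigma = add_with_uncertainty(120.0, 5.0, 80.0, 12.0)
# total = 200.0, sigma = 13.0 – a ~6.5% margin, dominated by
# the lower-quality dataset's uncertainty.
```

The design point is that the combined error is driven by the weakest dataset – which is exactly the information a masher would need to publish alongside the result.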
It’d be interesting to do a couple of case studies to explore this. There must be something from America by now?
Update 13 Nov: Link to Tim Anderson and Norfolk’s open data work