
Report of the Digital Government Review


Security: moving beyond fear


Any approach to data sharing must include an approach to security and risk management. If we move too far ahead on the basis of assumed benefits, without understanding and communicating the risks, then we are doing the public a disservice. We will make avoidable mistakes. We will increase fear. On the other hand, a highly risk-averse approach will lead to lost opportunities for better public services.

Many of the world’s largest and most security-conscious organisations have suffered security breaches: the NSA, Apple, Mastercard and Visa. So has the public sector.

2014 alone has seen a number of security breaches of UK public sector data: some of our health records were incorrectly (possibly illegally) transferred to the US; a prison lost a disc containing detailed information on prisoners; multiple local authorities disclosed complete electoral registers rather than the smaller, public version. These breaches are happening all the time.

We need to understand them, and we need to learn our lessons. But we also need to recognise that they happen and will go on happening. We should not stop because of fear: we need to balance the risks against the benefits. The government has a longstanding, but often forgotten, reference work on risk management, produced by HM Treasury out of frustration at the civil service’s inability to get this right – the ‘Orange Book’ [56]. As Bruce Schneier puts it [57], we need to move beyond fear and think sensibly about security.

There are a number of strong models for building security and privacy into data-sharing processes (often captured under the terms ‘security by design’ and ‘privacy by design’).

We recommend approaches such as architectures where data is not moved into large central databases but is instead kept within smaller data stores, with processing performed either as close to the data as possible or only on the specific data elements required [58].
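As an illustration, here is a minimal sketch of that pattern: each store runs the computation locally and releases only the result, never the raw records. The store names, record fields and query are invented for illustration and do not come from any government system.

```python
# Illustrative sketch: data stays in small local stores, the processing
# travels to the data, and only aggregate results move centrally.
from dataclasses import dataclass
from typing import Callable, Iterable


@dataclass
class LocalStore:
    name: str
    _records: list  # raw records are held here and never exported

    def run_local(self, query: Callable[[Iterable[dict]], int]) -> int:
        # The query function travels to the data; only its result leaves.
        return query(self._records)


def count_over_65(records: Iterable[dict]) -> int:
    return sum(1 for r in records if r["age"] > 65)


stores = [
    LocalStore("clinic_a", [{"age": 70}, {"age": 40}]),
    LocalStore("clinic_b", [{"age": 81}]),
]

# The central service sees two integers, never a single patient record.
print(sum(store.run_local(count_over_65) for store in stores))  # -> 2
```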

In the academic world, models with gatekeepers and data safe havens or research laboratories are being explored. These can both improve security and provide access to skilled resources. The effort that the security, academic and statistical communities are putting in is laudable, and much of this work has also been translated into government standards by the Government Digital Service (GDS) [59].
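A gatekeeper of the kind used by data safe havens can be sketched along the same lines: every request is checked against an approvals register and audit-logged before any data is released, and direct identifiers never leave the haven. The register, field names and log structure below are assumptions for illustration, not a description of any actual safe haven.

```python
import datetime

# Hypothetical approvals register and audit trail; a real safe haven would
# back these with governance processes and append-only storage.
APPROVED_PROJECTS = {"ethics-2014-017"}
AUDIT_LOG: list = []

# The haven holds the sensitive records; researchers never query it directly.
_HAVEN_RECORDS = [
    {"nhs_number": "485 777 3456", "age": 70, "postcode_district": "M1"},
]


def request_extract(project_id: str, researcher: str, fields: list) -> list:
    """Gatekeeper: log every request, release only approved, named fields."""
    granted = project_id in APPROVED_PROJECTS
    AUDIT_LOG.append({
        "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "who": researcher,
        "project": project_id,
        "fields": fields,
        "granted": granted,
    })
    if not granted:
        raise PermissionError(f"project {project_id} has no approval on record")
    # Direct identifiers stay inside the haven regardless of what was asked for.
    releasable = [f for f in fields if f != "nhs_number"]
    return [{f: record[f] for f in releasable} for record in _HAVEN_RECORDS]


print(request_extract("ethics-2014-017", "dr_smith", ["age", "postcode_district"]))
# -> [{'age': 70, 'postcode_district': 'M1'}]
```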

But we still have largely old solutions in place, and we are still building new solutions without following the new standards. For example:

  • The example above of UK health records being transferred to the US was part of the NHS care.data programme [60]. This should have been a flagship programme for government data analytics and data sharing – not an example of making basic mistakes.
  • The MyLicence programme to share driver data with the insurance industry [61] provides no details of the audits that government will perform to ensure that the insurance companies do not misuse the data. Where is the openness and transparency in this?

Given these failures, we will need to improve our approach to security and privacy.

In particular, as government gradually opens up data to external services such as MyLicence, and explores the possibilities created by opening up APIs to other parties, a strong governance model will be required to retain trust and confidence in both the public sector and the non-public-sector services that use government data.

“Data access for research should be subject to privacy safeguards” – Professional body

[56] https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/220647/orange_book.pdf
[57] http://cyber.law.harvard.edu/cybersecurity/Beyond_Fear
[58] Two interesting examples of this design model were provided to us during the review. One was a safer version of a congestion charge system in which more of the processing was performed within the cameras, reducing the amount of personal data brought back to central databases; the second provided a safety alert service to sex workers in a given geographic area without the alert service ever being aware of which workers were in that area. Both designs reduce the transfer of personal data and hence create a more secure and trustworthy environment.
[59] https://www.gov.uk/service-manual/technology/security-as-enabler.html
[60] http://www.pcworld.com/article/2108580/dont-upload-health-care-data-to-google-cloud-uk-groups-say.html
[61] https://www.abi.org.uk/Insurance-and-savings/Topics-and-issues/Insurance-industry-access-to-driver-data



2 comments

  1. Peter says:

    Reference 58 could also have linked to this article comparing Reason Digital’s work producing a safety alert service for sex workers with Adobe’s work on eReaders: http://www.computerworlduk.com/in-depth/public-sector/3580989/westminster-view-what-sex-work-and-e-reading-teach-us-about-federated-architecture/

  2. Peter says:

    Corrected a silly typo. Rick -> Risk