Personal Data and Data Analytics
To close this section, it is useful to consider future uses of personal data, data analytics and technology. Some of these are already in small-scale trials in parts of government.
The current government has been exploring ‘nudge’ techniques through the Behavioural Insights Team. Academics are developing new scientific techniques, such as social physics, that might in future provide personalised services to help people understand patterns of behaviour and make decisions that could improve their lives.
These techniques are promising, but the private sector, which has been exploring them for some time, has become increasingly aware of the many issues surrounding their power and potential.
The explosion of data and the power to manipulate it promise intimate insights into people’s lives at a near population scale. This could fundamentally change social policy, just as mapping the human genome has affected medicine.
Put simply, people, organisations and governments are now playing with incredibly powerful big data tools and technologies that they cannot claim fully to understand. Risk management is vital so that the benefits to society are not lost to a backlash when things go wrong. A regime that manages risk well can create a competitive advantage for the UK. At its heart should be consideration of the ethics of a particular process, examined in the round, outside the day-to-day managerial and political pressures that exist within organisations.
Medicine and academia have shown this is possible and practical. They have long-standing ethical governance mechanisms that allow both high-level deliberation of ethical issues and rapid, pragmatic ethical governance at a working level. Government needs to come to a similar arrangement for technology and public policy, learning from best practice elsewhere.
Some large organisations have already set up ethics committees to advise on these future issues. But it is hard to see where, say, a small software development team or a third sector body might go for ethical advice. The Samaritans Radar fiasco is just dying down as we go to press: superficially, it seems that one of Britain’s outstanding mental health support charities, in designing and testing a product that could affect millions of people, made a terrible mistake by not understanding ethical conventions in data governance. It seems highly likely that simple, informed external ethical advice with a digital dimension could have prevented this.
The scope of this ethical framework could usefully extend beyond big data and personal data to areas that the public and private sectors can reasonably be expected to trial during the next term of office, such as wearable technologies, health monitoring and robotics. It could also advise government on complex issues at the boundary of technology and society, such as the ongoing European disputes over the “Right to be Forgotten”.
Given the scale of the challenge and the concepts involved, membership of the governance structure should extend beyond public sector employees: it should represent society and the many voices and experts within it.
The ethics framework would assist policy makers and delivery teams, both within and outside the public sector, to make appropriate decisions for the long-term good of society.
“Building trust must also be at the centre of digital government thinking. Citizens must have confidence in the ways that their sensitive data will be used and privacy is also an important part of trust.” – Professional body
Adrian Short wrote a set of articles exploring Samaritans Radar and its ethical and legal consequences. This is a good starting point: https://adrianshort.org/unethical-twitter/
For example, the ‘trolley problem’ and robotic cars: http://www.wired.com/2014/08/heres-a-terrible-idea-robot-cars-with-adjustable-ethics-settings/