Following a series of high-profile data scandals, is increased regulation the only way to recover consumer trust?
How can we rebuild public trust in data collection and analysis? This was the question posed by delegates at yesterday’s Westminster eForum conference.
The event, held in central London, brought together policymakers, government advisors, and senior representatives of top international firms to discuss the future of data handling in the UK.
“It’s essential to get this right,” said Roger Taylor, chair of the Centre for Data Ethics and Innovation, who kicked off proceedings by outlining the government’s forthcoming National Data Strategy.
“The obstacles to the collection and use and sharing of data are many and varied. There are technical problems, legal problems, problems around market dynamics.
“But I just want to focus on one aspect that I think captures a lot of the underlying tensions, and that’s to do with public trust. What is a system that would rightfully command public trust?
“The first is obviously security – you need to be confident that there’s not going to be illegal or unauthorized access to your data.
“The second thing comes around the degree to which authorized access is controlled – how do I know that you’re only going to do with the data either what I’ve agreed you can do, or what I thought I’d agreed you can do?
“There [are] a number of aspects to this – what’s legal and what’s technically possible? But what do we collectively feel okay about?
“Because that might not be consistent with what is legally and technically okay.”
Aimed at “unlocking the power of data” in the UK economy and government, the National Data Strategy (PDF) will be introduced in the coming months.
Crisis of trust
The issue of public trust around data collection ran throughout the conference.
A panel of experts from bodies including the Information Commissioner’s Office (ICO) and the Open Data Institute deliberated over the key ethical questions posed by the use of personal data by third parties.

Simon McDougall, executive director of technology policy and innovation at the ICO, told delegates that the privacy regulator’s main goal right now is to increase the public’s trust and confidence in how data is used and made available.
“I have to say that, as one single metric, right now we ain’t doing so good. Nobody here is doing that good,” he said.
He added: “We are having a crisis of trust right now in how innovation is happening, [and] how data is being used. And that is a challenge for us as a regulator, it’s a challenge for everybody in this room, it’s a challenge for anybody visiting new and innovative uses of data in the future.
“So how have we got here, and how do we get out of that?”

McDougall admitted that failings of the UK public sector to manage data securely had eroded trust – but noted that big tech firms, comparatively, had done even more to damage consumer confidence.
He said: “We’re building up a trust deficit where every time we do something different within the technology community and the public sector community, we don’t bring the public with us. They’re not engaged, and the trust issue widens.”
The way to tackle this is to process data in a privacy-respectful way, he said.
McDougall told representatives of private sector companies that the ICO is committed to tackling the problems through a range of initiatives such as introducing frameworks for data collection, and helping businesses to employ privacy controls when collecting sensitive information.
He added: “We want to work with innovators who want to do new things in a privacy-respectful way, and if they’re willing to come to us, we’re willing to invest a lot of time to try and make sure they can do that.”
Caution over innovation?
The use of algorithms was a hot topic for the panel, from the Metropolitan Police’s controversial Gangs Matrix to healthcare facilities using artificial intelligence for medical diagnoses.
Talk then moved to whether new laws and regulations governing the use of emerging technologies could help to restore public faith in both public sector and private sector data handlers.
Mark O’Conor, a partner at law firm DLA Piper, commented: “I’m not sure that the law needs to catch up, as people like to say, and I’m a keen advocate against any type of dual legal system for the real world and the artificial world. It needs to knit together. I think we need to focus on interpretation.
“Without some sort of framework, we do run the risk of caution over innovation, which would be to everyone’s detriment, but I’ll leave you with this thought.
“With regard to national strategy, shouldn’t we be talking about an international strategy?”
Regulators elsewhere have already moved: the European Union significantly strengthened privacy laws with the introduction of GDPR, while California broke ground with its own privacy legislation last year.
Jeni Tennison, CEO at the Open Data Institute, agreed that a framework should be introduced, and said that allowing Big Tech firms to regulate themselves will no longer suffice.
She said: “The trap we need to avoid is thinking that self-regulation and codes of conduct within the industry are enough.
“Even Mark Zuckerberg is saying that Facebook needs to be regulated. So it is up to governments to provide the enabling environment that provides the trust.”