Irish Data Protection Commissioner Helen Dixon is taking the long view on how best to navigate and protect privacy in an increasingly post-truth world. She talks to John Kennedy.
As we talk about data privacy, never far from Ireland’s Data Protection Commissioner (DPC) Helen Dixon is a well-thumbed and heavily annotated copy of the General Data Protection Regulation (GDPR), which came into force in May of this year. I suspect she never lets it out of her sight.
Since taking office in 2014, Dixon has presided over a quadrupling of the DPC’s annual budget, the opening of a new office in Dublin and an increase in staff to more than 100 people. She has also spearheaded awareness and education around GDPR for Irish businesses and citizens while simultaneously finding herself at the heart of landmark legal battles concerning social networks and the transmission of Europeans’ data to the US.
‘The truth about technology is that you can’t just lock it outside the door of the school’
– HELEN DIXON
Our conversation had its genesis in a panel I moderated at the Irish Government’s recent Data Summit in September. Alongside Dixon were Google’s former ads boss, Sridhar Ramaswamy, and Lisbon Council’s Luukas Ilves.
Earlier that week, I penned an op-ed arguing for data privacy education to be introduced in Irish schools. My reasoning was that, even decades into the digital revolution, this kind of education is necessary and we need to start somewhere.
I didn’t realise at the time how serendipitous this would be, because the Office of the Data Protection Commissioner (ODPC) was at that very moment about to start piloting data privacy education modules in Irish classrooms ahead of potential policy decisions in this area. Three schools in Dublin and Meath recently began a pilot of lesson plans devised by ODPC staff with a background in education, arranged with the support of the Ombudsman for Children’s Office in Ireland. The lesson plans are designed to engage children in three age groups: 9 to 10, 14 to 15, and 16 and upwards. It is a first, and a potentially timely step, for a generation that never knew life before the internet, social media and smartphones.
The initial pilot and its feedback will inform the potential creation of a national lesson plan. “We hope the curriculum would take it on,” Dixon said. “We have written to the secretary general of the Department of Education to notify him of what we are doing in this area. We have worked with the Ombudsman for Children Niall Muldoon’s office, which is supporting us with this pilot.”
Privacy in the post-truth world
You get the feeling that as well as stewarding a nation through one of the biggest cultural and economic upheavals brought about by digital, Dixon is intellectually immersed and invested in the topic. Her conversation is punctuated by references to A Theory of Creepy by Omer Tene and Jules Polonetsky, and the brilliant work of Yuval Noah Harari in Homo Deus: A Brief History of Tomorrow.
As we talk about how social media, for example, has infiltrated our very being and consciousness, Dixon compared comments posted in anger on Facebook or Twitter to tattoos. “The difference is, you can get laser treatment to remove tattoos.”
She continued: “I remember a conversation I had with Christine Lagarde of the IMF when she visited Dublin during the summer and we talked briefly about the whole issue of data and she very succinctly said to me: ‘Three things with data: competition, privacy and citizen education.’
“And you have hit on the least-focused aspect and we do need to talk about it more, and the difficult bit is citizen education.”
The problem Dixon is trying to address is that technology is being assimilated by society faster than people can figure out how best to protect themselves.
“This is an issue because tech is crashing up against social norms and forcing us to evolve. I have used a few times the example Jules Polonetsky wrote about in A Theory of Creepy, where one of the things discussed in terms of evolving social norms as a backdrop to this whole area is caller ID.
“Back when you got your first mobile phone in the 1990s, caller ID was introduced and there was outrage that if you had to ring AA, for example, your number might be displayed, and individuals sought to have it concealed.
“Now it has come full circle and none of us will answer our phone until we see the number identified to us. If you are a regulator deciding whether it is correct for numbers to be displayed or not, it’s a very hard call. And, of course, as with many areas of data protection, it is an issue of consumer choice to control in what circumstances they want their number displayed and so on.”
This scenario, in which sentiment changes as people become more digital-savvy, is top of Dixon’s mind as she considers how a society such as Ireland can best protect its children in an increasingly digital-first world.
“As an office, we are very busy dealing with GDPR and we have a lot of investigations underway at a particularly large scale as well as investigating high-profile data breaches, but we don’t want to lose sight of this.
“We can leverage a lot of what we put in place. GDPR is very high-level and principles-based and, while it helpfully calls out that children should have specific protections, there is no specificity of how to do it. And that’s what we want to engage in now.”
Dixon is a realist when it comes to the issue of data, privacy and children, especially during a year in which Ireland elected to place the digital age of consent at 16.
“The truth about technology is that you can’t just lock it outside the door of the school. I’m interested to see what children themselves understand about privacy and what parts of their lives are they putting out there. It is not about being prescriptive.”
Regulating blockchain
Another issue looming on the horizon for data authorities all over the world is the rise of blockchain technology. It is inevitable that the myriad uses of blockchain will put it on the radar of data regulators.
“The very nature of blockchain, with its distributed and peer-to-peer aspects, poses a challenge in terms of identifying who are the actors that we recognise under data protection regulation.”
Dixon said that even though the technology is evolving fast, any conversations that data protection authorities have had so far have been at a high, conceptual level.
“At certain conferences I’ve attended, there have been very vigorous debates about how blockchain could be a very privacy-enhancing technology. And then you have the counter-arguments that it is the opposite: not based on data minimisation but on lengthy retention periods, and uncertain means of control and exercise of rights.
“Our job is to monitor the application of the GDPR but also to help interpret it so that it can be applied in a way that safeguards rights. We will watch that space and watch it carefully.”