Privacy Commissioner tackles GDPR, the regulation of AI and more
- 16 May, 2018 07:00
Privacy Commissioner John Edwards
New Zealand's Privacy Commissioner, John Edwards, is firing on all cylinders, tackling some of the most vexing issues facing technology developers and users alike.
With a new Privacy Bill before Parliament, Edwards has still found time to comment and act on new EU data protection regulations and on what could be the debate of the century - the role of artificial intelligence and automated decision making.
This week, Edwards fired a warning salvo across the bows of supermarket operator Foodstuffs after the company admitted it had deployed facial recognition technology to tackle shoplifting.
The Commissioner even took a stand late last year in a long-running privacy case before the US Supreme Court involving Microsoft, which was supported by other cloud providers.
In a speech to the Privacy Forum at Te Papa last week, Edwards suggested the country's own Privacy Act could benefit from the adoption of aspects of the EU's General Data Protection Regulation (GDPR), which comes into effect next week.
Among other issues, GDPR addresses automated decision making and artificial intelligence, giving individuals the right to human intervention in cases where adverse decisions are made.
Under GDPR, the controller of the data used to make the decision will have a duty to safeguard the rights, freedoms and interests of the subjects of those decisions, at least to the point that they can contest the decision and receive a human review of it.
"These protections are significant in the context of international benchmark setting," Edwards said. "They are an influential signpost to future regulatory settings on automated decision making for greater transparency.
"I’m going to draw them to Parliament’s attention when we make our submission on the Privacy Bill and suggest that they might make a useful addition to the regulatory framework here, now that we have that rare opportunity."
Edwards said the use of algorithmic decision making might affect resource allocation and the availability of goods or services provided to particular individuals by commercial or government agencies.
CEOs and ministers are asking about automation, he said. What can be learned from datasets to inform policy, or business strategy?
"The pressure to look to technology to provide answers to complex social problems is increasing, and is supported by consultancies, data scientists and software vendors," he added.
And that brings the conversation back to the facial recognition technology used by the likes of Foodstuffs, because such technology relies on AI and machine learning.
Edwards cited a study on bias in facial recognition software by Joy Buolamwini at the MIT Media Lab, which showed that gender was misidentified in less than one per cent of lighter-skinned males; in up to seven per cent of lighter-skinned females; in up to 12 per cent of darker-skinned males; and in up to 35 per cent of darker-skinned females.
“Overall, male subjects were more accurately classified than female subjects and lighter subjects were more accurately classified than darker individuals,” Edwards said.
It was one of the problems the Commissioner foresaw when he reported reservations about the now-cancelled Ministry of Social Development plan to tie funding to data in the NGO sector.
"If you drive away the most vulnerable, their data won’t be in your system," Edwards said. "They won’t be reported on, and therefore they’ll be less likely to be targeted for assistance."
It could even make the problem disappear, he said: "No data, no problem!"
Even a small error rate can create big problems, Edwards said.
"Half a per cent of the 120,000 registered unemployed in January for example, is 600 people misidentified or ineligible, which might be a lot of disruption and grief for an already vulnerable group, depending on what you are going to do with that data."
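Edwards's back-of-the-envelope arithmetic generalises: at population scale, even a fraction-of-a-per-cent error rate translates into hundreds of real people. A minimal sketch of that calculation (the function name is illustrative, not from his speech):

```python
def people_affected(population: int, error_rate: float) -> int:
    """Number of individuals misclassified at a given error rate."""
    return round(population * error_rate)

# Edwards's example: half a per cent of 120,000 registered unemployed.
print(people_affected(120_000, 0.005))  # 600 people misidentified or ineligible
```

The same arithmetic applied to the facial recognition error rates he cited (up to 35 per cent for darker-skinned females) makes the scale of the problem obvious.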
In a separate speech to the Domain Name Commission earlier this month, Edwards tackled another thorny emerging GDPR issue: the effect of the new rules on the register of domain name holders.
Traditionally, it has been possible to look up domain name owners through a simple WHOIS search.
Edwards said there is "an interpretation" of GDPR under which it prohibits companies from publishing information that identifies at least some individual domain name holders. On that reading, publishing WHOIS information about European-based registrants would breach GDPR rules, as would existing ICANN agreements with registrars about WHOIS data.
Domain name companies say it's not clear whether ICANN wants them to apply the new rules to all domain registrants or just to those that live in Europe.
Registrar GoDaddy has already decided to redact email addresses, names, and phone numbers from all of its published WHOIS records.
Edwards warned that if this approach catches on, it will be a blow to security researchers who depend on bulk access to WHOIS data, as well as to data analysis services, journalists and web archivists, among others.
However, New Zealand domain registrars have a head start, Edwards said.
The Privacy Commissioner said his office had "strongly encouraged" the creation of the Individual Registrant Privacy Option (IRPO), consistent with the information privacy principles.
"We’ve also supported the Domain Name Commission by assisting in its guidance to users," Edwards said.
As part of monitoring the IRPO, the Domain Name Commission has said it will publish transparency reports on access to and disclosure of withheld information, and will develop memorandums of understanding where there is an ongoing and legitimate need for access to information that is not on the public register.
Last week, a result of that consultation emerged in the form of guidance on privacy and domain name data to "foster greater awareness of good privacy practices in the .nz domain name space".
Amongst all that, Edwards and his team even had time to launch a new Privacy Trust Mark to give New Zealanders assurance that a product or service has been designed with their privacy interests in mind.
Edwards said the Privacy Trust Mark demonstrates that a “privacy by design” approach was used and is intended to give consumers confidence in particular products or services.