Tuesday, November 6, 2018

Facebook must change and policymakers must act on data, warns UK watchdog

The UK’s data watchdog has warned that Facebook must overhaul its privacy-hostile business model or risk burning user trust for good.

Comments she made today have also raised questions over the legality of using so-called lookalike audiences to target political ads at users of its platform.

Information commissioner Elizabeth Denham was giving evidence to the Digital, Culture, Media and Sport committee in the UK parliament this morning. She’s just published her latest report to parliament, on the ICO’s (still ongoing) investigation into the murky world of data use and misuse in political campaigns.

Since May 2017 the watchdog has been pulling on myriad threads attached to the Cambridge Analytica Facebook data misuse scandal — to, in the regulator’s words, “follow the data” across an entire ecosystem of players; from social media firms to data brokers to political parties, and indeed beyond to other still unknown actors with an interest in also getting their hands on people’s data.

Denham readily admitted to the committee today that the sprawling piece of work had opened a major can of worms.

“I think we were astounded by the amount of data that’s held by all of these agencies — not just social media companies but data companies like Cambridge Analytica; political parties the extent of their data; the practices of data brokers,” she said.

“We also looked at universities, and the data practices in the Psychometric Centre, for example, at Cambridge University — and again I think universities have more to do to control data between academic researchers and the same individuals that are then running commercial companies.

“There’s a lot of switching of hats across this whole ecosystem — that I think there needs to be clarity on who’s the data controller and limits on how data can be shared. And that’s a theme that runs through our whole report.”

“The major concern that I have in this investigation is the very disturbing disregard that many of these organizations across the entire ecosystem have for personal privacy of UK citizens and voters. So if you look across the whole system that’s really what this report is all about — and we have to improve these practices for the future,” she added. “We really need to tighten up controls across the entire ecosystem because it matters to our democratic processes.”

Asked whether she would personally trust her data to Facebook, Denham told the committee: “Facebook has a long way to go to change practices to the point where people have deep trust in the platform. So I understand social media sites and platforms and the way we live our lives online now is here to stay but Facebook needs to change, significantly change their business model and their practices to maintain trust.”

“I understand that platforms will continue to play a really important role in people’s lives but they need to take much greater responsibility,” she added when pressed to confirm that she wouldn’t trust Facebook.

A code of practice for lookalike audiences

In another key portion of the session, Denham confirmed that inferred data is personal data under the law. (Although of course Facebook has a different legal interpretation of this point.)

Inferred data refers to inferences made about individuals based on data-mining their wider online activity — such as identifying a person’s (non-stated) political views by examining which Facebook Pages they’ve liked. Facebook offers advertisers an interests-based tool to do this — by creating so-called lookalike audiences comprised of users with similar interests.
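To make the general idea concrete, here is a minimal sketch of interest-based lookalike expansion: rank candidate users by how closely their page-like patterns resemble a seed audience and keep the closest matches. This is an illustration only — the function names and data are hypothetical, and it is not a description of Facebook’s actual system.

```python
# Minimal sketch (illustration only, not Facebook's actual system) of a
# "lookalike" expansion: score candidate users by the similarity of their
# page-like sets to a seed audience, then take the top matches.
from collections import Counter
from math import sqrt

def cosine(a: set, b: set) -> float:
    """Cosine similarity between two sets of liked pages."""
    if not a or not b:
        return 0.0
    return len(a & b) / (sqrt(len(a)) * sqrt(len(b)))

def lookalike_audience(seed_likes: dict, candidates: dict, size: int) -> list:
    """Rank candidate users by their average similarity to the seed users."""
    scores = Counter()
    for cand, likes in candidates.items():
        scores[cand] = sum(cosine(likes, s) for s in seed_likes.values()) / len(seed_likes)
    return [user for user, _ in scores.most_common(size)]

# Hypothetical example: the seed audience liked pages associated with one party.
seed = {"u1": {"party_a", "local_news"}, "u2": {"party_a", "football"}}
pool = {"u3": {"party_a", "gardening"}, "u4": {"knitting"}, "u5": {"party_a", "football"}}
print(lookalike_audience(seed, pool, size=2))  # likely ["u5", "u3"]
```

The point of the sketch is that the expanded audience inherits the (possibly political) traits of the seed users without any of those users stating them — which is exactly the inference the commissioner classes as personal data.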

But if the information commissioner’s view of data protection law is correct, it implies that using such tools to infer individuals’ political views could breach European privacy law, unless explicit consent is gained beforehand for people’s personal data to be used for that purpose.

“What’s happened here is the model that’s familiar to people in the commercial sector — or behavioural targeting — has been transferred, I think transformed, into the political arena,” said Denham. “And that’s why I called for an ethical pause so that we can get this right.

“I don’t think that we want to use the same model that sells us holidays and shoes and cars to engage with people and voters. I think that people expect more than that. This is a time for a pause, to look at codes, to look at the practices of social media companies, to take action where they’ve broken the law.”

She told MPs that the use of lookalike audiences should be included in a Code of Practice which she has previously called for vis-a-vis political campaigns’ use of data tools.

Social media platforms should also disclose the use of lookalike audiences for targeting political ads at users, she said today — a data-point that Facebook has nonetheless omitted to include in its newly launched political ad disclosure system.

“The use of lookalike audiences should be made transparent to the individuals,” she argued. “They need to know that a political party or an MP is making use of lookalike audiences, so I think the lack of transparency is problematic.”

Asked whether the use of Facebook lookalike audiences to target political ads at people who have chosen not to publicly disclose their political views is legal under current EU data protection laws, she declined to make an instant assessment — but told the committee: “We have to look at it in detail under the GDPR but I’m suggesting the public is uncomfortable with lookalike audiences and it needs to be transparent.”

We’ve reached out to Facebook for comment.

Links to known cyber security breaches

The ICO’s latest report to parliament and today’s evidence session also lit up a few new nuggets of intel on the Cambridge Analytica saga, including the fact that some of the misused Facebook data — which had found its way to Cambridge University’s Psychometric Centre — was accessed not only by IP addresses that resolve to Russia but also by IP addresses that have been linked to other known cyber security breaches.

“That’s what we understand,” Denham’s deputy, James Dipple-Johnstone, told the committee. “We don’t know who is behind those IP addresses but what we understand is that some of those appear on lists of concern to cyber security professionals by virtue of other types of cyber incidents.”

“We’re still examining exactly what data that was, how secure it was and how anonymized,” he added, saying “it’s part of an active line of enquiry”.

The ICO has also passed the information on “to the relevant authorities”, he added.

The regulator also revealed that it now knows exactly who at Facebook was aware of the Cambridge Analytica breach at the earliest instance — saying it has internal emails related to the issue which have “quite a large distribution list”. Although it’s still not been made public whether or not Mark Zuckerberg’s name is on that list.

Facebook’s CTO previously told the committee the person with ultimate responsibility where data misuse is concerned is Zuckerberg — a point the Facebook founder has also made personally (just never to this committee).

When pressed on whether Zuckerberg was on the distribution list for the breach emails, Denham declined to confirm today, saying “we just don’t want to get it wrong”.

The ICO said it would pass the list to the committee in due course.

Which means it shouldn’t be too long before we know exactly who at Facebook was responsible for not disclosing the Cambridge Analytica breach to relevant regulators (and indeed parliamentarians) sooner.

The committee is pressing on this point because Facebook gave earlier evidence to its online disinformation enquiry yet omitted to mention the Cambridge Analytica breach entirely. (Hence its accusation that senior management at Facebook deliberately withheld pertinent information.)

Denham agreed it would have been best practice for Facebook to notify relevant regulators at the time it became aware of the data misuse — even without the GDPR’s new legal requirement being in force then.

She also agreed with the committee that it would be a good idea for Zuckerberg to personally testify to the UK parliament.

Last week the committee issued yet another summons for the Facebook founder — this time jointly with a Canadian committee which has also been investigating the same knotted web of social media data misuse.

Though Facebook has yet to confirm whether or not Zuckerberg will make himself available this time.

How to regulate Internet harms?

This summer the ICO announced it would be issuing Facebook with the maximum penalty possible under the country’s old data protection regime for the Cambridge Analytica data breach.

At the same time Denham also called for an ethical pause on the use of social media microtargeting of political ads, saying there was an urgent need for “greater and genuine transparency” about the use of such technologies and techniques to ensure “people have control over their own data and that the law is upheld”.

She reiterated that call for an ethical pause today.

She also said the fine the ICO handed Facebook last month for the Cambridge Analytica breach would have been “significantly larger” under the rebooted privacy regime ushered in by the pan-EU GDPR framework this May — adding that it would be interesting to see how Facebook responds to the fine (i.e. whether it pays up or tries to appeal).

“We have evidence… that Cambridge Analytica may have partially deleted some of the data but even as recently as 2018, Spring, some of the data was still there at Cambridge Analytica,” she told the committee. “So the follow up was less than robust. And that’s one of the reasons that we fined Facebook £500,000.”

Data deletion assurances that Facebook had sought from various entities after the data misuse scandal blew up don’t appear to be worth the paper they’re written on — with the ICO also noting that some of these confirmations had not even been signed.

Dipple-Johnstone also said the ICO believes that a number of additional individuals and academic institutions received “parts” of the Cambridge Analytica Facebook data-set — i.e. additional to the multiple known entities in the saga so far (such as GSR’s Aleksandr Kogan, and CA whistleblower Chris Wylie).

“We’re examining exactly what data has gone where,” he said, adding that the ICO is looking into “about half a dozen” entities — but declining to name names while its enquiry remains ongoing.

Asked for her views on how social media should be regulated by policymakers to rein in data abuses and misuses, Denham suggested a system-based approach that looks at effectiveness and outcomes — saying it boils down to accountability.

“What is needed for tech companies — they’re already subject to data protection law but when it comes to the broader set of Internet harms that your committee is speaking about — misinformation, disinformation, harm to children in their development, all of these kinds of harms — I think what’s needed is an accountability approach where parliament sets the objectives and the outcomes that are needed for the tech companies to follow; that a Code of Practice is developed by a regulator; backstopped by a regulator,” she suggested.

“What I think’s really important is the regulators looking at the effectiveness of systems like takedown processes; recognizing bots and fake accounts and disinformation — rather than the regulator taking individual complaints. So I think it needs to be a system approach.”

“I think the time for self regulation is over. I think that ship has sailed,” she also told the committee.

On the regulatory powers front, Denham was generally upbeat about the potential of the new GDPR framework to curb bad data practices — pointing out that not only does it allow for supersized fines but companies can be ordered to stop processing data, which she suggested is an even more potent tool to control rogue data-miners.

She also suggested another new power — to go in and inspect companies and conduct data audits — will help the regulator get results.

But she said the ICO may need to ask parliament for another tool to be able to carry out effective data investigations. “One of the areas that we may be coming back to talk to parliament, to talk to government about is the ability to compel individuals to be interviewed,” she said, adding: “We have been frustrated by that aspect of our investigation.”

Both the former CEO of Cambridge Analytica, Alexander Nix, and Kogan — the academic who built the quiz app used to extract Facebook user data so it could be processed for political ad targeting purposes — had refused to appear for an interview with the ICO under caution, she said today.

On the wider challenge of regulating a full range of “Internet harms” — spanning the spread of misinformation, disinformation and also offensive user-generated content — Denham suggested a hybrid regulatory model might ultimately be needed to tackle this, adding that the ICO and communications regulator Ofcom might work together.

“It’s a very complex area. No country has tackled this yet,” she conceded, noting the controversy around Germany’s social media take down law, and adding: “It’s very challenging for policymakers… Balancing privacy rights with freedom of speech, freedom of expression. These are really difficult areas.”

Asked what her full ‘can of worms’ investigation has highlighted for her, Denham summed it up as: “A disturbing amount of disrespect for personal data of voters and prospective voters.”

“The main purpose of this [investigation] is to pull back the curtain and show the public what’s happening with their personal data,” she added. “The politicians, the policymakers need to think about this too — stronger rules and stronger laws.”

One committee member suggestively floated the idea of social media platforms being required to have an ICO officer inside their organizations — to grease their compliance with the law.

Smiling, Denham responded that it would probably make for an uncomfortable prospect on both sides.



Source: Social – TechCrunch, by Natasha Lomas (https://ift.tt/2qur7H0)
