Showing posts with label safety. Show all posts

Wednesday 4 April 2018

Are those nasty digital chickens coming home to roost for Mark Zuckerberg and Facebook?


In 2014 rumours began to spread about Strategic Communication Laboratories (SCL) and its offshoot Cambridge Analytica.

By 12 December 2015, after contacting Facebook's public relations representatives in London, The Guardian (UK) was reporting that:

"A little-known data company, now embedded within Cruz’s campaign and indirectly financed by his primary billionaire benefactor, paid researchers at Cambridge University to gather detailed psychological profiles about the US electorate using a massive pool of mainly unwitting US Facebook users built with an online survey.
As part of an aggressive new voter-targeting operation, Cambridge Analytica – financially supported by reclusive hedge fund magnate and leading Republican donor Robert Mercer – is now using so-called “psychographic profiles” of US citizens in order to help win Cruz votes, despite earlier concerns and red flags from potential survey-takers.

Documents seen by the Guardian have uncovered longstanding ethical and privacy issues about the way academics hoovered up personal data by accessing a vast set of US Facebook profiles, in order to build sophisticated models of users’ personalities.”

By 6 January 2016 The Guardian was reporting on what was likely to turn up in Facebook feeds by way of political advertising:

If you lived in north-east Iowa, the evangelical stronghold where the battle for the soul of conservative American politics will play out in person on Monday, and happened to have given Senator Ted Cruz’s campaign your email address sometime in the last few months, you might find something especially appealing this weekend in your Facebook feed.

Even the most obtuse member of Facebook Inc.'s board or senior management would have been aware that the company was fast becoming an active participant in the US presidential primaries campaign. 

Fast forward to now as the chickens come home to roost.......
Google Search, 3 April 2018

The Guardian, 26 March 2018:

In rejecting the media’s characterisation of this large-scale privacy violation as a “data breach”, Facebook claims “everyone involved” in the 2014 data-siphoning exercise had given their consent. “People knowingly provided their information,” the company claimed. As with its interpretation of the word “clear”, Facebook seems to have a skewed understanding of what “knowingly” really means.

Facebook’s senior executives may now be feeling apologetic, “outraged” even. But in January 2016, as Trump surged in the polls, Facebook’s COO, Sheryl Sandberg, told investors the 2016 election was “a big deal in terms of ad spend”. In other words, a major commercial opportunity. The ability to target voters, she said, was key: “Using Facebook and Instagram ads you can target by congressional district, you can target by interest, you can target by demographics or any combination of those,” she boasted. “And we’re seeing politicians at all levels really take advantage of that targeting.”

It’s perhaps worth remembering, then, that until recently Facebook was encouraging political operatives to take full advantage of its garden of surveillance. And while aspects of the Cambridge Analytica affair may be surprising, and offer a disturbing glimpse into the shadows, the routine exploitation of information about our lives – about who we are – is what’s powering Facebook. It’s the behemoth’s lifeblood.

This was a statement from the U.K. Parliament House of Commons Digital, Culture, Media and Sport Committee on 28 March 2018:

Christopher Wylie gave evidence to the Committee on Tuesday 27th March 2018 during which he referred to the evidence the Committee is publishing today. This session is available to watch. Please note the transcript will be published online shortly.

On Tuesday 20th March, the Committee Chair Damian Collins MP wrote to Mark Zuckerberg, CEO of Facebook, requesting oral evidence. Facebook have responded offering two senior executives. The Committee has accepted evidence from Chris Cox, Chief Product Officer, but has written today to Facebook to clarify whether Mr. Zuckerberg will also appear himself, as requested. This matter was also raised with the UK Prime Minister Theresa May, in her evidence before the Liaison Committee on the evening of the 27th March. She said that Facebook should be taking the matter seriously.

On Thursday 22nd, the Committee wrote to Alexander Nix, the suspended CEO of Cambridge Analytica, recalling him to Parliament to give further evidence. Mr. Nix has agreed to come before the Committee again. You can watch the evidence session that took place on 27th February 2018 where Mr. Nix gave evidence on Parliamentlive.tv and read the transcript.


Wednesday 28 March 2018

Turns out that Facebook Inc is the biggest baddie of all on the Internet


“The FTC is firmly and fully committed to using all of its tools to protect the privacy of consumers. Foremost among these tools is enforcement action against companies that fail to honor their privacy promises, including to comply with Privacy Shield, or that engage in unfair acts that cause substantial injury to consumers in violation of the FTC Act. Companies who have settled previous FTC actions must also comply with FTC order provisions imposing privacy and data security requirements. Accordingly, the FTC takes very seriously recent press reports raising substantial concerns about the privacy practices of Facebook. Today, the FTC is confirming that it has an open non-public investigation into these practices.”  [US Federal Trade Commission (FTC), Statement, 26 March 2018]

It may have been the Cambridge Analytica-Facebook situation, as first set out by Carole Cadwalladr at The Guardian & The Observer (UK), that recently alerted the average Internet user to the issue of digital privacy on social media, and it was certainly the situation that caught the eye of the US Federal Trade Commission, which is now investigating.

The story of that data harvest so far.....

The Guardian UK, 25 March 2018:

The story of how those data made the journey from Facebook’s servers to Cambridge Analytica’s is now widely known. But it is also widely misunderstood. (Many people were puzzled, for example, by Facebook’s vehement insistence that the exfiltration of a huge trove of users’ data was not a “breach”.) The shorthand version of what happened – that “a slug of Facebook data on 50 million Americans was sucked down by a UK academic named Aleksandr Kogan, and wrongly sold to Cambridge Analytica” – misses an important point, which is that in acquiring the data in the first place Kogan was acting with Facebook’s full knowledge and approval.

In 2013, he wrote an app called “Thisisyourdigitallife” which offered users an online personality test, describing itself as “a research app used by psychologists”.
Approximately 270,000 people downloaded it and in doing so gave their consent for Kogan to access information such as the city they set on their profile, or content they had liked, as well as more limited information about friends who had their privacy settings set to allow it. This drew more than 50 million unsuspecting Facebook users into Kogan’s net.

The key point is that all of this was allowed by the terms and conditions under which he was operating. Thousands of other Facebook apps were also operating under similar T&Cs – and had been since 2007, when the company turned its social networking service into an application platform.

So Kogan was only a bit player in the data-hoovering game: apps such as the insanely popular Candy Crush, for example, were also able to collect players’ public profiles, friends lists and email addresses. And Facebook seemed blissfully indifferent to this open door because it was central to its commercial strategy: the more apps there were on its platform the more powerful the network effects would be and the more personal data there would be to monetise.

That’s why the bigger story behind the current controversy is the fact that what Cambridge Analytica claimed to have accomplished would not have been possible without Facebook. Which means that, in the end, Facebook poses the problem that democracies will have to solve. [my yellow highlighting]

However, the app platform is not the only way Facebook is collecting personal information to enrich Zuckerberg and his shareholders.

Now we find out that Facebook Inc is scraping information from Android devices such as mobile phones and adding phone logs to its Big Brother database.

Global News, 25 March 2018:

In the same week Facebook found itself in the middle of a massive data scandal, recent reports indicate that the social media giant has also scraped records of phone calls and SMS data from its users with Android devices without explicit permission.

New Zealand-based software developer Dylan McKay tweeted earlier this week that upon downloading his Facebook data in a zip file (which is an option for all users) he claims to have discovered records of phone calls and historical data on every contact on his phone, including contacts he no longer had, from a period between 2016 and 2017.
After he made the discovery, McKay set up a Google poll to gather evidence from other users who’ve been affected.

So far, just under 900 people have responded to the poll, and more than 20 per cent confirmed they found call records and/or text metadata in their Facebook data archive. Another 74 people responded to the poll saying that MMS data was collected, 106 people responded saying that SMS data was collected, and 104 responded saying that cellular calls were collected.

The story was first published by the tech news website Ars Technica on Saturday, which interviewed several Facebook users and had a member of its staff download their Facebook data archive. Following this, the site confirmed that the data file downloaded by the staff member contained call logs from a device that individual used between 2015 and 2016, as well as SMS and MMS message data.

Several Global News staff members also requested their data archives in preparation for this story, and some found that the contact lists from their mobile devices were recorded in the file. No one noted any text message or call logs in the data files they downloaded.

Ars Technica reached out to Facebook for comment before the publication of its story; Facebook said that the practice was a common one among social networking and messaging apps.
“The most important part of apps and services that help you make connections is to make it easy to find the people you want to connect with. So, the first time you sign in on your phone to a messaging or social app, it’s a widely used practice to begin by uploading your phone contacts.”

Following McKay’s tweets, other users came out on social media expressing similar concerns about what they discovered after downloading their data archives.

In recent years, the company has updated this process to clarify that when requesting access to your contact list, it intends to access all call logs and SMS text messages as well, but Android users in the past may have unknowingly given Facebook access to this data. [my yellow highlighting]
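For readers who want to check their own export the same way McKay did, the archive can be inspected with a few lines of Python. This is a minimal sketch: the entry names and hint strings below are assumptions for illustration, not Facebook's documented export layout, and a stand-in archive is built in memory so the example runs on its own.

```python
import io
import json
import zipfile

# Guessed name fragments that would suggest call/SMS/contact data;
# real exports vary by account and by export date.
CALL_LOG_HINTS = ("call", "sms", "mms", "contact")

def find_call_records(archive):
    """Return entry names in a data-export zip that look like
    call, SMS/MMS, or contact data, based on the hints above."""
    with zipfile.ZipFile(archive) as zf:
        return [name for name in zf.namelist()
                if any(hint in name.lower() for hint in CALL_LOG_HINTS)]

# Build a tiny stand-in archive so the sketch runs without a real export.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("photos/holiday.jpg", b"...")
    zf.writestr("calls_and_messages/call_log.json",
                json.dumps([{"number": "555-0100"}]))

print(find_call_records(buf))  # -> ['calls_and_messages/call_log.json']
```

On a real export you would pass the downloaded zip's path instead of the in-memory buffer; anything the scan surfaces can then be opened and read as JSON.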

It is also wise to remember that even Internet users who do not have a Facebook account have their PC or other digital device scanned for information each time they click on a link to Facebook.



Facebook image via ZDNet, 3 January 2014

ZDNet on 3 January 2014: By "content" Facebook means “anything you or other users post on Facebook”. By "information" Facebook means “facts and other information about you, including actions taken by users and non-users who interact with Facebook”. [my yellow highlighting]

Nor should we ignore this report about Facebook's surreptitious activities.......

Law360 (March 2, 2018, 7:02 PM EST) -- A California federal judge held Friday that Facebook can’t shake a proposed class action over its allegedly unlawful collection and storage of non-users’ facial scans, declining to toss the matter for lack of standing, just as he recently did in a related suit involving users of the site.

U.S. District Judge James Donato rejected Facebook Inc.’s renewed motion to dismiss litigation led by Frederick William Gullen for lack of subject-matter jurisdiction, pointing to his Feb. 26 decision in a related proposed class action accusing the social media... 
[my yellow highlighting]

Then there is the lobbying to discourage federal regulation of Facebook.......

According to SOCIAL MEDIA CASEROUNDUP (selected cases) in April 2015, by 2013 Facebook Inc had spent more than US$1 million on lobbying efforts to water down the US Children's Online Privacy Protection Act (COPPA). It was particularly concerned about any change in the status of third-party "add ons"/"plug-ins" which might by default make platforms like Facebook legally liable for any harm to a minor that occurred, as well as being resistant to any increase in general protections for minors or any expanded definition of protected "personal information" being included in the Act.

Quartz, 22 March 2018:

Facebook CEO Mark Zuckerberg said yesterday that the company welcomes more regulation, particularly to bring transparency to political advertising online. But in recent months, Facebook has been quietly fighting lawmakers to keep them from passing an act that does exactly that, campaign transparency advocates and Congressional staff tell Quartz.

The Honest Ads Act was introduced last October to close a loophole that has existed since politicians started advertising on the internet, and was expected by many to sail through Congress. Coming as Congress investigated how Russia used tech companies to influence the 2016 election, it was considered by many in Washington DC to be the bare minimum lawmakers could do to address the problem.

The act introduces disclosure and disclaimer rules to online political advertising. Tech companies would have to keep copies of election ads, and make them available to the public. The ads would also have to contain disclaimers similar to those included in TV or print political ads, informing voters who paid for the ad, how much, and whom they targeted.

“The benefit of having disclaimers on all political ads [is] the more suspicious ads would be more identifiable,” said Brendan Fischer, the director of federal and Federal Election Commission reform at the Campaign Legal Center (CLC) in Washington.

In a vote of confidence from bitterly-divided Washington, the act was rolled out by a bipartisan group of senators—John McCain, the Republican from Arizona, and Democrats Amy Klobuchar from Minnesota and Mark Warner of Virginia—and it currently has the support of 18 senators. But it hasn’t moved from the committee on “Rules and Administration” since it was first introduced, thanks in part to Facebook’s lobbying efforts.

Fischer, who is a co-author of a CLC report on US vulnerabilities online after the 2016 election, accuses Facebook of “working behind the scenes using the levers of power to stop any legislation from moving forward.”

Facebook’s lobbying clout

Lobbyists for the company have been trying to dissuade senators from moving the Honest Ads Act forward, some Congressional aides say.

Facebook’s argument to Congress behind the scenes has been that they are “voluntarily complying” with most of what the Honest Ads Act asks, so why pass a law, said one Congressional staffer working on the bill. Facebook also doesn’t want to be responsible for maintaining the publicly accessible repository of political advertising, including funding information, that the act demands, the staffer said.

Facebook spent nearly $3.1 million lobbying Congress and other US federal government agencies in the last quarter of 2017, on issues including the Honest Ads Act, according to its latest federal disclosure form. It also signed on Blue Mountain Strategies, a lobbying firm founded by Warner’s former chief of staff, an Oct. 30, 2017 filing shows.

It’s part of a massive uptick in lobbying spending in recent years. [my yellow highlighting]

Despite all its lobbying, Facebook Inc is not immune from official censure for its deceptive business practices.

Take this analysis of a 2011 binding agreement between the US Federal Trade Commission and Facebook Inc.....


FEDERAL TRADE COMMISSION [File No. 092 3184], 2 December 2011:

The Federal Trade Commission has accepted, subject to final approval, a consent agreement from Facebook, Inc. (‘‘Facebook’’)……

The Commission’s complaint alleges eight violations of Section 5(a) of the FTC Act, which prohibits deceptive and unfair acts or practices in or affecting commerce, by Facebook:

* Facebook’s Deceptive Privacy Settings: Facebook communicated to users that they could restrict certain information they provided on the site to a limited audience, such as ‘‘Friends Only.’’ In fact, selecting these categories did not prevent users’ information from being shared with Apps that their Friends used.

* Facebook’s Deceptive and Unfair December 2009 Privacy Changes: In December 2009, Facebook changed its site so that certain information that users may have designated as private— such as a user’s Friend List —was made public, without adequate disclosure to users. This conduct was also unfair to users.

* Facebook’s Deception Regarding App Access: Facebook represented to users that whenever they authorized an App, the App would only access the information of the user that it needed to operate. In fact, the App could access nearly all of the user’s information, even if unrelated to the App’s operations. For example, an App that provided horoscopes for users could access the user’s photos or employment information, even though there is no need for a horoscope App to access such information. 

* Facebook’s Deception Regarding Sharing with Advertisers: Facebook promised users that it would not share their personal information with advertisers; in fact, Facebook did share this information with advertisers when a user clicked on a Facebook ad.

* Facebook’s Deception Regarding Its Verified Apps Program: Facebook had a ‘‘Verified Apps’’ program through which it represented that it had certified the security of certain Apps when, in fact, it had not. 

* Facebook’s Deception Regarding Photo and Video Deletion: Facebook stated to users that, when they deactivate or delete their accounts, their photos and videos would be inaccessible. In fact, Facebook continued to allow access to this content even after a user deactivated or deleted his or her account.

* Safe Harbor: Facebook deceptively stated that it complied with the U.S.-EU Safe Harbor Framework, a mechanism by which U.S. companies may transfer data from the European Union to the United States consistent with European law.

The proposed order contains provisions designed to prevent Facebook from engaging in practices in the future that are the same or similar to those alleged in the complaint.

Part I of the proposed order prohibits Facebook from misrepresenting the privacy or security of ‘‘covered information,’’ as well as the company’s compliance with any privacy, security, or other compliance program, including but not limited to the U.S.-EU Safe Harbor Framework. ‘‘Covered information’’ is defined broadly as ‘‘information from or about an individual consumer, including but not limited to: 
(a) A first or last name; 
(b) a home or other physical address, including street name and name of city or town; 
(c) an email address or other online contact information, such as an instant messaging user identifier or a screen name; 
(d) a mobile or other telephone number; 
(e) photos and videos; 
(f) Internet Protocol (‘‘IP’’) address, User ID, or other persistent identifier; 
(g) physical location; or 
(h) any information combined with any of (a) through (g) above.’’

Part II of the proposed order requires Facebook to give its users a clear and prominent notice and obtain their affirmative express consent before sharing their previously-collected information with third parties in any way that materially exceeds the restrictions imposed by their privacy settings. A ‘‘material . . . practice is one which is likely to affect a consumer’s choice of or conduct regarding a product.’’ FTC Policy Statement on Deception, Appended to Cliffdale Associates, Inc., 103 F.T.C. 110, 174 (1984).

Part III of the proposed order requires Facebook to implement procedures reasonably designed to ensure that a user’s covered information cannot be accessed from Facebook’s servers after a reasonable period of time, not to exceed thirty (30) days, following a user’s deletion of his or her account.

Part IV of the proposed order requires Facebook to establish and maintain a comprehensive privacy program that is reasonably designed to: 
(1) Address privacy risks related to the development and management of new and existing products and services, and 
(2) protect the privacy and confidentiality of covered information. The privacy program must be documented in writing and must contain controls and procedures appropriate to Facebook’s size and complexity, the nature and scope of its activities, and the sensitivity of covered information. Specifically, the order requires Facebook to:
* Designate an employee or employees to coordinate and be responsible for the privacy program;
* Identify reasonably-foreseeable, material risks, both internal and external, that could result in the unauthorized collection, use, or disclosure of covered information and assess the sufficiency of any safeguards in place to control these risks;
* Design and implement reasonable controls and procedures to address the risks identified through the privacy risk assessment and regularly test or monitor the effectiveness of these controls and procedures;
* Develop and use reasonable steps to select and retain service providers capable of appropriately protecting the privacy of covered information they receive from respondent, and require service providers by contract to implement and maintain appropriate privacy protections; and
* Evaluate and adjust its privacy program in light of the results of the testing and monitoring, any material changes to its operations or business arrangements, or any other circumstances that it knows or has reason to know may have a material impact on the effectiveness of its privacy program.

Part V of the proposed order requires that Facebook obtain within 180 days, and every other year thereafter for twenty (20) years, an assessment and report from a qualified, objective, independent third-party professional, certifying, among other things, that it has in place a privacy program that provides protections that meet or exceed the protections required by Part IV of the proposed order; and its privacy controls are operating with sufficient effectiveness to provide reasonable assurance that the privacy of covered information is protected. 

Parts VI through X of the proposed order are reporting and compliance provisions. Part VI requires that Facebook retain all ‘‘widely disseminated statements’’ that describe the extent to which respondent maintains and protects the privacy, security, and confidentiality of any covered information, along with all materials relied upon in making such statements, for a period of three (3) years. Part VI further requires Facebook to retain, for a period of six (6) months from the date received, all consumer complaints directed at Facebook, or forwarded to Facebook by a third party, that relate to the conduct prohibited by the proposed order, and any responses to such complaints. Part VI also requires Facebook to retain for a period of five (5) years from the date received, documents, prepared by or on behalf of Facebook, that contradict, qualify, or call into question its compliance with the proposed order. Part VI additionally requires Facebook to retain for a period of three (3) years, each materially different document relating to its attempt to obtain the affirmative express consent of users referred to in Part II, along with documents and information sufficient to show each user’s consent and documents sufficient to demonstrate, on an aggregate basis, the number of users for whom each such privacy setting was in effect at any time Facebook has attempted to obtain such consent. 
Finally, Part VI requires that Facebook retain all materials relied upon to prepare the third-party assessments for a period of three (3) years after the date that each assessment is prepared. 

Part VII requires dissemination of the order now and in the future to principals, officers, directors, and managers, and to all current and future employees, agents, and representatives having supervisory responsibilities relating to the subject matter of the order. Part VIII ensures notification to the FTC of changes in corporate status. Part IX mandates that Facebook submit an initial compliance report to the FTC and make available to the FTC subsequent reports. Part X is a provision ‘‘sunsetting’’ the order after twenty (20) years, with certain exceptions.

The purpose of the analysis is to aid public comment on the proposed order. It is not intended to constitute an official interpretation of the complaint or proposed order, or to modify the proposed order’s terms in any way. 

By direction of the Commission. 
Donald S. Clark, Secretary. [FR Doc. 2011–31158 Filed 12–2–11; 8:45 am] [my yellow highlighting]

Wednesday 21 March 2018

The large-scale personal data release Facebook Inc didn't tell the world about



“Back in 2004, when a 19-year-old Zuckerberg had just started building Facebook, he sent his Harvard friends a series of instant messages in which he marvelled at the fact that 4,000 people had volunteered their personal information to his nascent social network. “People just submitted it ... I don’t know why ... They ‘trust me’ ... dumb fucks.”  [The Guardian, 21 March 2018]

“Christopher Wylie, who worked for data firm Cambridge Analytica, reveals how personal information was taken without authorisation in early 2014 to build a system that could profile individual US voters in order to target them with personalised political advertisements. At the time the company was owned by the hedge fund billionaire Robert Mercer, and headed at the time by Donald Trump’s key adviser, Steve Bannon. Its CEO is Alexander Nix”  [The Guardian,18 March 2018]

Alexander James Ashburner Nix is listed by Companies House UK as the sole director and CEO of Cambridge Analytica (UK) Limited (formerly SCL USA Limited, incorporated 6 January 2015). The majority of shares in the company are controlled by SCL Elections Limited (incorporated 17 October 2012), whose sole director and shareholder appears to be Alexander Nix. Mr. Nix in his own name is also a shareholder in Cambridge Analytica (UK) Limited.

Companies House lists ten companies with which Mr. Nix is associated.

NOTE: In July 2014 an Alastair Carmichael Macwillson incorporated Cambridge Analytica Limited, a company which is still active. Macwillson styles himself as a management consultant and cyber security professional.

Nix's Cambridge Analytica was reported as indirectly financed by leading Republican donor Robert Mercer during the 2015 primaries and 2016 US presidential campaign.

On 15 December 2017 The Wall Street Journal reported that:

Special Counsel Robert Mueller has requested that Cambridge Analytica, a data firm that worked for President Donald Trump’s campaign, turn over documents as part of its investigation into Russian interference in the 2016 U.S. election, according to people familiar with the matter.

Concerns about Cambridge Analytica and its relationship with Facebook Inc. resurfaced this month.

The Guardian, 18 March 2018:

The data analytics firm that worked with Donald Trump’s election team and the winning Brexit campaign harvested millions of Facebook profiles of US voters, in one of the tech giant’s biggest ever data breaches, and used them to build a powerful software program to predict and influence choices at the ballot box….

Documents seen by the Observer, and confirmed by a Facebook statement, show that by late 2015 the company had found out that information had been harvested on an unprecedented scale. However, at the time it failed to alert users and took only limited steps to recover and secure the private information of more than 50 million individuals.

Recode, 17 March 2018:

Facebook is in another awkward situation. The company claims that it wasn’t breached, and that while it has suspended Cambridge Analytica from its service, the social giant is not at fault. Facebook contends that its technology worked exactly how Facebook built it to work, but that bad actors, like Cambridge Analytica, violated the company’s terms of service.

On the other hand, Facebook has since changed those terms of service to cut down on information third parties can collect, essentially admitting that its prior terms weren’t very good.

So how did Cambridge Analytica get Facebook data on some 50 million people?
Facebook’s Chief Security Officer, Alex Stamos, tweeted a lengthy defense of the company, which also included a helpful explanation for how this came about…..

Facebook offers a number of technology tools for software developers, and one of the most popular is Facebook Login, which lets people simply log in to a website or app using their Facebook account instead of creating new credentials. People use it because it’s easy — usually one or two taps — and eliminates the need for people to remember a bunch of unique username and password combinations.

When people use Facebook Login, though, they grant the app’s developer a range of information from their Facebook profile — things like their name, location, email or friends list. This is what happened in 2015, when a Cambridge University professor named Dr. Aleksandr Kogan created an app called “thisisyourdigitallife” that utilized Facebook’s login feature. Some 270,000 people used Facebook Login to create accounts, and thus opted in to share personal profile data with Kogan.

Back in 2015, though, Facebook also allowed developers to collect some information on the friend networks of people who used Facebook Login. That means that while a single user may have agreed to hand over their data, developers could also access some data about their friends. This was not a secret — Facebook says it was documented in their terms of service — but it has since been updated so that this is no longer possible, at least not at the same level of detail.

Through those 270,000 people who opted in, Kogan was able to get access to data from some 50 million Facebook users, according to the Times. That data trove could have included information about people’s locations and interests, and more granular stuff like photos, status updates and check-ins.

The Times found that Cambridge Analytica’s data for “roughly 30 million [people] contained enough information, including places of residence, that the company could match users to other records and build psychographic profiles.”

This all happened just as Facebook intended for it to happen. All of this data collection followed the company’s rules and guidelines.

Things became problematic when Kogan shared this data with Cambridge Analytica. Facebook contends this is against the company’s terms of service. According to those rules, developers are not allowed to “transfer any data that you receive from us (including anonymous, aggregate, or derived data) to any ad network, data broker or other advertising or monetization-related service.”

As Stamos tweeted out Saturday (before later deleting the tweet): “Kogan did not break into any systems, bypass any technical controls, or use a flaw in our software to gather more data than allowed. He did, however, misuse that data after he gathered it, but that does not retroactively make it a ‘breach.’”….

The problem here is that Facebook gives a lot of trust to the developers who use its software features. The company’s terms of service are an agreement in the same way any user agrees to use Facebook: The rules represent a contract that Facebook can use to punish someone, but not until after that someone has already broken the rules.

CNN tech, 19 March 2018:

Kogan's company provided data on millions of Americans to Cambridge Analytica beginning in 2014. The data was gathered through a personality test Facebook application built by Kogan. When Facebook users took the test they gave Kogan access to their data, including demographic information about them like names, locations, ages and genders, as well as their page "likes," and some of their Facebook friends' data.

There is some evidence that Cambridge Analytica is a bad actor, according to a report by Channel 4 News on 19 March 2018:

Senior executives at Cambridge Analytica – the data company that credits itself with Donald Trump’s presidential victory – have been secretly filmed saying they could entrap politicians in compromising situations with bribes and Ukrainian sex workers.

In an undercover investigation by Channel 4 News, the company’s chief executive Alexander Nix said the British firm secretly campaigns in elections across the world. This includes operating through a web of shadowy front companies, or by using sub-contractors.

In one exchange, when asked about digging up material on political opponents, Mr Nix said they could “send some girls around to the candidate’s house”, adding that Ukrainian girls “are very beautiful, I find that works very well”.

In another he said: “We’ll offer a large amount of money to the candidate, to finance his campaign in exchange for land for instance, we’ll have the whole thing recorded, we’ll blank out the face of our guy and we post it on the Internet.”

Offering bribes to public officials is an offence under both the UK Bribery Act and the US Foreign Corrupt Practices Act. Cambridge Analytica operates in the UK and is registered in the United States.

The admissions were filmed at a series of meetings at London hotels over four months, between November 2017 and January 2018. An undercover reporter for Channel 4 News posed as a fixer for a wealthy client hoping to get candidates elected in Sri Lanka.

Mr Nix told our reporter: “…we’re used to operating through different vehicles, in the shadows, and I look forward to building a very long-term and secretive relationship with you.”

Along with Mr Nix, the meetings also included Mark Turnbull, the managing director of CA Political Global, and the company’s chief data officer, Dr Alex Tayler.

Mr Turnbull described how, having obtained damaging material on opponents, Cambridge Analytica can discreetly push it onto social media and the internet.

He said: “… we just put information into the bloodstream of the internet, and then, and then watch it grow, give it a little push every now and again… like a remote control. It has to happen without anyone thinking, ‘that’s propaganda’, because the moment you think ‘that’s propaganda’, the next question is, ‘who’s put that out?’.”

It should be noted that Cambridge Analytica has set up shop in Australia, and the person named in the filing documents as the sole shareholder was Allan Lorraine. Cambridge Analytica is said to have met with representatives of the Federal Liberal Party in March 2017.

Despite denials to the contrary, it is possible that Cambridge Analytica has been consulted by state and federal Liberals since mid-2015 and, along with i360, was consulted by South Australian Liberals on targeted campaigning for their 2018 election strategy.

Once the possibility of an Australian connection became known, the Australian Information and Privacy Commissioner made preliminary inquiries.

News.com.au, 20 March 2018:

Facebook could be fined if Australians' personal information was given to controversial researchers Cambridge Analytica, the privacy watchdog says.

Australian Information and Privacy Commissioner Timothy Pilgrim says he is aware profile information was taken and used without authorisation.

"My office is making inquiries with Facebook to ascertain whether any personal information of Australians was involved," Mr Pilgrim said on Tuesday.

"I will consider Facebook's response and whether any further regulatory action is required."

Cambridge Analytica is facing claims it used data from 50 million Facebook users to develop controversial political campaigns for Donald Trump and others.

The Privacy Act allows the commissioner to apply to the courts for a civil penalty order if it finds serious breaches of the law......

UK Information Commissioner Elizabeth Denham is also investigating the breach, promising it will be "far reaching" and any criminal or civil enforcement actions arising from it would be "pursued vigorously".

Facebook Inc's initial response to this issue was a denial of responsibility, which did not play well in financial markets.

The Guardian, 21 March 2018:

It appears that while Facebook had been aware of what the Observer described as “unprecedented data harvesting” for two years, it did not notify the affected users.

What’s more, Facebook has displayed a remarkable lack of contrition in the immediate aftermath of the Observer’s revelations. Instead of accepting responsibility, its top executives argued on Twitter that the social network had done nothing wrong. “This was unequivocally not a data breach,” Facebook vice-president Andrew Bosworth tweeted on Saturday. “People chose to share their data with third party apps and if those third party apps did not follow the data agreements with us/users it is a violation. No systems were infiltrated, no passwords or information were stolen or hacked.”

In a sense, Facebook’s defence to the Cambridge Analytica story was more damning than the story itself. Tracy Chou, a software engineer who has interned at Facebook and worked at a number of prominent Silicon Valley companies, agrees that there wasn’t a hack or breach of Facebook’s security. Rather, she explains, “this is the way that Facebook works”. The company’s business model is to collect, share and exploit as much user data as possible; all without informed consent. Cambridge Analytica may have violated Facebook’s terms of service, but Facebook had no safeguards in place to stop them.

While some Facebook executives were busy defending their honour on Twitter over the weekend, it should be noted that Zuckerberg remained deafeningly silent. On Monday, Facebook’s shares dropped almost 7%, taking $36bn (£25.7bn) off the company’s valuation. Still, Zuckerberg remained silent. If you’re going to build a service that is influential and that a lot of people rely on, then you need to be mature, right? Apparently, silence is Zuck’s way of being mature.