Kreller Hot Topic Report | Facebook, Privacy Awareness and Investigations


By Lauren Caryer, PhD

Privacy in the Spotlight
July 24, 2019 may prove to be a watershed day for privacy advocates. That day, the Federal Trade Commission announced a staggering $5 billion civil penalty against Facebook over breaches of a 2012 FTC order regarding the company’s user privacy settings, along with a suit against the data analytics company Cambridge Analytica for allegedly employing “deceptive tactics to harvest personal information from tens of millions of Facebook users for voter profiling and targeting.” Also on July 24, separate from its settlement with the FTC, Facebook agreed to a $100 million settlement with the US Securities and Exchange Commission “for making misleading disclosures regarding the risk of misuse of Facebook user data.” The SEC alleged that following its 2015 discovery of data misuse by the third-party developer Cambridge Analytica, Facebook continued to present such risks as merely hypothetical until March 2018. On the same day that the SEC and FTC announced these settlements, Netflix released the documentary The Great Hack, tying the alleged details of the Cambridge Analytica scandal to broader issues surrounding data mining and its sociopolitical implications. What’s more, Facebook’s regulatory woes seem far from over; the company is embroiled in similar investigations by the Privacy Commissioner of Canada and the Irish Data Protection Commission, which is currently fielding eleven investigations into the social media company’s potential violations of European data privacy regulations (the GDPR). According to an August 1, 2019 report from The Wall Street Journal, the FTC and Department of Justice have also launched separate antitrust investigations into Facebook, examining whether the company’s acquisition practices “were part of a campaign to snap up potential rivals to head off competitive threats.”

The FTC’s $5 billion complaint and settlement order against Facebook, Inc. follow from a 2012 FTC order regarding Facebook’s application programming interface, “Graph API,” which allowed third-party developers to access a wide swath of data regarding app users and, notably, their friends, including dates of birth, employment history, education history, relationships, religious and political views, hometown, current town, interests, activities, and photos. The 2012 complaint alleged that Facebook misled its users by placing the opt-out settings relating to third-party developers outside of the main Privacy Settings page, leading Facebook’s users to believe that the selections chosen on the Privacy Settings page would also apply to access by third-party developers. In August of 2012, the FTC ordered that Facebook cease misrepresenting the means by which consumers could control privacy settings with relation to third-party developers; however, as alleged in the 2019 complaint, Facebook continued to bury information regarding third-party developers’ access to consumer data and the data of their friends.

According to the FTC, by August 2013, “Facebook was aware of the privacy risks posed by allowing millions of third-party developers to access and collect Affected Friend data” through the Graph API. Facebook subsequently commissioned an audit of its third-party apps and found that “third-party developers were making more than 800 billion calls to the API per month and noted that permissions for Affected Friends’ data were being widely misused” [emphasis in complaint]. According to the complaint, Facebook ultimately decided to discontinue third-party developers’ access to data belonging to app users’ friends, announcing this decision on April 30, 2014 as part of a campaign to give “people power and control over how they share their data with apps.” However, unbeknownst to its users, Facebook continued to give pre-existing apps access to friend data for a full year following these statements. Reportedly, some so-called “Whitelisted Developers” were provided Graph API access without consumer knowledge through June 2018. According to the FTC’s related complaint against Cambridge Analytica, LLC, University of Cambridge researcher Aleksandr Kogan controlled one of these whitelisted apps through his company, Global Science Research, Ltd., which would go on to provide Graph API data to Cambridge Analytica and its UK-based parent, SCL Group Ltd. Kogan’s app, originally developed as part of the University of Cambridge’s Prosociality and Well-Being Lab, operated in conjunction with “an algorithm that could predict an individual’s personality based on the individual’s ‘likes’ of public Facebook pages.” Reportedly, over the course of the project, which ended in May 2015, Kogan’s app harvested data from 250,000-270,000 users and 60-65 million of those users’ friends. The FTC alleged that this data was collected “through false and deceptive means” in violation of the EU-US Privacy Shield framework.

The Federal Trade Commission touted the $5 billion penalty against Facebook as “the largest ever imposed on any company for violating consumers’ privacy and almost 20 times greater than the largest privacy or data security penalty ever imposed worldwide.” Indeed, the FTC fine eclipses a January 2019 fine of €50 million ($57 million) imposed by France’s National Data Protection Commission (CNIL) on Google for similarly opaque data collection practices. In the accompanying press release, FTC Chairman Joe Simons stated that the “unprecedented” fine is meant to demonstrate that “The Commission takes consumer privacy seriously.” The fine also coincides with the public’s increasing awareness of both the value of personal identifying information and the precarious control individuals are able to exercise over this information. This awareness has culminated in the General Data Protection Regulation (GDPR) in Europe, the Digital Privacy Act in Canada, and increasingly vociferous calls from tech leaders, including Tim Cook, for similar legislation in the United States. As discussed in a March 29, 2019 post by The National Law Review, members of Congress appear increasingly concerned with developing data privacy legislation.

Investigative Challenges and Strategies
As might be expected, the increased focus on digital privacy presents a number of challenges for online open source investigations. On the most general level, individuals have grown more circumspect about the information they provide online, ranging from abiding by the SEC’s advice to limit the amount of biographical and identifying information made public on social media, to a growing interest, perpetuated by tech theorists such as Jaron Lanier, in deleting social media accounts altogether. More specifically, privacy concerns have altered some common tools used in open source research. As discussed in a June 10, 2019 Vice article, Facebook has recently disabled a search feature known as “Graph Search,” with which investigators could construct very specific Facebook searches so as to identify, for example, overlapping check-ins by two Facebook users or all the photos commented on by a user. As noted in the article, while such features have been abused by bad actors, they have also been used by journalists and members of the open source intelligence (OSINT) community to investigate everything from sex trafficking to airstrikes in Yemen. Domain name registration research is another invaluable tool for open source investigations: a record of a domain registrant can provide breadcrumbs linking a known website to an unknown individual or business, thus providing additional insight into a subject entity’s business affiliations. However, in the past year, many registrars have chosen to redact domain registrant (WHOIS) information, including the registrant’s name, address, telephone, and contact email, in compliance with European GDPR guidelines.

However, such challenges need not pose a serious threat to the viability of open source investigations, particularly in the field of corporate compliance. While some governments are moving to protect consumer privacy, many are simultaneously acting to increase corporate transparency, especially in response to concerns regarding money laundering. These moves toward transparency are a boon to investigations operating beyond the parameters of social media. For example, as noted by the BBC on June 19, 2019, the crown dependencies of Jersey, Guernsey, and the Isle of Man have announced that they will be taking steps to make their corporate beneficial ownership registers public by 2023. Identifying information regarding the owners and directors of certain types of private companies is already available in many European corporate registries. Land/deed registries and legal filings can also be rich sources of publicly available information, both in the United States and abroad. In the U.S., due diligence research in the service of fraud prevention or detection is permitted under the Driver’s Privacy Protection Act (DPPA) and the Gramm-Leach-Bliley Act (GLBA), providing additional avenues of research to licensed and vetted firms. Finally – and especially in jurisdictions where public information is thin – proper due diligence may call for the human touch, i.e. contacting references, visiting a business to verify its operations, conducting character inquiries, etc. A Facebook profile may tell you what an individual “likes,” but with a few discreet phone calls, a good investigator can get a sense of what an individual is like.

The Kreller Hot Topics Report is a monthly publication dedicated to insights on international issues and incidents.

About Kreller Group

The Kreller Companies were founded in 1988 by a former D&B national account manager who envisioned a straightforward and cost-effective way to conduct business investigations and share results with clients. Today the Kreller Companies comprise Kreller Group, Kreller Credit and Kreller Consulting.

Want to discuss how our expertise can help?
