Campbell et al v. Facebook Inc.
Filing
162
Exhibits in Support of 147 Administrative Motion to File Under Seal and Documents in Support of Facebooks Opposition to Plaintiffs Motion for Class Certification (Dkt. 149) filed by Facebook Inc.. (Attachments: # 1 Replacement for Dkt. 147-1 (Declaration of Nikki Stitt Sokol In Support Of Defendant Facebook, Inc.s Administrative Motion to File Documents in Support of its Opposition to Plaintiffs Motion for Class Certification Under Seal), # 2 Replacement for Dkt. 147-2 ([Proposed] Order Authorizing the Filing of Documents Under Seal), # 3 Replacement for Dkt. 147-5 ((Exhibit 3) Unredacted Chorba Declaration Motion to Seal), # 4 Replacement for Dkt. 147-6 ((Exhibit 4) Redacted Chorba Declaration Motion to Seal), # 5 Replacement for Dkt. 149-1 (Redacted Chorba Declaration Opposition to Class Certification), # 6 Replacement for Dkt. 149-7 (Redacted Expert Report of Dr. Catherine Tucker Opposition to Class Certification), # 7 Replacement for Dkt. 154-11 ((Exhibit 66) Unredacted Tucker Expert Report Motion to Seal), # 8 Replacement for Dkt. 154-12 ((Exhibit 67) Redacted Tucker Expert Report Motion to Seal), # 9 Replacement for Dkt. 155-1 (Unredacted Appendix of Evidence (Part 1 of 19) Motion to Seal), # 10 Replacement for Dkt. 156-9 (Unredacted Appendix of Evidence (Part 19 of 19) Motion to Seal), # 11 Replacement for Dkt. 157-1 (Redacted Appendix of Evidence (Part 1 of 13) Opposition to Class Certification), # 12 Replacement for Dkt. 157-13 (Unredacted Appendix of Evidence (Part 13 of 13)) Motion to Seal))(Chorba, Christopher) (Filed on 1/22/2016) Modified on 1/22/2016 (vlkS, COURT STAFF).
Replacement for
Dkt. 157-1
GIBSON, DUNN & CRUTCHER LLP
JOSHUA A. JESSEN, SBN 222831
JJessen@gibsondunn.com
JEANA BISNAR MAUTE, SBN 290573
JBisnarMaute@gibsondunn.com
PRIYANKA RAJAGOPALAN, SBN 278504
PRajagopalan@gibsondunn.com
ASHLEY M. ROGERS, SBN 286252
ARogers@gibsondunn.com
1881 Page Mill Road
Palo Alto, California 94304
Telephone: (650) 849-5300
Facsimile: (650) 849-5333
GIBSON, DUNN & CRUTCHER LLP
CHRISTOPHER CHORBA, SBN 216692
CChorba@gibsondunn.com
333 South Grand Avenue
Los Angeles, California 90071
Telephone: (213) 229-7000
Facsimile: (213) 229-7520
Attorneys for Defendant
FACEBOOK, INC.
UNITED STATES DISTRICT COURT
NORTHERN DISTRICT OF CALIFORNIA
OAKLAND DIVISION
MATTHEW CAMPBELL and MICHAEL HURLEY,
Plaintiffs,
v.
FACEBOOK, INC.,
Defendant.

Case No. C 13-05996 PJH
PUTATIVE CLASS ACTION

APPENDIX OF EVIDENCE IN SUPPORT OF DEFENDANT FACEBOOK, INC.’S OPPOSITION TO PLAINTIFFS’ MOTION FOR CLASS CERTIFICATION

HEARING:
Date: March 16, 2016
Time: 9:00 a.m.
Place: Courtroom 3, 3rd Floor
The Honorable Phyllis J. Hamilton
GIBSON, DUNN & CRUTCHER LLP
JOSHUA A. JESSEN, SBN 222831
JJessen@gibsondunn.com
JEANA BISNAR MAUTE, SBN 290573
JBisnarMaute@gibsondunn.com
PRIYANKA RAJAGOPALAN, SBN 278504
PRajagopalan@gibsondunn.com
ASHLEY M. ROGERS, SBN 286252
ARogers@gibsondunn.com
1881 Page Mill Road
Palo Alto, California 94304
Telephone: (650) 849-5300
Facsimile: (650) 849-5333
GIBSON, DUNN & CRUTCHER LLP
CHRISTOPHER CHORBA, SBN 216692
CChorba@gibsondunn.com
333 South Grand Avenue
Los Angeles, California 90071
Telephone: (213) 229-7000
Facsimile: (213) 229-7520
Attorneys for Defendant
FACEBOOK, INC.
UNITED STATES DISTRICT COURT
NORTHERN DISTRICT OF CALIFORNIA
OAKLAND DIVISION
MATTHEW CAMPBELL and MICHAEL HURLEY,
Plaintiffs,
v.
FACEBOOK, INC.,
Defendant.

Case No. C 13-05996 PJH

DECLARATION OF CHRISTOPHER CHORBA IN SUPPORT OF DEFENDANT FACEBOOK, INC.’S OPPOSITION TO PLAINTIFFS’ MOTION FOR CLASS CERTIFICATION
I, Christopher Chorba, declare as follows:
1. I am an attorney admitted to practice law before this Court. I am a partner in the law firm of Gibson, Dunn & Crutcher LLP, and I am one of the attorneys responsible for representing Defendant Facebook, Inc. (“Facebook”) in the above-captioned action. I submit this declaration in support of Facebook’s Opposition to Plaintiffs’ Motion for Class Certification (Dkt. 138). Unless otherwise stated, the following facts are within my personal knowledge and, if called and sworn as a witness, I could and would testify competently to these facts.
I. Demonstratives

2. Attached as Exhibits A–D are demonstrative graphics regarding the named plaintiffs and challenged practices.*
a. Attached as Exhibit A is a chart summarizing a number of individualized issues concerning the named Plaintiffs and some putative class members.
b. Attached as Exhibit B is a graphical representation of the steps required to send and receive a Facebook message with a URL preview attachment.
c. Attached as Exhibit C are graphical representations of the individualized inquiries related to ascertainability.
d. Attached as Exhibit D are charts summarizing the variability for the challenged practices.

3. Facebook and its messaging service have often been the subject of public news reports, blog posts, and other publications. Attached as Exhibit E is a chart summarizing seventy-seven publicly available online publications, including, inter alia, news reports, articles, editorials, and Facebook developer documentation, published between May 6, 2009 and August 7, 2013. Attached as Exhibits F, G, H, I, J, and K are the corresponding seventy-seven publications, arranged by Bates numbers FB000000066 to FB000000424 and produced by Facebook during this litigation.

* For the Court’s convenience, and to avoid duplication in the numbering of the exhibits submitted by Plaintiffs, Facebook has used letters rather than numbers to designate its exhibits.
II. Discovery Requests And Responses From Plaintiffs

A. Plaintiffs’ Deposition Testimony

4. Attached as Exhibit L is a true and correct copy of relevant excerpts of the deposition transcript of Plaintiff Matthew Campbell on May 19, 2015.

5. Attached as Exhibit M is a true and correct copy of relevant excerpts of the deposition transcript of Plaintiff Michael Hurley on July 9, 2015.

6. Attached as Exhibit N is a true and correct copy of relevant excerpts of the deposition transcript of Mr. David Shadpour on October 1, 2015.

B. Plaintiffs’ Written Discovery Responses

7. Attached as Exhibit O is a true and correct copy of Plaintiff Campbell’s Corrected Objections and Responses to Defendant Facebook, Inc.’s First Set of Interrogatories, dated April 2, 2015. As these responses reflect, Mr. Campbell has sent or received at least [REDACTED] Facebook messages containing URLs between the time he filed this action (December 30, 2013), and the date of his responses (April 2, 2015).

8. Attached as Exhibit P is a true and correct copy of Plaintiff Hurley’s Objections and Responses to Defendant Facebook, Inc.’s First Set of Interrogatories, dated April 1, 2015. As these responses reflect, Mr. Hurley has sent or received at least [REDACTED] Facebook messages containing URLs between the time he filed this action (December 30, 2013), and the date of his responses (April 1, 2015).

9. Attached as Exhibit Q is a true and correct copy of (Former) Plaintiff Shadpour’s Corrected Objections and Responses to Defendant Facebook, Inc.’s First Set of Interrogatories, dated April 2, 2015. As these responses reflect, Mr. Shadpour has sent or received at least [REDACTED] Facebook messages containing URLs between the time he filed this action (January 21, 2014), and the date of his responses (April 2, 2015).

10. On April 10, 2015, Plaintiffs supplemented their responses to Facebook’s Interrogatories through a letter from counsel (David Rudolph). In particular, Plaintiffs supplemented their responses to Facebook’s Interrogatory No. 5 to describe the manner in which they learned of the
[App. 4-5 redacted]

(Id.) At the time of the Responses, Facebook determined that [REDACTED] (Id.) Facebook produced documents related to its responses regarding the 19 messages. (Id. at 18 & Ex. A)

17. Facebook also analyzed these messages to determine which of the messages (if any) had a possibility of incrementing a social plugin count on a third-party website. Although Facebook does not possess records to determine whether a particular third-party webpage displayed a social plugin count at the time Plaintiffs’ selected messages were either sent or received, the Internet Wayback Machine (https://archive.org/web/) is a “reliable” resource that Plaintiffs’ technical expert, Dr. Jennifer Golbeck, uses “pretty frequently” to view archived webpages. (Ex. EE, Golbeck Depo. Tr., at 20:7-21:3.)

18. For each of the remaining twelve messages selected by Plaintiffs and for which a share object was created, the Internet Wayback Machine revealed that for the 10 of 12 messages that did have a share object, there was no corresponding social plugin on the websites referenced by the URLs in Plaintiffs’ messages at or near the time the messages were sent. For example, on [REDACTED] Thus, 10 of the 19 messages identified by Plaintiffs had a share object but did not have a corresponding social plugin on the third-party website.

19. For 1 of the 12 messages that did have a share object, the Internet Wayback Machine did not have the webpage archived. That message was sent by [REDACTED]

20. The remaining message was sent from Plaintiff Hurley to Plaintiffs’ counsel Melissa Gardner.
III. Other Discovery Issues

A. Facebook’s “Public-Facing Statements” and “Dedicated Team of Privacy Professionals”

21. In their Motion, Plaintiffs assert as follows:

    Discovery also demonstrates that Facebook’s public-facing statements about “procedural safeguards” for ensuring user privacy in product development are false. Facebook has represented, inter alia, in its filings with the Security and Exchange Commission that it has “a dedicated team of privacy professionals who are involved in new product and feature development from design through launch” and who conduct “ongoing review and monitoring of the way data is handled by existing features and applications.” However, when asked to produce documents sufficient to identify the individuals comprising this “dedicated team,” Facebook responded that none existed.

(Dkt. 138 at 20-21.)

22. In fact, Facebook’s counsel never told Plaintiffs’ counsel that Facebook did not have a “dedicated team of privacy professionals.” On the contrary, Facebook specifically denied Plaintiffs’ request to admit that there was no such team, and indeed there is such a team. Attached as Exhibit X is a true and correct copy of Defendant Facebook, Inc.’s Responses and Objections to Plaintiffs’ First Set of Requests for Admission dated June 29, 2015.

23. Rather, Facebook’s counsel simply confirmed that, in response to a document request, there was not a “specific list.” Plaintiffs’ request sought “documents” regarding “the ‘dedicated team of privacy professionals’ identified on page 8 of Your Form 10-K for fiscal year ending December 31, 2013.” (Dkt. 138-4, Ex. 31.) Facebook responded by explaining that it did not have a document responsive to Request No. 29, listing members of its internal privacy team. Plaintiffs even misstated the correspondence among counsel by omitting the bolded portion below in their brief:

    With respect to Request No. 29, please be advised that there is no specific list of the ‘dedicated team of privacy professionals’ referenced in the Request, but we have already agreed to conduct a reasonable search for non-privileged documents sufficient to identify Facebook’s current and former employees who may possess knowledge relevant to the practice challenged in this action, and we also have identified witnesses with relevant knowledge in Facebook’s Initial Disclosures and responses to Plaintiffs’ Interrogatories.

Plaintiffs attached Facebook’s complete response to the request as Exhibit 32 (Dkt. 138-4, Ex. 32).
B. Plaintiffs’ Expanded Proposed Class Definition Exceeds The “Relevant Time Period” For Discovery

24. Plaintiffs’ Consolidated Amended Complaint identified the following proposed class: “All natural-person Facebook users located within the United States who have sent or received private messages that included URLs in their content, from within two years before the filing of this action up through and including the date when Facebook ceased its practice,” which Plaintiffs alleged to be “at some point after it was exposed in October 2012.” (Dkt. 25 ¶ 59 & n.3.)

25. In their Motion for Class Certification, Plaintiffs now seek to certify a proposed class of all “Facebook users located within the United States who have sent, or received from a Facebook user, private messages that included URLs in their content (and from which Facebook generated a URL attachment), within two years before the filing of this action up through the date of certification of the class.” (Dkt. 138 at 10 (emphasis added).) In other words, Plaintiffs have now expanded their proposed class by over three years.

26. Plaintiffs’ new proposed class definition extends well beyond the relevant time period to which the parties expressly agreed for discovery. On April 7, 2015, Hank Bates, counsel for Plaintiffs, proposed that the “Relevant Time Period” for “producing documents” should be April 1, 2010, to the date of filing the action, December 30, 2013. Attached as Exhibit Y is a true and correct copy of Mr. Bates’ letter dated April 7, 2015.

27. After some further discussions between the parties, Facebook agreed to this time period in letters dated May 13 and June 12, 2015. Attached as Exhibits Z and AA are true and correct copies of these letters.

28. Regarding the production of source code, the parties agreed (and stipulated, see Dkt. 90) to a slightly different time period—September 1, 2009 to December 31, 2012—reflecting the fact that Plaintiffs had alleged that the challenged practice had ceased “at some point after it was exposed in October 2012.” (Dkt. 25 ¶ 59 & n.3.)

29. Additionally, during depositions of Facebook’s witnesses, counsel for Plaintiffs repeatedly limited questions to the time period of “2010 to 2012” or “2010 to 2013.” Attached as Exhibits BB and CC are true and correct copies of excerpts of the deposition transcripts of Facebook witnesses, Jiakai Liu and Ray He, dated June 30, 2015 and September 25, 2015, respectively, reflecting, inter alia, a handful of those questions.
C. Fernando Torres’ Expert Report And The Information He Claims That He Needs To Complete His Damages Analysis

30. Plaintiffs’ proposed damages expert, Mr. Fernando Torres, testified that, in order to complete his damages analysis, he needed additional information that is distinct from Plaintiffs’ previous damages discovery requests—which they represented were “critical to establishing” their damages theory. Attached as Exhibit DD is a true and correct copy of relevant excerpts of the deposition transcript of Mr. Fernando Torres on December 18, 2015.

31. In support of prior discovery motions, Plaintiffs argued that they would be “unduly prejudice[d]” without “discovery relevant to damages in this action.” (Dkt. 112 at 2; see also Dkt. 109 at 2, 4 (arguing that “[w]ithout discovery into the revenue Facebook has generated . . . Plaintiffs will be hampered in formulating a class-wide damages theory”).) Plaintiffs represented that the discovery they sought was “critical to establishing” their damages theory and that “expert analysis of the [] information sought” would allow them to “accurately model the profits attributable to the challenged conduct.” (Dkt. 112 at 2-3.) And they also argued that the damages discovery sought was “directly relevant to the issues of damages suffered by the class as well as the appropriate injunctive relief . . . and [was] . . . necessary for Plaintiffs to fashion a theory of class-wide relief for their class certification briefing.” (Dkt. 109 at 2, 4.)

32. In light of these and other arguments, Plaintiffs received a 30-day extension of the briefing schedule (Dkt. 117) and successfully compelled Facebook to produce extremely broad discovery (Dkt. 130, 136).

33. In his expert report, however, Mr. Torres cited only 7 of the thousands of documents produced by Facebook during the course of this litigation. (Dkt. 138-4, Ex. 33.) He also asserted in his report that he needed other information from Facebook: “with additional information, including production from Facebook, and inputs, these conclusions [in the Report] could be refined.” (Dkt. 138-4, Ex. 33, ¶ 11 n.12.) In the final paragraph of his report, Mr. Torres explained, “With quantitative data on the number of affected ‘Like’ counts, and identification of the affected URLs, it

[App. 10-11 redacted]
prepare their motion for class certification” was “prejudice[ed]” by Facebook’s alleged “delay[s] providing relevant discovery in this matter.” (Dkt. 138-3, ¶ 2.) More specifically, he claims that Facebook “delayed production of its source code by over five months . . . and [] failed to produce a significant number of documents responsive to Plaintiffs’ document requests” in a timely manner. (Id.)

38. Mr. Rudolph does not explain that this Court already was presented with these arguments on two separate occasions. After considering Facebook’s Opposition to Plaintiffs’ Motion to Enlarge Time and Extend Deadlines (Dkt. 114) and the supporting Declaration of Joshua Jessen (Dkt. 114-1), which rebutted similar assertions from Plaintiffs’ counsel, this Court ruled that the “90-day extension sought by plaintiffs would unnecessarily delay the case,” and instead ordered a 30-day extension. (Dkt. 117; see also Dkt. 113-1 at 13.)

39. Several weeks later, Plaintiffs filed a Renewed Motion to Continue, attempting to revisit the issue and arguing that Facebook “delayed [] providing relevant discovery, including by failing to produce a significant proportion of relevant and responsive documents until October 13, and October 28.” (Dkt. 134-1.) Once again, Facebook responded to Plaintiffs’ false assertions and corrected the record. (Dkt. 135, 135-1.) This Court denied Plaintiffs’ motion. (Dkt. 136.)

40. Mr. Rudolph’s most recent declaration (Dkt. 138-3) again argues that Facebook “delayed” production of its source code, “delayed” producing a significant portion of documents until October 13-28, 2015, and “delayed” producing additional documents until November 3-7, 2015. (Dkt. 138-3, ¶¶ 2–5.) Facebook already refuted the first two assertions before the Court. (See Dkt. 114-1 ¶¶ 8–36; 135-1 ¶¶ 2–10.) On Mr. Rudolph’s last point, he fails to mention that Facebook’s November productions were in response to Plaintiffs’ Motion to Compel (Dkt. 112), Magistrate Judge James’ Order on October 14, 2015 (Dkt. 130), and this Court’s Order on November 3, 2015. (Dkt. 136.) In other words, the productions were the result of Plaintiffs’ motions to compel. Facebook produced all responsive documents it could locate after a reasonable search in a timely manner. Although Mr. Rudolph is correct to point out that the November 7 productions were significant in volume, this was through no fault of Facebook—it had repeatedly warned Plaintiffs that their requests were extremely overbroad and would yield many irrelevant documents, and Facebook undertook extensive efforts to try to reach a reasonable compromise. (Dkt. 131-1.) For example, Facebook offered to provide Plaintiffs with representative documents for certain of Plaintiffs’ requests, but Plaintiffs rejected all offers for compromise and continued to litigate these issues. (Dkt. 131-1, Ex. 1.)

41. Contrary to Mr. Rudolph’s declaration, Facebook’s production was substantially complete as of September 30, 2015, with respect to the documents Facebook had agreed to produce at that point. Productions after this date were primarily in response to Plaintiffs’ Motion to Compel (Dkt. 112, 113, 122), which were not even decided until after September 30. (See Dkt. 130, 136.)
IV. Authentication Of Remaining Exhibits

42. Attached as Exhibit EE is a true and correct copy of excerpts of the deposition transcript of Dr. Jennifer Golbeck (dated December 16, 2015).

43. Attached as Exhibit FF is a true and correct copy of excerpts of the deposition transcript of [REDACTED] (dated August 7, 2015).

44. Attached as Exhibit GG is a true and correct copy of excerpts of the deposition transcript of [REDACTED] (dated August 10, 2015).

45. Attached as Exhibit HH is a true and correct copy of excerpts of the deposition transcript of [REDACTED] (dated August 11, 2015).

46. Attached as Exhibit II is a true and correct copy of excerpts of the deposition transcript of Ray He (dated October 28, 2015).

47. Attached as Exhibit JJ is a true and correct copy of excerpts of the deposition transcript of Michael Adkins (dated October 28, 2015).

48. Attached as Exhibit KK is a true and correct copy of a document that begins with Bates number FB000006429, which Facebook produced during this litigation.
49. Attached as Exhibits LL are true and correct copies of certain Google Analytics data that begins with Bates numbers FB000009906 and FB000009914, respectively, and which Facebook produced to Plaintiffs during discovery.

I declare under penalty of perjury under the laws of the United States of America and the State of California that the foregoing is true and correct, and that I executed this Declaration in Los Angeles, California, on January 15, 2016.

/s/ Christopher Chorba
Christopher Chorba
EXHIBIT A
APP. 15
App. 16-17
Filed Under Seal
EXHIBIT B
APP. 18
App. 19
Filed Under Seal
EXHIBIT C
APP. 20
App. 21-29
Filed Under Seal
EXHIBIT D
APP. 30
App. 31-37
Filed Under Seal
EXHIBIT E
APP. 38
Exhibit E: Evidence of Implied Consent

Source: 1
Bates Number: FB000000079
Publication: Josh Constine, Facebook Allows Users to Upgrade to the New Messages Product, Why You Should, Inside Facebook (January 10, 2011), available at http://www.insidefacebook.com/2011/01/10/allows-upgrade-new-messages/.
Summary: This article describes how a Facebook Messages product update “automatically filters non-essential communications into an Other folder, allowing the main inbox to show only important messages [and] also routes sent messages to whichever device or interface Facebook deems is the most convenient for the recipient, whether that’s Chat, Messages, SMS, or email.” The author praises the storage capacity and searchability of Facebook’s Messages product, but he writes that some “important Messages may be being filtered out.” The author concludes that “[o]verall, Messages will help most Facebook users [because] [i]t anticipates the shift to using multiple devices and interfaces to conduct a single conversation . . . declutters the inbox by removing spammy and low-value Page and Event updates . . . [and] improves one of the core uses of Facebook—instant communication with friends.”

Source: 2
Bates Number: FB000000156
Publication: Jiakai Liu, Inside Facebook Messages’ Application Server, Facebook (April 28, 2011), available at https://www.facebook.com/notes/facebook-engineering/inside-facebook-messages-application-server/10150162742108920/.
Summary: In this Facebook Note, Facebook software engineer Jiakai Liu describes the technical details of the infrastructure behind Facebook’s Messages product, including the internals of the application server used to manage this infrastructure. He writes that the server will run “pre- and post-processing as needed, and determin[e] the folder and thread where the message should be routed based on a number of signals.” He also indicates that “[w]hen reading messages, the server gets various statistics about the user’s mailbox, like its capacity; number of messages, threads and replies; and the number of friends with whom the user has interacted. It also gets folder statistics and attributes, the list of threads by various search criteria (folder, attributes, authors, keywords, and so forth), and thread attributes and the individual messages in the thread).” He also discloses that Facebook’s Messages product supports “full text search” by maintaining a “reverse index from keywords to matched messages.” He describes how Facebook “parse[s]” and “convert[s]” a new message when it arrives, and writes that “[a]ll messages, including chat history, email, and SMS, are indexed in real time.” He concludes by noting that Facebook did testing of the new application server software via a “dark launch,” but that Facebook would “continue to roll out the new Messages system to all our users.”

Source: 3
Bates Number: FB000000163
Publication: Facebook, Facebook Developer Documentation: “Like” Button, Facebook - Internet Archive Wayback Machine (March 7, 2011), available at http://web.archive.org/web/20110307213924/http://developers.facebook.com/docs/reference/plugins/like/.
Summary: This Internet Archive capture is Facebook’s developer documentation describing how to implement the “Like” button and “FAQs” about that implementation. It discloses that the number shown on a “Like” button “is the sum of [items including] [t]he number of inbox messages containing this URL as an attachment.”

Source: 4
Bates Number: FB000000166
Publication: Facebook, Facebook Developer Documentation: “Like” Button, Facebook - Internet Archive Wayback Machine (October 19, 2012), available at http://web.archive.org/web/20121019095652/http://developers.facebook.com/docs/reference/plugins/like/.
Summary: This Internet Archive capture is Facebook’s developer documentation describing how to implement the “Like” button and “FAQs” about that implementation. It discloses that the number shown on a “Like” button “is the sum of [items including] [t]he number of inbox messages containing this URL as an attachment.”
Source: 5
Bates Number: FB000000173
Publication: Elliot Schrage, Proposed Updates to our Governing Documents, Facebook (November 21, 2012), available at http://newsroom.fb.com/news/2012/11/proposed-updates-to-our-governing-documents/.
Summary: In this News Post, a Facebook employee writes that Facebook is “proposing some updates to two documents which govern our site: our Data Use Policy, which explains how we collect and use data when people use Facebook, and our Statement of Rights and Responsibilities (SRR), which explains the terms governing the use of our services.” He notes that Facebook would continue to post “significant” changes to the documents for review and comment and provide “additional notification mechanisms” informing users of any changes to them. He also summarizes certain updates to the Data Use Policy, including 1) “[n]ew tools for managing your Facebook Messages - replacing the ‘Who can send you Facebook messages’ setting with new filters for managing incoming messages”; 2) changes to how Facebook refers to certain products; 3) reminders about what’s visible to others on Facebook; and 4) tips for managing timelines. He concludes by encouraging readers to read the proposed changes to the documents and leave comments.

Source: 6
Bates Number: FB000000204
Publication: Nicholas Carlson, The Truth About The Latest Facebook Privacy Scare Everyone Is Talking About, Business Insider (October 4, 2012), available at http://www.businessinsider.com/facebook-private-messages-likes-2012-10/.
Summary: After reporting that “[w]hen a Facebook user sends a link to a Web page via a private Facebook message, that Web page will get an extra ‘Like,’ if it is a Facebook-‘Like’able Web page,” the author argues that the various publications (including The Wall Street Journal, Forbes, and Gizmodo) characterizing Facebook’s conduct as “a privacy invasion” are incorrect. He writes that there is “a simple reason why”: “That ‘Like’ is only added to the page’s counter. There is no way to tell who added the like . . . [and] [i]f you do not reveal something said or shared in private to others, you are not invading their privacy.” The author goes on to demonstrate the incrementation of the social plugin count, and he explains that “[t]here is no reason for anyone to be upset about Facebook doing this.” He notes that “email providers like Gmail scan user emails all the time. Gmail does it to show relevant ads, fight spam, and slow down viruses.” He also writes that “services across the Internet use whatever method they can to keep track of the popularity of Webpages Google has a list of trends. The New York Times keeps track of most emailed stories.” He concludes that “there is nothing to see here,” telling people to “[m]ove along.” In an update to the article, the author quotes a Facebook spokesperson’s statement on the issue, which noted that “‘[w]hat makes up the number shown on [a] Like button [includes] [t]he number of inbox messages containing this URL as an attachment.’”

Source: 7
Bates Number: FB000000213
Publication: Facebook, Update to Messaging and a Test, Facebook (December 20, 2012), available at http://newsroom.fb.com/news/2012/12/update-to-messaging-and-a-test/.
Summary: In this News Post, Facebook announces updates to Facebook Messenger on mobile and Facebook Messages on the web. The article notes that “[i]n 2010 we introduced the Other folder, where less relevant messages go,” and it continues, “We’ve heard that messages people care about may not always be delivered or may go unseen in the Other folder. As we announced last month, we’re replacing the ‘Who can send me Facebook Messages’ setting with up-front filters that help to address this feedback.” The article then describes the two types of filtering that will be made available (Basic and Strict), noting that “Facebook Messages is designed to get the most relevant messages into your Inbox and put less relevant messages into your Other folder. [Facebook] [relies] on signals about the message to achieve this goal. Some of these signals are social—we use social signals such as friend connections to determine whether a message is likely to be one you want to see in your inbox. Some of these signals are algorithmic—we use algorithms to identify spam and use broader signals from the social graph, such as friend of friend connections or people you may know, to help determine relevance.” The article concludes that Facebook will “continue to iterate and evolve Facebook Messages over the coming months.”

Source: 8
Bates Number: FB000000221
Publication: Samantha Murphy Kelly, Facebook: We’re Not Liking Brand Pages For You, Mashable (October 4, 2012), available at http://mashable.com/2012/10/04/facebook-brand-like/.
Summary: In this article, the author notes that “[i]t’s been widely reported on Thursday that Facebook is scanning messages sent to others with attached links to better gauge their interests and add to a brand’s Link count,” writing the information is “used only on the back-end for publishers to see the analytics of articles and shared URLs.” She quotes a Facebook spokesperson’s statement on the issue, and she writes that “[e]mail services such as Gmail have long taken this approach to target its users with ads or fight against viruses based on content written.” She concludes that “Facebook’s developer page related to the Like button states that the number of Likes is derived by the number of likes in the URL and the number of shares. This includes copying and pasting a link back to Facebook. It also includes the number of inbox messages containing the URL as an attachment.”

Source: 9
Bates Number: FB000000234
Publication: Ryan Singel, Facebook’s E-mail Censorship is Legally Dubious, Experts Say, Wired (May 6, 2009), available at http://www.wired.com/2009/05/facebooks-e-mail-censorship-is-legally-dubious-experts-say/.
Summary: Here, the author reports that “legal experts say Facebook may have gone too far [in] blocking not only links to torrents published publicly on member profile pages, but also examining private messages that might contain them, and blocking those as well.” The author writes that Facebook messages are “governed by the Electronic Communications Privacy Act, which forbids communications providers from intercepting user messages, barring limited exceptions for security and valid legal orders,” opining that “[w]hile the sniffing of e-mails is not unknown—it’s how Google serves up targeted ads in Gmail and how Yahoo filters out viruses, for example—the notion that a legitimate e-mail would be not be delivered based on its content is extraordinary.” The author notes that then-Facebook Chief Privacy Officer Chris Kelly “acknowledged that the site censors user messages based on links [b]ut [ ] insisted that Facebook has the legal right to do so, because it tells users they cannot ‘disseminate spammy, illegal, threatening or harassing content.’” The author quotes Mr. Kelly: “‘Just as many e-mail services do scanning to divert or block spam, prevent fraudulent, unlawful or abusive use of the service—or in the case of some services, to deliver targeted advertising—Facebook has automated systems that have the capability to block links. ECPA expressly allows Facebook to operate these systems. The same automated system that blocks these links may also be deployed where there is a demonstrated disregard for intellectual property rights.’”

Source: 10
Bates Number: FB000000240
Publication: Thomas McMahon, Facebook’s Like Number is More than Just People Clicking Like, Twister MC (September 8, 2011), available at http://www.twistermc.com/36579/facebook-like/.
Summary: Here, the author asks, “[D]id you know that the number of Likes is made up of more than just people clicking Like?” Quoting Facebook’s developer site, he writes that “[t]he number shown on a Like button is the sum of: ‘[t]he number of likes for the URL; [t]he number of shares for the URL – This includes copy/pasting a link back to Facebook; [t]he number of likes and comments on stories on Facebook about the URL; [and] [t]he number of inbox messages containing the URL as an attachment.’” Providing an example, the author notes that if you have a blog post with a “Like” button social plug in, the “Like number showing would include: . . . The number of times someone has sent the blog post URL to a friend via Facebook’s messaging system.” He concludes that “[b]asically, anytime that blog post URL is active on Facebook, a Like is added to the count.”
Source: 11
Bates Number: FB000000248
Publication: Julian Evans, Facebook Launches Anti-Malware URL Scanning Service, Julian Evans Blog.com (October 3, 2011), available at https://www.julianevansblog.com/2011/10/facebook-launches-anti-malware-url-scanning-service.html/.
Summary: Here, the author reports that “Facebook is introducing URL (link scanning) protection for its users as from today (Oct 3rd, 2011).” The author notes that Facebook will “analyze each URL in real-time for potential malicious content.” He acknowledges that “Facebook already scans URLs for malicious links,” noting that “by adding Websenses cloud-based malware technology . . ., they further enhance the security offering to Facebook users.” He concludes that “[o]ne can only applaud Facebook for continuing to build user privacy and protection, even if it is becoming rather more complex for end users to understand.”

Source: 12
Bates Number: FB000000146
Publication: Jennifer Valentino-DeVries and Ashkan Soltani, How Private Are Your Private Facebook Messages?, Wall Street Journal (October 3, 2012), available at http://blogs.wsj.com/digits/2012/10/03/how-private-are-your-private-messages/.
Summary: In this article, the authors report that an online video shows that Facebook “scans the links you’re sending—registering them as though you ‘Like’ the page you sent,” which the authors characterize as “one example of how online messages that seem private are often actually examined by computers for data.” They note that “[e]mail providers such as Gmail have long reviewed messages in order to spot spam and place ads,” and that Facebook has previously disclosed that the company “analyze[s] messages to filter spam and to detect conversations that could be related to criminal behavior.” The authors also indicate that Facebook’s Developer Guidance discloses that “‘the number of inbox messages containing’ a link to a page will count as ‘Likes’—indicating that the recording of these links isn’t some sort of new bug.” They opine that it is “not clear from Facebook’s data use policy that regular users would expect links in their messages to be scanned this way,” because “the policy simply says generally that Facebook gets ‘data about you whenever you interact with Facebook,’ including when you ‘send or receive a message.’” The authors update their article with a Facebook spokesperson’s statement about the social plugin bug. Additionally, reader comments on the article include a statement by someone who writes that he or she is “[n]ot surprised at all”; another who suggests that “Facebook should update their data use policy to reflect what they already disclose in their guidelines to developers”; and another who states that he is “fine with Google/FB scanning messages as long as private information isn’t released” but does not like “the ability for scammers to inflate the like score with this method.”

Source: 13
Bates Number: FB000000281
Publication: Joey Tyson, Relevant Ads That Protect Your Privacy, Facebook (September 30, 2012), available at https://www.facebook.com/notes/facebook-and-privacy/relevant-ads-that-protect-your-privacy/457827624267125/.
Summary: In this Facebook Note, the author describes several features that “give advertisers new ways of reaching people who use Facebook.” The author writes that “[m]any sites across the web provide free services by including advertisements. Facebook is no exception, and as we pursue our goal of making the world more open and connected, we have designed our service to show ads that help people discover products that are interesting to them. We also recognize that our users trust us to protect the information they share on Facebook.” The author also writes, “Advertising helps keep Facebook free. We believe we can create value for the people who use our services every day by offering relevant ads that also incorporate industry-leading privacy protections. In our view, this is a win-win situation for marketers and for you.”
Source: 14
Bates Number: FB000000331
Publication: KS Sandhya Iyer, Likes Shared Privately on Facebook Increase Page’s Like Count, NDTV Gadgets (October 5, 2012), available at http://gadgets.ndtv.com/social-networking/news/links-shared-privately-on-facebook-increase-pages-like-count-275993/.
Summary: Here, the author writes that “Facebook is keeping an eye on your private messages for URLs that have Like buttons and should be increased,” noting that “The Next Web further pointed out that the Like button entry on the Facebook Developers page states that the number shown on Like buttons on other websites is a total of likes of that URL, shares of that URL, likes and comments on Facebook stories about that URL and inbox messages containing that URL as an attachment.” The author asks, “[H]ow big a deal is it and does it invade Facebook user privacy?” She answers, “Probably not,” because the “‘Like’ is only added to the page’s counter [and] It does not reveal who added the Like.” The article concludes that “[i]f you do not reveal something said or shared in private to others, you are not invading their privacy.” She quotes a Facebook spokesperson’s statement on the issue, and she writes that “Facebook isn’t going to be axed for this move for the simple reason that email providers like Gmail scan user emails all the time” in order to “show relevant ads, fight spam, and slow down viruses.” The author further writes that “[w]hat Facebook is doing is just adopting one of the many services of tracking the popularity of Webpages.” She quotes Facebook’s statement that its “systems parse the URL being shared in order to render the appropriate preview, and to also ensure that the message is not spam.”

Source: 15
Bates Number: FB000000372
Publication: Doug Mataconis, Your Facebook Chats are Being Monitored, By Facebook, Outside the Beltway (July 13, 2012), available at http://www.outsidethebeltway.com/your-facebook-chats-are-being-monitored-by-facebook/.
Summary: The author of this article writes that “Mashable is out with a report that Facebook routinely monitors user chats for suspicious or criminal activity,” quoting several paragraphs of the Mashable article. He notes that the “news was first broken” in a Reuters article from July 2012, which “describes one incident in which [Facebook’s] software did in fact catch a child predator.” The author continues, “As it turns out, this is all covered in the company’s Privacy Policies,” quoting two paragraphs of Facebook’s Privacy Policy. He concludes that “[i]t’s hard to argue with what Facebook is doing here,” stating that “there are some privacy concerns here, but Facebook is a private company and free to set its own policies on these issues” and that “it has a corporate brand to protect, not to mention the potential of liability, from being known as a place where parents can’t be sure that their teenager children can be safe.”

Source: 16
Bates Number: FB000000170
Publication: Britney Fitzgerald, New Facebook Bug Scans Messages, Increases ‘Likes’: What You Need To Know, Huffington Post Tech (October 4, 2012), available at http://www.huffingtonpost.com/2012/10/04/new-facebook-bug_n_1940339.html/.
Summary: In this article, the author opens by noting that “Facebook is experiencing a bug—but it’s not quite the privacy breech [sic] that’s been previously reported by multiple news sources.” She reports that Facebook “has been scanning private messages for links to third-party websites that use Facebook’s ‘Like’ button, a social plug-in that lets users interact with a brand’s products, news articles and other types of content on webpages (without directly visiting the Facebook Page for that brand).” She writes that “Likes also increase when a Facebook user sends another user a message containing a URL to a page featuring the ‘Like’ button; this should only up the ‘Like’ count by one, but it’s actually inflating the count by two.” The author writes that Facebook “insists that this is nothing new,” quoting a Facebook representative, who noted that the issue where “‘counts are jumping by two . . . is a bug [but . . . the actual shares going up when things are sent in messages—that is standard behavior and you can find that in our documentation.’” The author further indicates, “All information posted on the social networking site is accessible for company use. Thus, if you were to share the URL for this article through a Facebook message, Facebook can check out what you’re sending and adjust the ‘Likes’ at the top of this page—whether you clicked ‘Like’ on it or not.”
Source: 17
Bates Number: FB000000177
Publication: Michelle Fitzsimmons, Report: Facebook scanning private messages, Liking on users’ behalf, Tech Radar (October 4, 2012), available at http://www.techradar.com/us/news/internet/report-facebook-scanning-private-messages-liking-on-users-behalf-1101999/.
Summary: In this article, the author writes that various news sources have indicated that “Facebook is scanning users’ private messages and automatically issuing Likes on their behalf.” The author notes that a Facebook spokesperson directed her to the “Like Button” section of Facebook’s developer site, and “specifically to information titled ‘What makes up the number shown on my Like button?’.” The author quotes the site, which she writes “explained that ‘the number shown is the sum of’ [items including] [t]he number of inbox messages containing this URL as an attachment.” She concludes that “[m]essage scanning isn’t anything new” and notes that “Gmail scans user emails to create targeted ads and Facebook also reportedly scans user messages to look for sexual predators and child pornography.”

Source: 18
Bates Number: FB000000183
Publication: Matt Hicks, See the Messages that Matter, Facebook (February 11, 2011), available at https://www.facebook.com/notes/facebook/see-the-messages-that-matter/452288242130/.
Summary: In this Facebook Note, the author “announce[s] the next evolution of Messages,” providing detail about the new functionality of the Messages product. He indicates that users may now use SMS, chat, email or Messages; Facebook is providing an “@facebook.com” email address to every person on Facebook who wants one; Facebook messages will take the form of a “single conversation”; and that “your Inbox will only contain messages from your friends and their friends,” and “[a]ll other messages will go into an Other folder where you can look at them separately.” He concludes by inviting readers to take a tour of Messages and asking for thoughts and feedback.

Source: 19
Bates Number: FB000000187
Publication: Jessica Lee, The Facebook Like Button, Dissected, Bruce Clay (May 25, 2011), available at http://www.bruceclay.com/blog/facebook-like-button/.
Summary: Here, the author examines the “Like button phenomenon, its various uses and why you need to be liked to survive in the age of online marketing.” She provides an overview of the “two variations” of the “Like” button, writing with respect to the “Like” button social plugin that “Facebook notes that the number of likes shown on any given Web page or object is the sum of [items including] [t]he number of inbox messages containing this URL as an attachment.”

Source: 20
Bates Number: FB000000251
Publication: Tekla S. Perry, The Reengineering of Facebook Messages, IEEE Spectrum (November 2, 2011), available at http://spectrum.ieee.org/computing/software/the-reengineering-of-facebook-messages/.
Summary: In this article, the author discusses in detail how and why Facebook “completedly redesign[ed] its messaging system” in 2009. The author discusses the rationale behind the “informal formatting” of the Messages product and the decisions to remove subject lines and store live chats in the same thread as messages. After explaining how Facebook engineers determined how to store messages, the author writes that “the engineers turned to the problem of spam, a bane of e-mail services.” She writes, “While traditional spam filters look mostly at message content, the spam filters built into Facebook messages also pay a lot of attention to who the message senders are. Messages from your friends and friends of friends bypass the spam filters and go directly into your in-box, unless you’ve changed the default or previously moved messages from that person out of your in-box; messages from people you aren’t connected to through a friend, along with announcements from organizations or businesses, go into a folder called ‘other.’ Messages with spamlike content and no friend-of-friend connection go into a separate spam file, the link to which is tucked away at the bottom of the ‘other’ mailbox and requires scrolling past every message in that mailbox to be seen.” The author concludes by opining that “the future of everyday communications will look a lot more like Facebook Messages.”

Source: 21
Bates Number: FB000000278
Publication: Donna Tam, Facebook Processes More Than 500 TB of Data Daily, Cnet (August 22, 2012), available at http://www.cnet.com/news/facebook-processes-more-than-500-tb-of-data-daily/.
Summary: Here, the author writes that “[s]ince Facebook uses [its] data to build its user experience, it wants teams from across the company—whether they sell ads or build functions—to be able to access any of the data as needed.” The author notes that the Facebook employee who “runs Facebook infrastructure” indicated that “this keeps the creation and improvement of Facebook features as fast as possible . . . . [and] [t]hese nearly real-time efforts apply to most functions throughout the site because people won’t use the site if the personalized experience is poor, or slow.” The author also writes that Facebook keeps all of its data in one place for “easy access,” but that Facebook “has a zero-tolerance policy when it comes to any abuse from this broad access.”
Source: 22
Bates Number: FB000000285
Publication: Sam Biddle, Facebook is Reading Your Messages and Liking Things For You (Updated: Not as Bad as We Thought), Gizmodo (October 4, 2012), available at http://gizmodo.com/5948948/facebook-is-reading-your-messages-and-liking-things-for-you/.
Summary: Here, the author writes that Facebook’s “scanning increases the Like count for a given page Like-able link just by you talking about it” in a Facebook message. He indicates that “[a]uto-scanning is nothing new: Gmail has done it since day one to serve us ads,” and he opines that “there are serious potential personal consequences here.” The author continues by noting that “[i]t turns out this was just a very unlikely coincidence that played out in more than one place—the auto-liking only applies to external links with embedded Facebook liking.” He illustrates how a message increments the social plugin count, writing that “your name isn’t being associated publicly with something you’re talking about privately—but if even a mention is enough to kick up a Like, it seems like that’s pretty heavily diluting (even further) what ‘like’ even means—from preference to mere reference.” The author updates the article by quoting a Facebook spokesperson’s statement on the issues raised in the article.

Source: 23
Bates Number: FB000000301
Publication: Paul Shea, Facebook Confirms Peeking at Private Messages, Value Walk (October 4, 2012), available at http://www.valuewalk.com/2012/10/facebook-inc-fb-peeking-at-private-messages/.
Summary: Here, the author reports that Facebook “confirmed that it scans private messages between users,” and that “these scans [ ] caused the likes of linked content to increase.” He writes that “a series of bots [ ] scan private messages for links to content that contains ‘Like’ buttons [and] [i]f a Like button is detected, however, a bug is activated, whereby the linked content has its likes increased by two.” The author further writes that it “may be news to some long time Facebook users” that the “‘Like’ counter measures not just clicks on a Like button, but takes into account sharing of the content, as well as comments on the content, and now private messages.” He continues, “Emil Protalinski, the writer at thenextweb.com who originally picked up on this story, rightly points out that the scanning of private messages for data on content is not the same as scanning for the same data on comments, or public declarations of ‘Like,’” and he notes that “[t]his is not the first time a company has been indicted for scanning the content of users’ private messages,” writing that “Google Inc and other web mail providers have been scanning users’ emails for years, in order to pick the advertisements best suited to them.” He quotes a Facebook spokesperson’s statement on the issue, and he concludes that “Facebook Inc [sic] has certainly not crossed a line with this latest news, any more than they have on hundreds of other occasions,” and that “[b]ecause of the nature of the business, the company will be dodging privacy issues for as long as it operates.”

Source: 24
Bates Number: FB000000305
Publication: Brittany Darwell, Facebook Clarifies How Like Plugin Works, Addresses Privacy Concerns, Social Times (October 4, 2012), available at http://www.adweek.com/socialtimes/facebook-clarifies-how-like-plugin-works-addresses-privacy-concerns/285167/.
Summary: In this article, the author reports that Facebook “responded to reports today that alleged the social network was scanning private messages and Liking pages on users’ behalf” by stating that “the Like count of an article or webpage will increase when users share the link via direct messages [but] that no private information is shared . . . URLs sent through private messages are not shown publicly on user profiles and users will not see a friend’s name or photo next to a Like button if the person shared the article privately.” The author opines that it is not “completely clear to outsiders that the total includes actions that were made by clicking the button directly, as well as the number of times the link was copy-pasted into a Facebook post or message, which is why some users thought the social network had a security flaw.” She acknowledges that “Facebook explains this in the FAQ about the Like button plugin,” quoting the relevant FAQ. The author writes that “we see this as similar to site visitor widgets, which increase whenever a user visits a webpage but do not reveal who visited” and that accordingly, “privacy implications are minimal.”
Exhibit E: Evidence of Implied Consent
Source
Bates Number
Publication
Summary
25
FB000000328
Rick Burgess, Facebook Bug Silently
Tallies Up Extra ‘Likes’, TechSpot
(October 5, 2012), available at
http://www.techspot.com/news/50416facebook-bug-silently-tallies-up-extralikes.html/.
This article reports that Facebook is facing “scrutiny” after a “security researcher uncovered a flaw in [Facebook’s]
‘like’ system which appears to be responsible for liking sites an unintended number of times.” The author writes that
“Facebook reiterated though that some behaviors will generate likes without explicitly liking something, such as
messaging a URL to a friend,” though “some find the sincerity of such a practice nebulous.” He writes that “[]on
Facebook for Developers . . . there are actually four ways to generate likes [and] . . . only one of those methods actually
requires users to click a like button.” He concludes by citing Facebook’s recent “initiative which intends to fortify the
integrity of Facebook’s likes and shares,” including deploying “automated tools . . . with the intent of deleting
disingenuous likes that were determined to be purchased or originate from malware or compromised accounts.”
26
FB000000376
Lisa Vaas, How Facebook Catches WouldBe Child Molesters by Analyzing
Relationships and Chat Content, Naked
Security (July 16, 2012), available at
https://nakedsecurity.sophos.com/2012/07/
16/facebook-child-molester/.
In this article, the author reports that “[]law enforcement is hailing Facebook for using its little-known data monitoring
technology to spot a suspicious conversation about sex between a man in his early thirties and a 13-year-old girl from
Florida,” and that “Facebook doesn’t talk much about this technology, which scans postings and chats for criminal
activity.” She notes that Facebook’s then-Chief Security Officer Joe Sullivan said that Facebook’s “monitoring
software analyzes relationships to find suspicious conversations between unlikely pairings” and “relies on archives of
real-life chats that preceded sexual assaults.” She notes that Mr. Sullivan said that “‘the last thing the company wants is
for its users to feel like they’re being eavesdropped on.’” The author ends by quoting Mr. Sullivan, who says, “‘We’ve
never wanted to set up an environment where we have employees looking at private communications, so it’s really
important that we use technology that has a very low false-positive rate.’”
27
FB000000391
Here, the author writes that “[w]hether you realize it or not, a bundle of sophisticated technology is constantly scanning
Adam Estes, Facebook’s Spying on You
For a Good Cause, Motherboard (July 13, through Facebook interactions—wall posts, messages, chats—looking for sexual predators.” The author reports that
2012), available at
Facboook’s program is “part of an aggressive effort the social network has made over the past few years to protect the
http://motherboard.vice.com/blog/facebook- safety of its 13- to 18-year-old users, and few would argue that the stated goals of the program aren’t sound.” He
s-reading-your-messages-but-it-s-for-awrites that “there’s something unnerving about Facebook reading your messages,” because “[p]reventing crime is one
good-cause/.
thing, but surveilling the most intimate user behavior is something completely different.” The author also notes that
“Facebook is obviously aware of the privacy concerns and insist that their technology only spots the bad guys,” quoting
Facebook’s then-Chief Security Officer Joe Sullivan, who says, “‘We’ve never wanted to set up an environment where
we have employees looking at private communications, so it’s really important that we use technology that has a very
low false-positive rate.’” He concludes that “it’s easy to think of Facebook’s anti-pedophile software as just another
form of moderation” because “[t]he vast majority of the scanning is also algorithmic, so it’s not like you have a bunch
of Facebook employees poring over your every word,” writing that “[i]n truth, it’s a machine that’s trying to spot
patterns and red flags” for “a good cause.”
28
FB000000393
Cale Guthrie Weissman, Facebook Uses Scanning Technologies, Alerts Authorities About Content, OpenNet Initiative (July 16, 2012), available at https://opennet.net/blog/2012/07/facebook-uses-scanning-technologies-alerts-authorites-about-content/.
Here, the author writes that “Facebook regularly scans user content for criminal activity, but the monitoring program is
something the social media giant has generally kept quiet about.” Quoting a report from Reuters, the author
acknowledges that “this sort of scanning is commonplace for platforms like Facebook—most large social media
companies scan chats for inappropriate language and exchange of personal information,” but he opines that many
social media platforms “walk a tightrope between utilizing these tactics to safeguard against illegal activity and
providing a less restrictive social media platform that will engage users.” The author writes that “[s]canning users’
content is not new terrain for Facebook,” citing information leaked that April [2012?] “showing the kinds of user
information Facebook releases to authorities when subpoenaed” and the “Information for Law Enforcement
Authorities” section of Facebook’s website. The author quotes Facebook’s then-Chief Security Officer Joe Sullivan,
who says, “‘We’ve never wanted to set up an environment where we have employees looking at private
communications, so it’s really important that we use technology that has a very low false-positive rate.’” He concludes
that “[s]canning technologies of this design are just beginning to come to the forefront for various websites,” and that
Facebook’s “example highlights the tactics used and suggests a possible upward trend in surveillance by social media
platforms.”
29
FB000000395
Michael Walsh, Did You Know that Facebook Monitors Postings and Chats for Sexual Predators?, NY Daily News (July 16, 2012), available at http://www.nydailynews.com/news/crime/facebook-monitors-postings-chats-sexual-predators-article-1.1115392/.
The author of this article writes that Facebook “uses monitoring software that can scan and flag suspicious messages to minors from potential predators.” Citing an instance in which Facebook’s software effectively flagged “suspicious conversations between a man in his early thirties and a 13-year-old girl,” the author opines that Facebook’s “surveillance practice is fraught with legal complexity” and “Facebook tends to avoid comment on this practice, because the organization doesn’t want to create scare stories or stir surveillance paranoia.” The author quotes Facebook’s then-Chief Security Officer Joe Sullivan, who says, “‘We’ve never wanted to set up an environment where we have employees looking at private communications, so it’s really important that we use technology that has a very low false-positive rate.’” The author concludes that “[t]o minimize the risk of inappropriate surveillance the software and procedures are designed to err on the side of monitoring caution.”
30
FB000000399
Jemima Kiss, Facebook Puts Faith in its Software Smarts to See Off Sexual Predators, The Guardian (April 15, 2010), available at http://www.theguardian.com/technology/2010/apr/16/facebook-software-sexual-predators/.
This article reports that “Facebook has developed sophisticated algorithms to monitor its users and detect inappropriate
and predatory behaviour, bolstering its latest raft of initiatives to improve the safety of its users.” The author writes
that “[h]aving launched an education campaign, an improved reporting procedure and a 24/7 police hotline,” Facebook
indicated that it “has introduced a number of algorithms that track the behaviour of its users and flag up suspicious
activity.” The author also notes that Facebook has another filter, “common on web publishing sites [ ] [that] scans
photo uploads for skin tones and blocks problem images.” She quotes Facebook employee Matt Kelly, who says that
Facebook “balance[s] its duty to respect its users while meeting its legal obligations,” and that Facebook’s “corporate
philosophy about data is that the user is in control, and they choose how to share and distribute it.” She ends by noting
that one privacy campaigner believes Facebook “needed to do more to stop persistent stalkers and bullies who could
use multiple identities, and cautioned against the automated profiling of users.”
31
FB000000420
Josh Wolford, Should Facebook Monitor
Chats to Help Snag Child Predators?,
WebProNews (August 23, 2012),
available at
http://www.webpronews.com/remember-how-facebook-is-monitoring-chats-for-criminal-activity-well-it-worked-kind-of-2012-08/.
Here, the author begins by stating that “social networks are social – you’re actively sharing content with the world,”
and “anybody who thinks they can maintain a pristine level of privacy and security while still enjoying the benefits of a
social community is probably deluding themselves.” He writes that “recently, it was revealed that Facebook actively
patrols user communications for unlawful activities,” and that “Facebook is actively monitoring our chats and
messages.” The author notes that Facebook “revealed that it’s common practice for their teams to scan chats, searching
for criminal activity” and that “[i]t’s mostly algorithms that handle this part, but once something is flagged Facebook
employees make the final decision on whether or not it merits calling the authorities.” He posits that “[t]here’s really
no denying that it can work,” as “[s]canning chats for suspicious activity can help to thwart child predation,” noting
that “there are still privacy concerns to consider.” He writes that “[n]ot everyone is convinced that Facebook has the
right to monitor ‘private’ communications,” but states, “Then again, you are using their (free) service to send and
receive communications, and at least now it’s with the public knowledge that the company may be monitoring them.”
He adds that Facebook is “not the only one[ ] engaging in this type of monitoring.”
32
FB000000066
Sue Keogh, How accurate is the Facebook
Like count?, The Wall (April 8, 2011),
available at
http://wallblog.co.uk/2011/04/08/how-accurate-is-the-facebook-like-count/.
Referencing a blog post that suggests that the “accuracy” of the number of “Likes” indicated for a given Facebook page
on the website may be as low as 39%, the author of this article writes that the “Like” count on a third-party website also
reflects additional actions other than affirmative clicks on the “Like” button. The author quotes Facebook’s Developer
Guidance, which explains that “‘[t]he number shown is the sum of . . . [actions including] [t]he number of inbox
messages containing this URL as an attachment.’” She concludes that “the information is out there, if you know where
to look. I might even click Like to help spread the word.”
33
FB000000068
Bianca Bosker, Facebook’s Paid Messages
Test Taxes You For Being Social,
Huffington Post Tech (December 20,
2012), available at
http://www.huffingtonpost.com/bianca-bosker/facebook-messages-test_b_2341521.html/.
This article concerns Facebook’s testing of a new program that permits individuals with whom a user is connected to
“pay to re-route their message from the ‘other’ heap straight to your inbox.” The author notes that this feature “is being
rolled out in conjunction with new filters for Facebook’s messaging system that aim to ensure important messages
don’t go unseen in the ‘Other’ inbox.” She also writes that “[a]s a user who receives no shortage of spam messages,
I’m all for cutting back on clutter or fining advertisers who want to get hold of me in my inbox, uninvited,” but she
labels the payment proposal “anti-social.” The author quotes Facebook’s statement explaining that the test is
“‘designed to address situations where neither social nor algorithmic signals are sufficient. For example, if you want to
send a message to someone you heard speak at an event but are not friends with, or if you want to message someone
about a job opportunity, you can use this feature to reach their Inbox. For the receiver, this test allows them to hear
from people who have an important message to send them.’” Acknowledging that “[k]eeping savvy scammers at bay is
a gargantuan challenge for Facebook,” she argues that “Facebook profits from allowing people—and, most likely,
brands—to take up your time when you made clear you didn’t want them to,” and she expresses concerns that the test is
“more like a fix for Facebook’s profit push than a solution to overcrowded inboxes.”
34
FB000000073
How to get share counts using graph API,
Stack Overflow (June 15, 2012), available
at
http://stackoverflow.com/questions/569927
0/how-to-get-share-counts-using-graph-api/.
On this developer forum, a user requests “a way to get share counts of an URL using graph API.” A user named “Jim
Rubenstein” responds on June 15, 2012, clarifying that Facebook’s Graph API is a way to get the “share count,” which
is “not equal to the one you see on the Like button, since that number is the sum of [items including] [t]he number of
inbox messages containing [the] URL as an attachment.”
35
FB000000077
Jessica Guynn, Facebook Looks To Cash In On User Data, Los Angeles Times (April 17, 2011), available at http://articles.latimes.com/2011/apr/17/business/la-fi-facebook-ads-20110417/.
In this article, the author notes that “[p]rofiles, status updates and messages all include a mother lode of voluntarily provided information,” and that Facebook uses this information “to help advertisers find exactly who they want to reach.” The author describes how Facebook is helping advertisers reach their target audiences with precision: “Facebook doesn’t have to guess who its users are or what they like. Facebook knows, because members volunteer this
information freely—and frequently—in their profiles, status updates, wall posts, messages and ‘likes.’ It’s now
tracking this activity, shooting online ads to users based on their demographics, interests, even what they say to friends
on the site—sometimes within minutes of them typing a key word or phrase.” The author also suggests that Facebook’s
ability to “mine data and sell advertising based on what its members voluntarily share amounts to electronic
eavesdropping on personal updates, posts and messages that many users intended to share only with friends,” but she
concedes that “any information users post on the site—hobbies, status updates, wall posts—is fair game for ad
targeting.” The article concludes by quoting a Facebook user who says she enjoys “receiving ads from merchants”: “‘I
don’t feel any weird privacy thing. We are all putting everything out there already.’”
36
FB000000083
Rove Monteux, Facebook Graph API exploit that let[s] you pump up to 1800 ‘Likes’ in an hour, Rove Monteux (October 5, 2012), available at http://rmonteux.wordpress.com/2012/10/05/facebook-graph-api-exploit-that-lets-you-pump-up-to-1800-likes-in-an-hour/.
In this post, the blogger explains that “simply private messaging an URL to some people on Facebook will increase the ‘like’ count for the given URL.”
37
FB000000087
Bianca Bosker, Facebook Home’s Ultimate
Goal: Ingesting Your Messages,
Huffington Post Tech (April 11, 2013),
available at
http://www.huffingtonpost.com/2013/04/11/facebook-home-messages_n_3063609.html/.
Reporting on the launch of Facebook Home, Facebook’s smartphone software, the author writes that “tech industry
analysts note that the sooner people channel their chatting through Facebook, the sooner Facebook can turn messaging
from communication between friends into a moneymaker that involves brands. More messaging will give Facebook
more data it may use to provide advertisers with personal, personalized ways of interacting with its members.” The
author notes that “Facebook’s Messenger app currently gives people the option to attach location information to each
post,” and that Facebook next “might analyze the content of messages to serve up ads targeted to each conversation,
much like Gmail.” She quotes an analyst who says, “‘[Facebook’s messaging platform] is not just for connecting
people. It’s for connecting brands, too.’” The author also suggests that “[w]hether users would accept advertising via
messages—private, personal and traditionally off-limits to brands—remains to be seen,” quoting an analyst who says
that “‘[t]here’s a certain creepy factor to that, but users are getting much more comfortable trading privacy for
convenience.’”
38
FB000000097
Emil Protalinski, Facebook Confirms It Is
Scanning Your Private Message For Links
To Increase Like Counters, The Next Web
(October 4, 2012), available at
http://thenextweb.com/facebook/2012/10/0
4/facebook-confirms-it-is-scanning-your-private-messages-for-links-so-it-can-increase-like-counters/.
Here, the author writes that Facebook “is monitoring your private messages for links that have Like buttons and should
be increased,” and he quotes a Facebook spokesperson’s statement about the issue: “We did recently find a bug within
our social plugins where at times the count for the Share or Like goes up by two, and we are working on [a] fix to solve
the issue now. To be clear, this only affects social plugins off of Facebook and is not related to Facebook Page likes.
This bug does not impact the user experience with messages or what appears on their timelines.” The author indicates
that while this was “news” to him, “this was clearly the case before as on the Like button web page over on Facebook
Developers, the social networking giant says the number shown on a Like button is the sum of [items including] [t]he
number of inbox messages containing this URL as an attachment.” The author writes that he had “known for a while
that the Like button isn’t a counter of just Likes,” but he opines that “private messages . . . have privacy questions
attached to them.” He notes that Facebook confirmed that “[w]hen the count is increased via shares over private
messages, no user information is exchanged, and privacy settings of content are unaffected. Links shared through
messages do not affect the Like count on Facebook Pages.” In an update to the article, the author quotes Facebook’s
explanation of the situation: “‘Our systems parse the URL being shared in order to render the appropriate preview, and
to also ensure that the message is not spam.’”
39
FB000000104
Alicia Eler, Facebook Is Using Your Data
Whether You Like It Or Not, Read Write
(November 28, 2011), available at
http://readwrite.com/2011/11/28/facebook_
is_using_your_data_whether_you_like_it_
or/.
Here, the author discusses the implications a European Commission Directive that purportedly “ban[s] targeted
advertising unless users specifically say they want it” may have on Facebook. In response to the question, “Why isn’t
this happening in America?,” the author writes, “[Because] [a]ll 800 million Facebook users agree to let the company
use their personal information.” She further writes that “Facebook has information about a user’s friends, family,
education background in addition to Facebook such as ‘likes’ and everything that gets posted to Facebook Walls . . .
Messages and ‘chats’ are stored, too, even if the user deletes them.” She also highlights that “Facebook denies tracking
peoples’ behavior to serve advertising . . . denies selling users’ personal information to third parties . . . claims that
advertisers only see ‘anonymous and aggregate information,’ using that to serve up targeted ads . . . [and] does not
target individual users.” The author also notes that “with an IPO in the works, and Facebook’s move toward becoming
self-sufficient, there’s no denying that advertising on the site has increased,” concluding by noting recent Federal Trade
Commission scrutiny of Facebook’s practices.
40
FB000000106
Facebook Launches Messaging System
With In-Bound Message Filter, Out Law
(November 10, 2010), available at
http://www.out-law.com/page-11555/.
This article reports on Facebook’s launch of the Messages product and its features. The author quotes Facebook’s
statement that “‘[i]t seems wrong that an email message from your best friend gets sandwiched between a bill and a
bank statement . . . It’s not that those other messages aren’t important, but one of them is more meaningful. With new
Messages, your Inbox will only contain messages from your friends and their friends. All other messages will go into
an Other folder where you can look at them separately.’” Noting that “[c]ritics of Facebook’s past approaches to user
privacy will be assessing the system closely,” the author quotes a researcher who states that “‘Facebook announced it
will not utilize the content of users’ personal messages to target advertising . . . This is surprising, considering doing so
is typical among web-based email clients; both Gmail and Yahoo Mail scan users’ messages for keywords in order to
better serve relevant advertising.’” The author concludes by quoting a technology consultant who opines that
“‘Facebook would be able to track vital information through the Messages system.’”
41
FB000000109
Ellis Hamburger, How To Find All The
Private Messages Facebook Is Hiding From
You, Business Insider (December 12,
2011), available at
http://www.businessinsider.com/facebook-messages-other-folder-2011-12/.
This article explains how to access the “Other Messages” folder in Facebook Messages, noting that “Facebook often
groups messages from people you aren’t ‘Friends’ with into a spam-box-evoking ‘Other Messages’ folder.” The author
writes, “While most messages that get filtered into your Other folder are in fact spam, it’s definitely worth digging
through once in a while.”
42
FB000000110
Kashmir Hill, Facebook Scans Private Messages To Hand Out Public ‘Likes’, Forbes (October 4, 2012), available at http://www.forbes.com/sites/kashmirhill/2012/10/04/facebook-scans-private-messages-to-hand-out-public-likes/.
The author writes that Facebook “scans [] private messages to friends, and when it sees a link to a ‘Likeable’ page, it doles out ‘Likes’ accordingly.” The article summarizes the findings of the Wall Street Journal article on this topic written by Ashkan Soltani and states, “We already know that Google scans our Gmail to target us with ads and that Facebook scans our inboxes looking for sexual predators and child porn, but many users may not have realized that the links they were exchanging privately resulted in a public—though anonymous—endorsement of said links.” The
author includes a statement by a Facebook spokesperson on the issue: “‘Many websites that use Facebook’s “Like”,
“Recommend”, or “Share” buttons also carry a counter next to them. This counter reflects the number of times people
have clicked those buttons and also the number of times people have shared that page’s link on Facebook. When the
count is increased via shares over private messages, no user information is exchanged, and the privacy settings of
content are unaffected. Links shared through messages do not affect the Like count on Facebook Pages.’” The article
also includes an update that reads, “A spokesperson also emphasizes that this [is] a third-party Social Plug-in version of
the ‘Like’ button and that it is meant to reflect engagement rather than endorsement.”
43
FB000000119
Ken Yeung, Facebook Updates Messages
Feature With Filtering, Tests A Service To
Let People Pay To Send Them, The Next
Web (December 20, 2012), available at
http://thenextweb.com/facebook/2012/12/2
0/facebook-messages-now-with-filters/.
Here, the author reports on Facebook’s addition of two “filters to help solve the problem of finding messages that
should be seen and seeing those that shouldn’t” to the Messages product. The author notes that “[t]here are many
signals that go into determining what gets through to a user’s Message inbox, but one new signal Facebook is testing is
allowing some users the ability to pay in order to help get their messages delivered, regardless of friendship status on
the network.” He mentions the “Other” folder in Facebook Messages that “basically acts as the catch-all for all
communications that it deems to have low relevancy,” positing that “[i]t’s too bad that a few innocent messages get
tagged and removed because of this filter.” He concludes by explaining that the settings for the new filters will be
“visible right with the Messages screen.”
44
FB000000148
Doug Gross, How You Help Facebook
Make Millions, CNN Tech (May 16, 2012),
available at
http://www.cnn.com/2012/05/16/tech/socia
l-media/facebook-users-ads/.
In this article, the author writes that the following are the “building blocks” of Facebook as “a multibillion-dollar
company”: “Every post you ‘like.’ Every friend you add or fan page you join. Every place you check in, and every
Web page you recommend.” The author writes that “Facebook’s unprecedented advertising advantage is built upon the
service it provides. As users interact with the site, they gradually build a fuller and fuller picture of themselves. That,
in turn, lets Facebook sell advertisers on its ability to put their product in front of the people most likely to be
interested.” He notes that Facebook’s advertising model has “made some folks antsy,” quoting an Associated
Press/CNBC poll that found that “three out of five users say they have little or no faith that the company will protect
their personal information.” The author concludes by quoting an analyst who “‘expects’” that Facebook’s “‘data-driven model [will] keep making money well into the future.’”
45
FB000000180
Phil Villarreal, Rescue Messages From
Facebook’s De Facto Spam Filter,
Consumerist (December 11, 2011),
available at
http://consumerist.com/2011/12/15/rescue-messages-from-facebooks-de-facto-spam-filter/.
In this article, the author discusses Facebook’s “Other” folder in the Messages product, writing that “[w]hen Facebook
thinks you don’t particularly want to read a message that’s sent your way, it redirects it into” that folder. He writes that
“[s]ome users forget to check the box regularly, and others may not even be aware that they have it,” and he provides
information about how to access the folder.
46
FB000000194
Greg Finn, The Formula Behind The
Facebook ‘Like’ Number, Marketing Land
(October 17, 2012), available at
http://marketingland.com/the-formula-behind-the-facebook-like-number-24069/.
Here, the author reports on the “uproar” surrounding reports that “messages were not only being crawled, but also used
towards the overall ‘Like data’ of a page.” He writes that “[o]ne of the important lessons that marketers learned from
the situation was that the “‘Like count’ wasn’t really about likes, rather other interactions (and messaging) that occurs
on Facebook.” He examines what goes into “Facebook Like data,” quoting Facebook’s Developer site and noting that
“four different variables make up the Like number [including] [t]he number of inbox messages containing this URL as
an attachment.” He writes that a Facebook spokesperson confirmed that the listed elements are “all counted into the
overall Facebook Like data.”
47
FB000000217
JVG, Facebook Tweaks Messages With
Inbox Filters and Tests Pay-to-Deliver
Option, Venture Beat (December 20,
2012), available at
http://venturebeat.com/2012/12/20/faceboo
k-messages-inbox/.
Here, the author reports on Facebook’s announcement that it was “tweaking the Facebook Messages inbox experience
to better ensure that relevant messages get to your inbox.” The author writes that “[t]he new inbox filters, which are
rolling out globally, are Facebook’s way of correcting a broken system. The company currently routes messages to your
inbox or other folder based on your settings, but it has found that it pushes too many ‘high signal’ messages (read as:
messages you probably want) to go to your Other folder, where they likely go unseen. To fix the problem, Facebook has
created two filters, basic and strict, that will allow certain types of messages to reach your inbox that otherwise would
not . . . Both options, however, use the ‘mostly’ terminology to allow for instances when Facebook puts its algorithmic
magic to work and determines that a high signal message should get through.” The author also reports that Facebook is
launching a “small test” to allow people that are not your Facebook friends to pay a fee to send messages directly to your
inbox.
48
FB000000237
Oliver Chiang, Facebook Messages Isn’t a Gmail Killer—And That’s the Problem, Forbes (November 15, 2010), available at http://www.forbes.com/sites/oliverchiang/2010/11/15/facebook-messages-isnt-a-gmail-killer-and-thats-the-problem/.
In this article, the author reports on the launch of the Facebook Messages product, arguing that it’s a “problem” that the product is not a “Gmail killer.” He provides a “quick rundown” of the features of Facebook Messages, writing that the product “takes all these bits of conversation and keeps a collective conversation history” and “filters messages based on your Facebook social graph.” He notes, for example, that “[t]raditional spam emails will end up in Junk” and that Facebook users “will have control over this filtering too” because they can “move it manually [and] [t]he system will remember your choices and change your filtering preferences accordingly.” The author writes that the “main problem”
about Facebook Messages is that it “has no way of telling if your mother is sending an important email about your
cousin’s wedding, or a forwarded email about the billionth funny cat video ever made.”
49
FB000000276
Jay Yarow, How to Find Facebook
Messages that Facebook is Hiding From
You, Business Insider (June 15, 2012),
available at
http://www.businessinsider.com/facebook-messages-spam-filter-2012-6/.
Here, the author writes that Facebook “has a pretty aggressive spam filter to keep messages from creepers, and trolls
from hitting your Facebook message inbox,” opining that the spam filter is “sometimes too aggressive, and grabs
messages that aren’t spam.” The author also provides instructions about how to check Facebook messages in the
“Other” folder.
50
FB000000312
Donna Tam, To Facebook, A Shared Link
is As Good as a Like, Cnet (October 4,
2012), available at
http://www.cnet.com/news/to-facebook-a-shared-link-is-as-good-as-a-like/.
Here, the author reports that “[a] recent bug adds two Likes to the count instead of one, and Facebook said it’s working
to fix that.” The author writes that “the feature may rankle some users who don’t want to be part of an overinflated
counter” or others who “feel violated” that Facebook “know[s] what you’re sending a Friend.” She notes Facebook
says “this feature doesn’t affect Like counts on Facebook pages and it’s not an invasion of privacy since the Likes on
the plug-ins are anonymous,” quoting a Facebook spokesperson’s statement on the issue in full. She writes that a
Facebook representative also confirmed that “Facebook does scan any links that pass through the network to look for
spam,” including “links you send to a friend through messages,” and that “Facebook automatically reviews the links
before generating a link preview.” The author concludes by noting that “even if you were sharing a link to show a
friend something you don’t like, you’d still be adding to the page’s Like numbers.”
51
FB000000314
Mark Langshaw, Facebook Adding Likes on Users’ Behalf, Says Report, Digital Spy (October 5, 2012), available at http://www.digitalspy.com/tech/news/a410470/facebook-adding-likes-on-users-behalf-says-report.html#~p6spSJD5SKvz21/.
This author reports that a “security expert discovered that sending a web address to a friend automatically adds two likes to that page, suggesting that [Facebook] is scanning private messages.” The author notes that “Facebook has responded to the report’s findings and issued a statement denying that privacy information has been exposed,” quoting certain portions of Facebook’s official statement, including the following: “Absolutely no private information has been exposed . . . . Each time a person Shares a URL to Facebook, including through messages, the number of Shares displayed on the social plugin for that website increases. Our systems parse the URL being shared in order to render
the appropriate preview, and to also ensure that the message is not spam. These counts do not affect the privacy
settings of content, and URLs shared through private messages are not attributed publicly with user profiles.” The
author concludes that “[a]lthough Facebook also stressed that the additional likes are anonymous and will not appear
on users’ timelines, critics have pointed out that people who share pages to highlight negative content are making the
site appear more popular.”
52
FB000000335
Liz Klimas, Why is the Privacy of Personal
Facebook Messages Being Called Into
Question—Again?, The Blaze (October 5,
2012), available at
http://www.theblaze.com/stories/2012/10/0
5/why-is-privacy-of-personal-facebook-messages-being-called-into-question-again/.
This author writes that other publications have reported that “including links in private messages [on Facebook]—if
these links had a ‘like’ button associated with them—would increase the “likes” on that actual page by two.” The
author writes that “Facebook has responded saying that there was a bug identified in the system that was accidentally
counting one ‘like’ or ‘share’ of a link or post as two” and was “working to fix it.” She writes that “Facebook
emphasized that ‘no private information has been exposed,’” and that this means that “if you receive a message
containing a link that has a ‘like’ button, you are not automatically ‘liking’ this item on your Timeline.” The author
continues by noting that “it’s probably news for many” that the social plugin counter “is not just measuring clicks but
sharing content as well,” and she quotes another publication that reported that “‘[O]n the Like button Web page over
on Facebook Developers, the social networking giant says the number shown on a Like button is the sum of: [t]he
number of likes of this URL; [t]he number of shares of this URL (this includes copy/pasting a link back to Facebook);
[t]he number of likes and comments on stories on Facebook about this URL; [t]he number of inbox messages
containing this URL as an attachment.’” The author concludes by quoting a reporter who “writes that Facebook has
‘not crossed a line with this latest news, any more than they have on hundreds of other occasions.’”
53
FB000000347
Greg Finn, Your Private Facebook
Messages Aren’t So Private: Shared Links
Count Toward ‘Like’ Data, Marketing
Land (October 8, 2012), available at
http://marketingland.com/your-private-facebook-messages-arent-so-private-as-links-count-towards-like-data-23400/.
Here, the author reports that Facebook’s “Like button is an aggregate score from a variety of Facebook actions,
including links shared within private messages.” He continues, “Facebook did confirm that the issue of double counts
was a bug, but did also confirm that shared messages do count towards the overall ‘like’ data,” and he quotes the
Facebook Developer site that “clearly states the following about Like buttons”: “‘The number shown is the sum of:
[t]he number of likes of this URL; [t]he number of shares of this URL (this includes copy/pasting a link back to
Facebook); [t]he number of likes and comments on stories on Facebook about this URL; [t]he number of inbox
messages containing this URL as an attachment.’” He reports that “[t]he fact that private shares gave an endorsement
(even if an anonymous one) drew a bit of an uproar” because users sharing a link of a product they don’t like “will still
be counted as a ‘like.’” He concludes by quoting Facebook’s statement on the issue.
54
FB000000361
Louis Goddard, Facebook Analyzes
Relationships and Chats to Flag Up Sexual
Predators, The Verge (July 13, 2012),
available at
http://www.theverge.com/2012/7/13/31564
99/facebook-paedophile-scanning/.
In this article, the author reports that “Facebook automatically scans posts and chat logs for criminal activity, using big
data processing techniques similar to those used in targeting advertising to determine the most vulnerable users.” He
writes that Facebook’s “scanning tools use factors such as mutual friends, past interaction, distance and age
difference—alongside simple phrase searches—to flag potentially nefarious conversations for human moderators” and
“rely on archives of previous conversations that are known to have led to sexual assaults, identifying patterns and
searching for similar ones.” The author quotes Facebook’s then-Chief Security Officer Joe Sullivan, who says,
“‘We’ve never wanted to set up an environment where we have employees looking at private communications, so it’s
really important that we use technology that has a very low false-positive rate.’” The author writes, “Privacy issues
aside, it would be practically impossible for human moderators to effectively trawl through the vast amount of data
generated by more than 900 million users each day.”
55
FB000000388
Fahmida Y. Rashid, Facebook Scans Chats for Criminal Activity, PC Mag (July 13, 2012), available at http://securitywatch.pcmag.com/security/300288-facebook-scans-chats-for-criminal-activity/.
Here, the author reports that “Facebook has technology in place to monitor user conversations for suspicious activity and notify police when necessary.” She writes that Facebook’s “scanning technology monitors chats for words or phrases that may signal that something is wrong, such as personal information being exchanged or explicit language being used.” The author writes that Facebook stated, “‘We’ve never wanted to set up an environment where we have employees looking at private communications, so it’s really important that we use technology that has a very low false-
positive rate,’” and she notes that “Facebook security employees don’t see any of the conversations until the scanning
technology actually flags the exchange.” She reports that one security advisor found the news to be “‘scary and more
than a bit surprising,’” but the author opines that “it’s nice to know that Facebook is keeping a distant eye on chat logs
for criminal behavior.” She provides “tips” for people to protect themselves on Facebook and other social networking
sites, and she writes that “Facebook relying on software to pre-scan chats protects the company from privacy concerns
that someone is monitoring all conversations.”
56
FB000000397
Walter Pacheco, Facebook Scans
Conversations For Criminal Activity,
Orlando Sentinel (July 13, 2012),
available at
http://www.orlandosentinel.com/business/t
echnology/os-facebook-scans-criminal-activity-20120713-post.html/.
Here, the author cites a Reuters report stating that “Facebook is scanning users’ chats and posts for possible criminal
activity.” The author writes that Facebook’s then-Chief Security Officer Joe Sullivan stated that “Facebook monitors
conversations for words and phrases that suggest potential criminal activity, as well as the exchange of personal
information between users with a wide age gap,” and that Facebook software “‘searches for words often found in the
chat records of convicted criminals, including sex offenders, who used social media to find their victims.’”
57
FB000000405
Justin Reynolds, Facebook Monitors
Potentially Illegal Posts, Chats, Weston-Reddington-Easton Patch (July 13, 2012),
available at
http://patch.com/connecticut/weston-ct/facebook-eyes-posts-about-illegal-activity/.
Here, the author reports that, “[u]sing data recognition software, Facebook employees monitor certain users’ posts and
chats, scanning them for potentially illegal activity which in some cases has led the social media giant to contact
police.” The author writes that CNET reports that Facebook “isn’t actively monitoring all communications on
Facebook, as it wants its users to maintain their privacy” and that “[t]he software the company uses to analyze
communications which are potentially illegal has a low false-positive rate.” The author concludes by noting that
Facebook’s “scanning program looks for certain phrases found in previously obtained chat records from criminals,
including sexual predators,” and that “[t]he relationship analysis and phrase material have to add up before a Facebook
employee actually looks at communications and makes the final decision of whether to ping the authorities.”
58
FB000000416
Jillian Ryan, Your Facebook Chats are Being Monitored, Find out Why: The Social Media Privacy Report, The Private WiFi Blog (July 20, 2012), available at http://www.privatewifi.com/your-facebook-chats-are-being-monitored-find-out-why-the-social-media-privacy-report/.
This article reports that “[w]hat you say in your private chats and messages on Facebook may not be as private as you think,” citing a Reuters report that indicated that Facebook “employs a mums-the-word technology that scans posts and chats for criminal activity.” The author notes that Facebook’s “monitoring came to light” earlier that year when Facebook’s software detected an alleged sexual predator. She quotes Reuters’s description of Facebook’s efforts for detecting criminal activity, which “‘generally start with automated screening for inappropriate language and exchanges of personal information, and extend to using the records of convicted pedophiles’ online chats to teach the software what to seek out.’”
59
FB000000418
Lee Bell, Facebook Scans Private Chats
and Posts for Criminal Activity, The
Inquirer (July 13, 2012), available at
http://www.theinquirer.net/inquirer/news/2191599/facebook-scans-private-chats-and-posts-for-criminal-activity/.
This article reports that “Facebook scans private chats and posts for criminal activity,” noting that Facebook’s software
for “‘scanning postings and chats’” detected an alleged sexual predator. The article quotes a security consultant who
states, “‘It shouldn’t surprise anybody that Facebook is trying to make its site a safer place by monitoring for illegal and
suspicious behaviour which might bring it into disrepute. Obviously we have to hope that Facebook acts responsibly,
and puts measures in place to prevent inappropriate monitoring - or risk a backlash from users.’”
60
FB000000423
Kate Tummarello, Facebook Knows When
You’re Chatting About Your Illegal
Activities, DCInno (July 13, 2012),
available at
http://dcinno.streetwise.co/2012/07/13/face
book-knows-when-youre-chatting-about-your-illegal-activities/.
In this article, the author writes that “[a]ccording to online reports, Facebook uses a software that screens private chats
to determine if participants are discussing illegal activities.” She quotes a Mashable article on this topic that explains
how Facebook’s software works, and she notes that Facebook commented on the issue as follows: “‘We’ve never
wanted to set up an environment where we have employees looking at private communications, so it’s really important
that we use technology that has a very low false-positive rate.’” She ends by noting that “[b]y keeping most of the
private chat records away from employees, Facebook is protecting itself from some privacy advocates.”
61
FB000000358
Will Oremus, Facebook Monitors Your Posts and Chats to Catch Sexual Predators, Slate (July 17, 2012), available at http://www.slate.com/blogs/future_tense/2012/07/17/online_privacy_facebook_monitors_your_posts_chats_to_catch_sexual_predators.html/.
Here, the author opens by asking, “Ever wonder if Facebook is reading your posts?” He answers, “Well, it is—or, its computers are, at least,” quoting a Reuters article that recounted a case in which Facebook’s software detected a man in his thirties allegedly trying to set up a meeting with a 13-year-old girl for sex. The author writes that “[i]n Facebook’s case, the scanning hasn’t stirred outrage—probably because it seems to be focused on catching sexual predators.” He concludes by writing, “It seems clear that this technology has the potential to do some good. But that shouldn’t blind us to the fact that it represents a further erosion of our online privacy, one more serious than selling our personal information to advertisers.”
62
FB000000364
Emil Protalinski, Facebook Scans Chats and Posts for Criminal Activity, CNet (July 12, 2012), available at http://www.cnet.com/news/facebook-scans-chats-and-posts-for-criminal-activity/.
The author of this article reports that “Facebook has added sleuthing to its array of data-mining capabilities, scanning your posts and chats for criminal activity.” He writes that Facebook’s “scanning program looks for certain phrases found in previously obtained chat records from criminals, including sexual predators,” noting that the “relationship analysis and phrase material have to add up before a Facebook employee actually looks at communications and makes the final decision of whether to ping the authorities.” The author also writes that “details of the tool are still scarce,”
and that “Facebook likely wants to avoid discussing the existence of the monitoring technology in order to avoid
further privacy concerns” because “[m]any users don’t like the idea of having their conversations reviewed, even if it’s
done by software and rarely by Facebook employees.”
63
FB000000367
Alex Fitzpatrick, Facebook Monitors Your
Chats for Criminal Activity, Mashable
(July 12, 2012), available at
http://mashable.com/2012/07/12/facebook-scanning-chats/.
This article notes that “Facebook and other social platforms are watching users’ chats for criminal activity.” The author
writes that Facebook’s “screening process begins with scanning software that monitors chats for words or phrases that
signal something might be amiss,” and that Facebook’s “scanning program is also ‘smart’ [because] it’s taught to keep
an eye out for certain phrases found in the previously obtained chat records from criminals including sexual predators.”
He suggests that “[k]eeping most of the scanned chats out of the eyes of Facebook employees may help Facebook
deflect criticism from privacy advocates, but whether the scanned chats are deleted or stored permanently is yet
unknown.” He ends by quoting Facebook’s then-Chief Security Officer Joe Sullivan, who says, “‘We’ve never wanted
to set up an environment where we have employees looking at private communications, so it’s really important that we
use technology that has a very low false-positive rate.’”
64
FB000000095
Emil Protalinski, Facebook is hiding your
messages from you, ZDNet (December 10,
2011), available at
http://www.zdnet.com/blog/facebook/faceb
ook-is-hiding-your-messages-from-you/6017/.
This article concerns Facebook’s “Other” Messages folder and Facebook’s filtering mechanism for the Messages
product. The author writes that the “Other” Messages folder is “supposed to work as a junk/spam folder,” and that
while he mainly received “mass” and “spam” messages in his own folder, other people were “really annoyed that
Facebook classified some [messages] incorrectly because they missed important information.” The article quotes
Facebook’s description of the new filtering service: “‘It seems wrong that an email from your best friend gets
sandwiched between a bill and a bank statement . . . It’s not that those other messages aren’t important, but one of them
is more meaningful. With new Messages, your Inbox will only contain messages from your friends and their friends.
All other messages will go into an Other folder where you can look at them separately.’” The author writes, “I have no
problem with such a folder existing: even my friends who say they missed an important message admit that most of
their messages in there are not worth their time,” and he concludes by suggesting that Facebook make the “Other
Messages” folder more “obvious.”
65
FB000000123
Elizabeth Weingarten, Furious At
Facebook Again, Slate (December 9,
2011), available at
http://www.slate.com/articles/technology/te
chnology/2011/12/facebook_s_other_mess
ages_mail_you_are_probably_missing.html
/.
In this article, the author recounts a story where Facebook filtered a message into the “Other” Messages folder that she
had wanted to receive. She indicates that she spoke with a Facebook representative, who explained Facebook’s
filtering technology and that its move to a “Social Inbox” in November 2010 was a means for “sift[ing] out
‘meaningful’ messages from less meaningful ones.” The author quotes a Facebook statement about the “Other”
Messages folder: “‘It seems wrong that an email message from your best friend gets sandwiched between a bill and a
bank statement. It’s not that those other messages aren’t important, but one of them is more meaningful. With new
Messages, your Inbox will only contain messages from your friends and their friends. All other messages will go into
an Other folder where you can look at them separately.’” She concludes by asking, “So do I curse Facebook because it
hid [the] messages, or praise it for allowing him to get in touch? I’m going to do both. Thanks, Facebook, for helping
this nice man return my laptop. But please try to explain your services better. I suspect many people would be
grateful.”
66
FB000000139
Dave Copeland, Facebook’s Email Scanning Isn’t A Privacy Issue, It’s A Credibility Issue, Read Write (October 5, 2012), available at http://readwrite.com/2012/10/05/facebooks-email-scanning-isnt-a-privacy-issue-its-a-credibility-issue/.
Reporting that “Facebook confirmed [that week] that it scans private messages for links and records them as likes, according to the Wall Street Journal and other news outlets,” the author opines that this practice injures Facebook’s credibility. The author writes that “Facebook has not kept secret its scanning of private messages for references to criminal activity,” but he suggests that it is “new [ ] that it also looks for links and records those at likes,” which he claims “gives the appearance that more people are liking more things on the social network.” Quoting Facebook’s statement that the “scanned links were counted as engagement, not endorsement,” the author opines that Facebook’s statement “misses the point” because “Facebook’s practice of scanning messages and counting links as likes isn’t a
privacy issue.” He writes that “[i]t’s common knowledge that what users do online—even in so-called private
messaging—is potentially public.” Instead, the author opines, “Facebook’s activity raises a credibility issue” because
“[i]t shows that the company is fudging the numbers when it comes to advertising.”
67
FB000000209
Jim Edwards, This Flaw In Facebook Lets
You Create As Many Fake Likes As You
Want, Business Insider (October 5, 2012),
available at
http://www.businessinsider.com/this-flaw-in-facebook-lets-you-create-as-many-fake-likes-as-you-want-2012-10/.
Here, the author writes that his publication previously reported that Facebook “quietly scans your messages, searching
for URLs that you’ve sent to your friends [and] [w]hen it sees one, it increases the number of Facebook Likes on that
URL.” He notes that while other publications have “portrayed this as a privacy invasion,” he believes that “more
importantly, it appears to be a massive source of bogus Likes.” The author goes on to describe the Facebook bug that
increased the social plugin count on third-party websites by two instead of one, and he includes an example of an
instance where he attempted to “generate some fake clicks.” He quotes a Facebook spokesperson’s statement
acknowledging that the company “did recently find a bug with our social plugins where at times the count for the Share
or Like goes up by two” and that the company was working on a “fix” for that issue, and the author notes that “those
likes may actually reflect negative consumer sentiment.”
68
FB000000226
Social Networks: Can Robots Violate User
Privacy?, High-Tech Bridge (August 7,
2013), available at
https://www.htbridge.com/news/social_net
works_can_robots_violate_user_privacy.ht
ml/.
This article indicates that High-Tech Bridge conducted “a simple technical experiment to verify how the 50 largest
social networks, web services and free emails systems respect—or indeed abuse—the privacy of their users.” The
article purportedly confirms that Facebook, Twitter, Google+, and Formspring are crawling URLs, with Facebook
allegedly crawling a “[p]rivate message with a link.” The article “tak[es] into consideration that some of the services
may have legitimate robots (e.g. to verify and block spam links) crawling every user-transmitted link automatically,”
and it notes that High-Tech Bridge “created a robots.txt file on our web server that restricted bots accessing the server
and its content.” There are several comments to the article written by readers, one of which notes, “‘[N]ot surprised at
all Facebook did it. Just take note that Facebook do[es] scrap[e] URLs . . . to construct previews . . . [and] to run those
URLs against a malicious signaled table of URLs for the ‘protection’ of their production and users. I can be wrong but
I see no space here for a lawsuit.’”
69
FB000000289
Facebook ‘Likes’ Automatically Added
Without User-Clicks, BBC News
Technology (October 4, 2012), available
at http://www.bbc.com/news/technology-19832043/.
This article reports that a security researcher recently “found that simply sending a web address to a friend using
Facebook’s private messaging function would add two likes to that page.” The article quotes portions of a statement
from a Facebook spokesperson confirming that Facebook discovered a bug where the Share or Like goes up by two and
that Facebook was “‘working on a fix to solve the issue’” and that “‘no user information is exchanged.’” The article
continues by noting that “[i]n documentation relating to the function of the like button, Facebook details four criteria
which cause the likes number to increase - only one of which involves clicking the like button.” It further reads,
“Facebook stressed that the added likes were anonymous, and would not appear on the user’s timeline.”
70
FB000000296
Craig Lloyd, Facebook Auto-Liking Pages
for Users Without Permission [Updated],
Slash Gear (October 4, 2012), available at
http://www.slashgear.com/facebook-auto-liking-pages-for-users-without-permission-04250415/.
In this article, the author cites reports indicating that “Facebook is scanning its users’ private messages and searching
for links to Facebook fan pages . . . [and] supposedly automatically likes the pages for you without asking for your
permission to do so.” The author writes that “it only seems that it increases the Like count of a page, and doesn’t
actually ‘like’ the page on your behalf,” noting that “some users are reporting that it actually does like the page for you
without your permission.” The author opines that “this can be a huge problem,” but he indicates that “[s]canning itself
is nothing new” and notes that “Gmail does it to provide its users with targeted ads.” The author concludes by noting
that “this auto-liking debacle takes it to another level that’s a little over the line and unnecessary.” He updates the
article to include a Facebook spokesperson’s statement on the issue, which indicates, among other things, that
“‘[a]bsolutely no private information has been exposed and Facebook is not automatically Liking any Facebook Pages
on a user’s behalf” and that “[w]hen the count is increased via shares over private messages, no user information is
exchanged, and privacy settings of content are unaffected.’”
71
FB000000298
Ryan Singel, Juking Your Facebook ‘Like’
Stats Is As Easy As Sending a Message,
Wired (October 4, 2012), available at
http://www.wired.com/2012/10/facebook-likes-messages/.
The author of this article reports that people “looking to artificially inflate their Facebook stats . . . can just simply send
a raft of private messages that include a link to your page, and Facebook will add +2 to your page’s ‘Like’ count for
each message.” He acknowledges that “[i]t’s long been known that Facebook scans internal messages for spam and
security risks—and that it blocks users from sending links to torrent sites such as The Pirate Bay,” but that it’s “never
been clear how much data mining [Facebook] is doing” of users’ Facebook messages. He summarizes Ashkan
Soltani’s Wall Street Journal article on the topic, and then writes that Facebook’s behavior is “not a bug,” but is
“something actually noted in the documentation for developers.” He updates his article to include Facebook’s
spokesperson’s statement on the issue, which reads as follows: “‘Absolutely no private information has been exposed
and Facebook is not automatically Liking any Facebook Pages on a user’s behalf. Many websites that use Facebook’s
“Like”, “Recommend”, or “Share” buttons also carry a counter next to them. This counter reflects the number of times
people have clicked those buttons and also the number of times people have shared that page’s link on Facebook.
When the count is increased via shares over private messages, no user information is exchanged, and privacy settings
of content are unaffected. Links shared through messages do not affect the Like count on Facebook Pages.’”
72
FB000000340
Ed Oswald, Facebook Private Messages
Trigger ‘Likes’ Without Telling, TechHive
(October 5, 2012), available at
http://www.techhive.com/article/2011278/f
acebook-private-messages-trigger-likes-without-telling.html/.
In this article, the author notes that “[t]he next time you share a link with a Facebook friend via private message, be
aware that you’re anonymously ‘liking’ that page publicly as well.” He writes that “[w]hile this may come as a
surprise, evidence that the company was scanning our messages for these likable links has been public for at least a
week,” citing Facebook’s “September 27 FAQ for developers that [states that] ‘the number of inbox messages
containing this URL as an attachment’ is a factor in counting the number of likes that shows up on a page’s Like
Button,’” among other factors. The author writes that “[w]hile this information seems to have been public for some
time, those of us who aren’t developers likely had no clue of Facebook’s actions,” but that “given how Facebook uses
our activities to further its own business interests, this practice shouldn’t surprise us.”
73
FB000000342
Lisa Vaas, Facebook Scans Private
Messages to Inflate the ‘Like’ Counter on
Websites, Naked Security (October 8,
2012), available at
https://nakedsecurity.sophos.com/2012/10/
08/facebook-scans-private-messages-like-counter/.
Here, the author writes that “Facebook has confirmed that it’s scanning private Facebook messages to boost ‘Like’
counters on third party websites.” She continues by noting that Facebook “confirmed that they had discovered a bug
affecting Like counts” that “concerned inflating page counts by two Likes instead of one.” She reports that “[t]he fact
that this is [sic] function is baked into Facebook code as opposed to being a potential fluke of privacy transgression is
confirmed . . . on the Facebook Developers page, which states that a websites’ number of Likes is the sum of: [t]he
number of likes of this URL; [t]he number of shares of this URL (this includes copy/pasting a link back to Facebook);
[t]he number of likes and comments on stories on Facebook about this URL; [t]he number of inbox messages
containing this URL as an attachment.” The author confirms that “Facebook’s scanning of private messages isn’t new”
and that “[t]he power of the social media mammoth’s data mining technology when applied to private messages came
to light in March, when Facebook was credited with quashing potential child molestation between a 13-year-old girl
and a man in his 30s who were having a private Facebook conversation about sex.” She writes that Facebook’s “data
mining technology scans postings and chats for criminal activity, analyzing relationships to find suspicious
conversations between unlikely pairings.” Noting that “[e]mail providers such as Gmail also have a long-standing
practice of reviewing messages to weed out spam and to target ads,” she opines that “[t]hose are reasonable uses of
data mining technology, but it’s disconcerting to find what might be yet more intrusive forays into allegedly private
messages.” For that reason, she writes that “it’s a bit of a relief to learn that Facebook later clarified the privacy issue,
saying that ‘absolutely no private information’ is exposed in the private-message-derived Like inflation.” She
concludes by writing, “Be prepared to add to your subjects’ Facebook counter glow, whether you want to or not, if you
send URLs via private Facebook conversations.”
74
FB000000353
Joseph Menn, Social Networks Scan for Sexual Predators, With Uneven Results, Reuters (July 12, 2012), available at http://www.reuters.com/article/2012/07/12/us-usa-internet-predators-idUSBRE86B05G20120712/.
This article discusses “Facebook’s extensive but little-discussed technology for scanning postings and chats for
criminal activity,” which flagged a sexual predator chatting with a 13-year-old girl. The author writes that “Facebook is
among the many companies that are embracing a combination of new technologies and human monitoring to thwart sex
predators,” noting that “[s]uch efforts generally start with automated screening for inappropriate language and
exchanges of personal information.” He writes that “[l]ike most of its peers, Facebook generally avoids discussing its
safety practices to discourage scare stories, because it doesn’t catch many wrongdoers, and to sidestep privacy concerns
[because] [u]sers could be unnerved about the extent to which their conversations are reviewed, at least by computer
programs.” The author continues by noting that “[i]n part because of its massive size, Facebook relies more than some
rivals on such technology.” He quotes Facebook’s then-Chief Security Officer Joe Sullivan, who says, “‘We’ve never
wanted to set up an environment where we have employees looking at private communications, so it’s really important
that we use technology that has a very low false-positive rate.’”
75
FB000000402
Chi Ibe, Nowhere to Hide: Facebook Monitors Your Chats, YNaija.com (July 13, 2012), available at http://ynaija.com/the-internet-has-no-privacy-settings-facebook-monitors-your-chats/.
Writing that “[r]eports have revealed that Facebook and other social platforms are watching users’ chats” to “monitor
criminal activity and notifying police if any suspicious behaviour is detected,” the author asks, “[W]hat ever [sic]
happened to good old privacy?” She indicates that “a number of social networking sites have set up a screening
process which works by a scanning software that monitors chats for words or phrases that signal something might be
amiss.”
76
FB000000407
Tim Bukher, Facebook Monitoring User Chats, Reporting to Police, LawTechie (July 13, 2012), available at http://www.lawtechie.com/2012/07/facebook-monitoring-user-chats-reporting-to-police/.
The author writes that “[a]ccording to a report via Mashable, Facebook does more than passively scan user profile settings for targeted advertising, it also monitors chats between users for potential criminal activity.”
77
FB000000409
Fox 13 Tampa Bay Staff, Facebook Uses Technology to Spy on Private Chats, My Fox - Tampa Bay (July 13, 2012), available at http://www.myfoxtampabay.com/story/19017765/2012/07/13/facebook-uses-technology-to-spy-on-private-chats/.
This article reports that “Facebook’s chief security officer admits Facebook users are being monitored for any
suspected criminal activity, and it’s not just the stuff you post on timelines.” It notes Facebook uses “smart software”
to “monitor[ ] personal chats” and “scans those chats for certain phrases, exchanges of personal information and vulgar
language.” The article adds that “Facebook says the technology has a very low false-positive rate to protect its users’
privacy, but as expected there has been a backlash from users” who “feel their private conversations are being
violated.” The article acknowledges that Facebook’s technology “helped net an alleged sexual predator” and reports
that the FBI is “on board with this technology and hopes more online sites use it.”
EXHIBIT F
APP. 61 – APP. 91