Tag Archives: Facebook

UK investigates use of personal data in political campaigns

Cables and computers are seen inside a data centre at an office in the heart of the financial district in London, Britain May 15, 2017. REUTERS/Dylan Martinez

May 2017

LONDON (Reuters) – Britain said it was investigating how politicians and campaigners use data to target voters with online advertising to make sure they comply with electoral laws and do not abuse people’s privacy.

The inquiry coincides with campaigning for a national election next month, although the senior official in charge of the review said the timing was unrelated.

Advertising on platforms such as Facebook to relatively small numbers of voters – selected according to their opinions, attitudes and interests – played a decisive role in last year’s EU referendum in Britain and the U.S. presidential election, according to the companies involved.

Britain’s Information Commissioner’s Office (ICO), which is responsible for regulating how companies use data, said it was understandable that political campaigns were exploring the potential of advanced data analysis to help win votes, but they had to comply with strict laws.

“This is a complex and rapidly evolving area of activity and the level of awareness among the public about how data analytics works, and how their personal data is collected, shared and used through such tools, is low,” Information Commissioner Elizabeth Denham said.

The investigation will look into the use of targeted online advertising in the run-up to Britain’s EU referendum last year, and potentially in other campaigns, said the ICO, which can issue fines of up to 500,000 pounds and instigate criminal prosecutions.

Denham said it was clear that data analytics tools had a significant potential impact on individuals’ privacy.

“It is important that there is greater and genuine transparency about the use of such techniques to ensure that people have control over their own data and the law is upheld,” she said.

Denham said she was aware that the investigation came amid a general election campaign, but this was not the trigger for the probe.

“I would nonetheless remind all relevant organisations of the need to comply with the law,” she said.

Copyright Reuters 2017

(Reporting by Paul Sandle; Editing by Michael Holden and Toby Chopra)

 ~~~

Facts and Opinions is a boutique journal of reporting and analysis in words and images, without borders. Independent, non-partisan and employee-owned, F&O is funded only by you, our readers. We are ad-free and spam-free, and do not solicit donations from partisan organizations. To continue we need a minimum payment of $.27 for one story, or a sustaining donation. Visit our Subscribe page for details and payment options, or donate below. With enough supporters each paying a small amount, we will continue, and increase our original works like this.

F&O’s CONTENTS page is updated each Saturday. Sign up for emailed announcements of new work on our free FRONTLINES blog; find evidence-based reporting in Reports; commentary, analysis and creative non-fiction in OPINION-FEATURES; and image galleries in PHOTO-ESSAYS. If you value journalism please support F&O, and tell others about us.


Facebook Feels Heat of Controversies

Sheryl Sandberg, Chief Operating Officer of Facebook attends a session during the annual meeting of the World Economic Forum in Davos, Switzerland January 20, 2016. REUTERS/Ruben Sprich/File Photo

By Kristina Cooke, Dan Levine and Dustin Volz 
Fall 2016

SAN FRANCISCO/WASHINGTON (Reuters) – After Facebook’s removal of an iconic Vietnam War photo stirred an international uproar in September, the social network’s executives quickly backtracked and cleared its publication.

But the image – showing a naked Vietnamese girl burned by napalm – had previously been used in training sessions as an example of a post that should be removed, two former Facebook employees told Reuters.

Trainers told content-monitoring staffers that the photo violated Facebook policy, despite its historical significance, because it depicted a naked child, in distress, photographed without her consent, the employees told Reuters.

The social network has taken great pains to craft rules that can be applied uniformly with minimal discretion. The reversal on the war photo, however, shows how Facebook’s top executives sometimes overrule company policy and its legions of low- and mid-level content monitors.

Facebook has often insisted that it is a technology company – not a media company – but an elite group of at least five senior executives regularly directs content policy and makes editorial judgment calls, particularly in high-profile controversies, eight current and former Facebook executives told Reuters.

One of those key decision-makers – Justin Osofsky, who runs the community operations division – wrote a Facebook post acknowledging that the removal of the war photo was a “mistake.”

“Sometimes,” he wrote, “the global and historical significance of a photo like ‘Terror of War’ outweighs the importance of keeping nudity off Facebook.”


Facebook spokeswoman Christine Chen declined to comment on the company’s use of the photo in training sessions.

Facebook has long resisted calls to publicly detail its policies and practices on censoring postings. That approach has drawn criticism from users who have had content removed and from free-speech advocates, who cite a lack of transparency and the absence of an appeals process for many content decisions.

At the same time, some governments and anti-terror groups are pressuring the company to remove more posts they consider offensive or dangerous.

HIGH-LEVEL REVIEW

Monika Bickert, Facebook’s head of global policy management, is interviewed by Reuters in Washington DC February 2, 2016. REUTERS/Gary Cameron/File Photo

The current and former Facebook executives, most of them speaking on condition of anonymity, told Reuters in detail how complaints move through the company’s content-policing apparatus. The toughest calls, they said, rise to an elite group of executives.

Another of the key decision-makers is Global Policy Chief Monika Bickert, who helped rule on the fracas over the war photo.

“That was one we took a hard look at, and we decided it definitely belonged on the site,” said Bickert, a former federal prosecutor.

She declined to elaborate on the decision-making process.

Facebook chief operating officer Sheryl Sandberg followed up with an apology to Norwegian Prime Minister Erna Solberg, who had posted the photo on her own account after Facebook removed it from others in her country.

In addition to Sandberg, Osofsky and Bickert, executives involved in sensitive content issues include Joel Kaplan, Facebook’s Washington-based government relations chief; and Elliot Schrage, the vice president for public policy and communications.

All five studied at Harvard, and four of them have both undergraduate and graduate degrees from the elite institution. All but Sandberg hold law degrees. Three of the executives have longstanding personal ties to Sandberg.

Chief Executive Mark Zuckerberg, a Harvard drop-out, occasionally gets involved with content controversies, Bickert said.

These executives also weigh in on content policy changes meant to reflect shifting social context and political sensitivities around the world, current and former executives said.

Facebook officials said the five people identified by Reuters were not the only ones involved in high-level content decisions.

“Facebook has a broad, diverse and global network involved in content policy and enforcement, with different managers and senior executives being pulled in depending on the region and the issue at hand,” Chen said.

Chen declined to name any other executives who were involved in content policy.

A WAR OVER FREE EXPRESSION

The company’s reluctance to explain censorship decisions has drawn criticism in many countries.

Last month, Facebook disabled the accounts of editors at two of the most widely read Palestinian online publications, Shehab News Agency and Quds. In keeping with standard company practice, Facebook didn’t publicly offer a reason for the action or pinpoint any content it considered inappropriate.

The company told Reuters that the removal was simply an error.

Some Palestinian advocacy groups and media outlets condemned the shutdowns as censorship stemming from what they described as Facebook’s improper alliance with the Israeli government.

Israel’s government has pushed Facebook to block hundreds of pages it believes incite violence against Jews, said Noam Sela, spokesman for Israeli cabinet Minister Gilad Erdan.

Sela said the Israeli government “had a connection” at Facebook to handle complaints but declined to elaborate on the relationship.

“It’s not working as well as we would like,” Sela said. “We have more work to do to get Facebook to remove these pages.”

Ezz al-Din al-Akhras, a Quds supervisor, said that Facebook’s head of policy in the Middle East had gotten in touch after the uproar over the shutdowns and that three of four suspended accounts were restored.

“We hope the Facebook campaign of suspending and removing Palestinian accounts will stop,” he said. “We do not practice incitement; we are only conveying news from Palestine to the world.”

Facebook said the restoration of the accounts was not a response to complaints. It declined to comment on whether top executives were involved.

The company has cited technological glitches in other recent cases where content was removed, then restored, including the takedown of a video that showed the aftermath of a Minneapolis police shooting.

Chen declined to explain the glitch.

She said the company was reviewing its appeals process in response to public feedback. Facebook currently allows appeals of actions taken against entire profiles or pages set up by people or institutions, but not against individual posts.

THICK RULEBOOK

To manage the huge volume of content complaints – more than a million a day – the company employs a multi-layered system. It starts with automated routing of complaints to content-policing teams in Dublin, Hyderabad, Austin and Menlo Park, who make initial rulings, current and former executives said.

These low-level staffers and contractors consult a thick rulebook that interprets the comparatively spare “community standards” that Facebook customers are asked to follow. The company trains front-line monitors to follow rules and use as little discretion as possible.

When a removal sparks more complaints, regional managers function as a mid-level appeals court. Continuing controversy could then push the issue to top U.S. executives.
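As a rough illustration of that escalation path, here is a minimal sketch in TypeScript. The type names, hub list and thresholds are hypothetical; Facebook’s internal systems are not public.

```typescript
// A minimal sketch of the tiered review flow described above.
// Names and thresholds are hypothetical illustrations only.

type Verdict = "remove" | "keep";

interface Complaint {
  postId: string;
  hub: "Dublin" | "Hyderabad" | "Austin" | "Menlo Park";
  followUpComplaints: number; // complaints filed after the initial ruling
}

// Tier 1: front-line monitors apply the rulebook with minimal discretion.
function frontLineRuling(violatesRulebook: boolean): Verdict {
  return violatesRulebook ? "remove" : "keep";
}

// Tiers 2 and 3: repeated complaints escalate the call upward.
function escalate(c: Complaint): string {
  if (c.followUpComplaints === 0) return "tier-1 ruling stands";
  if (c.followUpComplaints < 100) return "review by regional manager"; // arbitrary cutoff
  return "review by senior policy executives";
}
```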

Senior executives also weigh in on policy updates. Osofsky and Kaplan, for instance, wrote a blog post last week, in response to “continued feedback” on content removals, explaining that the company would start weighing news value more heavily in deciding whether to block content.

In an earlier post, responding to the napalm-girl controversy, Osofsky said Facebook’s policies usually work well, but not always.

“In many cases, there’s no clear line between an image of nudity or violence that carries global and historic significance and one that doesn’t,” Osofsky wrote.

The Vietnam War photo – depicting horrors suffered by a girl named Phan Thi Kim Phuc – was first removed from an account in Norway by a front-line monitor.

In protest, the Norwegian newspaper Aftenposten printed the image on its front page and posted it on Facebook, which removed it. That prompted the prime minister to post the photo – only to have Facebook remove it again.

Facebook then issued a statement defending the action, saying it was “difficult to create a distinction between allowing a photograph of a nude child in one instance and not others.”

The next day, executives reversed the call, with Sandberg telling the prime minister: “Even with clear standards, screening millions of posts on a case-by-case basis every week is challenging.”

Copyright Reuters 2016

(Additional reporting by Yasmeen Abutaleb and Joseph Menn in San Francisco, Nidal al-Mughrabi in Gaza and Terje Solsvik in Oslo; Editing by Jonathan Weber and Brian Thevenot)


 ~~~


Facebook Lets Advertisers Exclude Users by Race

By Julia Angwin and Terry Parris Jr., ProPublica
Oct. 28, 2016

Imagine if, during America’s Jim Crow era, a newspaper offered advertisers the option of placing ads only in copies that went to white readers. That’s basically what Facebook is doing nowadays. The ubiquitous social network not only allows advertisers to target users by their interests or background, it also gives advertisers the ability to exclude specific groups it calls “Ethnic Affinities.” Ads that exclude people based on race, gender and other sensitive factors are prohibited by U.S. federal law in housing and employment.

Here is a screenshot of a housing ad that we purchased from Facebook’s self-service advertising portal:

ProPublica

ProPublica

The ad we purchased was targeted to Facebook members who were house hunting and excluded anyone with an “affinity” for African-American, Asian-American or Hispanic people. (Here’s the ad itself.)

When we showed Facebook’s racial exclusion options to prominent civil rights lawyer John Relman, he gasped and said, “This is horrifying. This is massively illegal. This is about as blatant a violation of the federal Fair Housing Act as one can find.”

The Fair Housing Act of 1968 makes it illegal “to make, print, or publish, or cause to be made, printed, or published any notice, statement, or advertisement, with respect to the sale or rental of a dwelling that indicates any preference, limitation, or discrimination based on race, color, religion, sex, handicap, familial status, or national origin.” Violators can face tens of thousands of dollars in fines.

The Civil Rights Act of 1964 also prohibits the “printing or publication of notices or advertisements indicating prohibited preference, limitation, specification or discrimination” in employment recruitment.

Facebook’s business model is based on allowing advertisers to target specific groups — or, apparently, to exclude specific groups — using huge reams of personal data the company has collected about its users. Facebook’s microtargeting is particularly helpful for advertisers looking to reach niche audiences, such as swing-state voters concerned about climate change.
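To make the mechanism concrete, here is a hypothetical sketch of what a targeting spec with inclusion and exclusion segments might look like. The field and segment names are invented for illustration; this is not Facebook’s actual ads API.

```typescript
// Hypothetical sketch of an ad-targeting spec with include/exclude
// audience segments, illustrating the mechanism described above.

interface TargetingSpec {
  includeInterests: string[];  // segments the ad should reach
  excludeAffinities: string[]; // segments the ad must never reach
}

const housingAd: TargetingSpec = {
  includeInterests: ["house hunting"],
  excludeAffinities: [
    // Exclusions like these in a housing ad are what the
    // Fair Housing Act prohibits in advertising.
    "African American (US)",
    "Asian American (US)",
    "Hispanic (US)",
  ],
};

// A delivery system would skip any user whose inferred segments
// intersect the exclusion list.
function isEligible(userSegments: string[], spec: TargetingSpec): boolean {
  return !userSegments.some((s) => spec.excludeAffinities.includes(s));
}

console.log(isEligible(["house hunting", "Hispanic (US)"], housingAd)); // false
```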

ProPublica recently offered a tool allowing users to see how Facebook is categorizing them. We found nearly 50,000 unique categories in which Facebook places its users. Facebook says its policies prohibit advertisers from using the targeting options for discrimination, harassment, disparagement or predatory advertising practices.

“We take a strong stand against advertisers misusing our platform: Our policies prohibit using our targeting options to discriminate, and they require compliance with the law,” said Steve Satterfield, privacy and public policy manager at Facebook. “We take prompt enforcement action when we determine that ads violate our policies.”

Satterfield said it’s important for advertisers to have the ability to both include and exclude groups as they test how their marketing performs. For instance, he said, an advertiser “might run one campaign in English that excludes the Hispanic affinity group to see how well the campaign performs against running that ad campaign in Spanish. This is a common practice in the industry.”

He said Facebook began offering the “Ethnic Affinity” categories within the past two years as part of a “multicultural advertising” effort. Satterfield added that “Ethnic Affinity” is not the same as race — which Facebook does not ask its members about.

Facebook assigns members an “Ethnic Affinity” based on pages and posts they have liked or engaged with on Facebook. When we asked why “Ethnic Affinity” was included in the “Demographics” category of its ad-targeting tool if it’s not a representation of demographics, Facebook responded that it plans to move “Ethnic Affinity” to another section.

Facebook declined to answer questions about why our housing ad excluding minority groups was approved 15 minutes after we placed the order.

By comparison, consider the advertising controls that the New York Times has put in place to prevent discriminatory housing ads. After the newspaper was successfully sued under the Fair Housing Act in 1989, it agreed to review ads for potentially discriminatory content before accepting them for publication.

Steph Jespersen, the Times’ director of advertising acceptability, said that the company’s staff runs automated programs to make sure that ads that contain discriminatory phrases such as “whites only” and “no kids” are rejected.

The Times’ automated program also highlights ads that contain potentially discriminatory code words such as “near churches” or “close to a country club.” Humans then review those ads before they can be approved. Jespersen said the Times also rejects housing ads that contain photographs of too many white people.

The people in the ads must represent the diversity of the population of New York, and if they don’t, he says he will call up the advertiser and ask them to submit an ad with a more diverse lineup of models. But, Jespersen said, these days most advertisers know not to submit discriminatory ads: “I haven’t seen an ad with ‘whites only’ for a long time.”
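A minimal sketch of the two-stage screening Jespersen describes might look like the following. The phrase lists are only the examples quoted above, and the function names are ours, not the Times’ actual system.

```typescript
// Two-stage screening: hard-reject known discriminatory phrases,
// flag coded language for human review before approval.

const REJECT_PHRASES = ["whites only", "no kids"];
const FLAG_PHRASES = ["near churches", "close to a country club"];

type ScreenResult = "rejected" | "needs human review" | "accepted";

function screenAdCopy(copy: string): ScreenResult {
  const text = copy.toLowerCase();
  if (REJECT_PHRASES.some((p) => text.includes(p))) return "rejected";
  if (FLAG_PHRASES.some((p) => text.includes(p))) return "needs human review";
  return "accepted";
}

console.log(screenAdCopy("Sunny 2BR, close to a country club")); // "needs human review"
```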

Creative Commons

This story was reported and published by ProPublica, a Pulitzer Prize-winning investigative newsroom. Sign up for their newsletter.


 ~~~


It’s Complicated: Facebook’s History of Tracking You

By Julia Angwin, ProPublica

For years people have noticed a funny thing about Facebook’s ubiquitous Like button: it has been sending Facebook data about the sites you visit. Each time details of the tracking were revealed, Facebook promised that it wasn’t using the data for any commercial purposes.
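The mechanism is ordinary third-party embedding: when a page includes a Facebook widget, your browser fetches it from Facebook’s servers and automatically attaches any Facebook cookies, so the widget server can record which user loaded which page, click or no click. Here is a minimal, hypothetical sketch of such a widget server; the endpoint, URL shape and logging are illustrative, not Facebook’s.

```typescript
// Sketch of a widget server logging visits to pages that embed it.
// The browser requests the widget from every page it appears on and
// attaches the user's cookies automatically; no click is required.

import * as http from "node:http";

http.createServer((req, res) => {
  // The embedding page's address arrives as a query parameter
  // (and often also in the Referer header).
  const url = new URL(req.url ?? "/", "https://widgets.example.com");
  const embeddingPage = url.searchParams.get("href");

  // The user's identity rides in on the cookie set at login.
  const sessionCookie = req.headers.cookie ?? "anonymous";

  console.log(`visit: user=${sessionCookie} page=${embeddingPage}`);
  res.end("<button>Like</button>"); // the widget renders either way
}).listen(8080);
```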


Billboard on the Thomson Reuters building welcomes Facebook to Nasdaq, 2012. Photo by ProducerMatthew, Creative Commons licence

No longer. Last week, Facebook announced it will start using its Like button and similar tools to track people across the Internet for advertising purposes.

Here is the long history of the revelations and Facebook’s denials:

Facebook’s Mark Zuckerberg introduces the “transformative” Like button …

April 21, 2010 – Facebook introduces the “Like” button at its F8 developer conference. Facebook founder Mark Zuckerberg declares that it will be “the most transformative thing we’ve ever done for the Web.”

He says his goal is to encourage a Web where all products and services use people’s real identity. He suggests, in fact, that creating a personally identifiable web experience could be divine: “When you go to heaven, all of your friends are all there and everything is just the way you want it to be,” he says. “Together, let’s build a world that is that good.”

Which sends data …

Nov. 30, 2010 – Dutch researcher Arnold Roosendaal publishes a paper showing that Facebook Like buttons transmit data about users even when the user doesn’t click on the button. Facebook later says that Roosendaal found a “bug.”

even when users don’t click on it … 

May 18, 2011 – The Wall Street Journal reports that Facebook Like buttons and other widgets collect data about users even when they don’t click them. Facebook’s chief technology officer says, “we don’t use them for tracking and they’re not intended for tracking.”

Internet pioneer says log out of Facebook …

Sept. 24, 2011 – Veteran tech blogger Dave Winer writes that “Facebook is scaring me” with apps like its social reader, which can automatically share stories you read. This “kind of behavior deserves a bad name, like phishing, or spam, or cyber-stalking,” he writes. Winer recommends that users log out of Facebook to prevent being tracked on other websites.

Except logging out doesn’t work …

Sept. 25, 2011 – Australian blogger Nik Cubrilovic writes that “Logging Out of Facebook is Not Enough.” He shows that Facebook is tracking users even when they log out of the site. Facebook responds that it is fixing the issue so people won’t be tracked when they are logged out of Facebook.

Facebook says not to worry…

Sept. 27, 2011 – Facebook tells the New York Times that it doesn’t use data from Like buttons and other widgets to track users or target advertising to them, and that it deletes or anonymizes the data within 90 days.

Turns out Facebook has patented the technique …

Oct. 1, 2011 – Blogger Michael Arrington digs up a Facebook patent application for “a method … for tracking information about the activities of users of a social networking system while on another domain.” The title of his blog post: “Brutal Dishonesty.”

But, really, don’t worry …

Dec. 7, 2012 – As the Wall Street Journal finds that Facebook Like buttons and other widgets appear on two-thirds of 900 websites surveyed, the company says again it only uses data from unclicked Like buttons for security purposes and to fix bugs in its software.

OK, worry …

June 12, 2014 – Facebook tells Ad Age that it will start tracking users across the Internet using its widgets such as the Like button.

It’s a bold move. Twitter and Pinterest, which track people with their Tweet and Pin It buttons, offer users the ability to opt out. And Google has pledged it will not combine data from its ad-tracking network DoubleClick with personally identifiable data without users’ opt-in consent. Facebook does not offer an opt-out in its privacy settings.

Instead Facebook asks members to visit an ad industry page, where they can opt out from targeted advertising from Facebook and other companies. The company also says it will let people view and adjust the types of ads they see.

We contacted Facebook to ask them about their tracking habits. They didn’t respond.

Read our recent story about how online tracking is getting creepier, and a piece from our archives rounding up the best reporting on Facebook and your privacy.

Creative Commons
