
Can tech leaders be trusted to shape our future? What happens when profit and power trump ethics and accountability?

In Careless People, Sarah Wynn-Williams—Facebook’s Director of Global Public Policy from 2011 to 2017—offers an insider account of her time at the company. In this memoir and tell-all, Wynn-Williams alleges that Facebook’s top leaders were dangerously reckless people who caused great harm in the world.

Keep reading to learn more about the dark side of Facebook, as told by an insider.

Overview of Careless People

In Careless People, Sarah Wynn-Williams recounts her time as Director of Global Public Policy at Facebook, where she worked closely with Mark Zuckerberg and other executives from 2011 to 2017. Through a series of vignettes that chronicle her seven years there, she shares her critical perspective on the character and conduct of these leaders.

Wynn-Williams joined Facebook because she believed in its potential to connect people and improve the world. But over time, she came to the conclusion that Zuckerberg and company were reckless, uncaring, and corrupt. They consistently put business and personal interests before responsibility or accountability. And because of this, she says, Facebook became a source of harm in the world. It influenced the outcome of the 2016 US presidential election, enabled political corruption and genocide in Myanmar, and tailored its tech to the authoritarian interests of the Chinese Communist Party. 

Wynn-Williams is a former diplomat and lawyer from New Zealand who’s worked for the New Zealand government, the United Nations, and Facebook (now Meta). In this guide, we’ll explore her claims about Facebook’s leadership, the harm they’ve caused, and the lessons we need to learn to shape a better future for Big Tech. In our commentary, we’ll examine her claims and compare them with what others have said about the company and its leaders. We’ll also discuss her testimony to the US Congress and Meta’s response to her speaking up, as well as the state of the US-China AI race in 2025.

(Shortform note: In 2021, Facebook rebranded itself as Meta. Throughout this guide, we’ll refer to the company as “Facebook” when discussing its pre-2021 activity, such as when Wynn-Williams worked there. We’ll refer to the company as “Meta” when describing its post-2021 activity.)

Facebook’s Culture and the Leaders Who Created It

To begin, we’ll characterize how Wynn-Williams experienced Facebook’s workplace culture. Then we’ll cover what she says about three key leaders who created this culture: Mark Zuckerberg, Sheryl Sandberg, and Joel Kaplan. 

Facebook’s Culture: Hard Work and Long Hours

Wynn-Williams writes that when she began working at Facebook, the team was small, scrappy, and always pressed for time. When she joined their D.C. office, which handled Facebook’s public policy in the US, she and her colleagues regularly worked until midnight or later, and they often began working again in the early morning. The work was nonstop, and they did all they could to keep up.

The expectation at Facebook, Wynn-Williams says, was that everyone would dedicate themselves 110% to the work. Facebook wanted its employees to believe they weren’t just running a business—they were changing the world. For Wynn-Williams and her colleagues, work was their purpose. And for some, family, hobbies, and life outside work weren’t even in the picture. 

To make this intense, mission-driven culture possible, Facebook gave its employees perks, like shuttle rides to work, laundry service, childcare, and unlimited snacks in every office. Wynn-Williams explains: Leadership believed that if they took care of people’s daily basic needs, they could and would work harder and longer. 

Zuckerberg: Detached and Self-Absorbed

Wynn-Williams describes Mark Zuckerberg as a complex business leader with several fatal character flaws. She says that he began with an idealistic vision to connect the world but was ultimately preoccupied with his power and image, stubborn in his beliefs, and indifferent to the consequences of his decisions. We’ll cover each of these in the sections below.

Zuckerberg’s Self-Absorption

Wynn-Williams says that Zuckerberg wanted to be powerful and admired by world leaders and the public. To that end, she often worked to arrange or contrive meetings between Zuckerberg and heads of state. Around 2014 and 2015, as Facebook’s global influence grew, she saw him become increasingly preoccupied with how he was perceived by figures like then-President Barack Obama and Chinese President Xi Jinping. After being snubbed by both at the same global conference—Obama rebuffed him and Xi refused to meet with him—Zuckerberg became visibly upset. These were people whose approval he’d wanted, and he’d been denied it. 

Zuckerberg also wanted everyday people to admire him. Wynn-Williams speculates that this is why he so often spoke in lofty, aspirational terms at public events about Facebook’s mission to “connect the world.”

Zuckerberg’s Indifference

Wynn-Williams says that despite this attempt to cultivate a positive public image, Zuckerberg was indifferent in internal meetings to the real-world harm Facebook was causing. For example, she notes that when Facebook came under fire for the worsening mental health of young users, Zuckerberg refused to scale back the company’s data-driven marketing tools. These tools allowed advertisers to target teenagers when they were most emotionally vulnerable—serving them, for instance, ads for weight loss products when they expressed insecurity about their appearance—but Wynn-Williams says Zuckerberg didn’t care about these negative effects.  

Zuckerberg’s Stubbornness

As for Zuckerberg’s conduct within the company, Wynn-Williams writes that he was a deeply stubborn leader who fiercely believed in his own ideas. Though he’d sometimes listen to feedback, he more often insisted on pushing through his preferred decisions—a habit that Wynn-Williams says often caused problems. 

For instance, he fought hard to keep the name Internet.org on an initiative meant to deliver free, limited internet access in parts of the Global South. Wynn-Williams recounts advising him to change it because the name was deceptive—it purported to be real internet service but was just a handful of Facebook-made web apps—and multiple countries had bristled at it. It took the loss of Brazil’s support for the program for Zuckerberg to concede that a name change was necessary (it became Free Basics). But by that time, the damage was already done: Key negotiations had failed, and the product’s reputation was irreparably harmed.

Zuckerberg’s Disconnection

Perhaps what enabled Zuckerberg’s fixation on being liked and his stubbornness as a leader, Wynn-Williams says, was the distance that his extraordinary wealth created between him and the ordinary world. She argues that his wealth made him out of touch: His power and privilege meant he didn’t have to reckon with the consequences of his choices. Over time, Zuckerberg came to care less and less—to be more interested in seeing what he could do, just because he could. According to Wynn-Williams, this was deliberate: He chose to ignore the harm he caused and to pursue his self-interested ends.

Sandberg: Ruthless and Hypocritical

Wynn-Williams explains that Zuckerberg led the engineering and product side of Facebook, while Sheryl Sandberg’s team handled public relations, policy, and politics. For the most part, Wynn-Williams worked under Sandberg. 

Wynn-Williams portrays Sandberg as a charismatic and intelligent woman who was also a coldly calculating and self-interested hypocrite. She alleges that Sandberg’s outward persona—author of Lean In and champion for women’s empowerment—hid who she really was. Behind the scenes, Sandberg ruthlessly played politics, expecting loyalty, obedience, and submission from her team members, many of whom were women. 

Sandberg’s Temper

For one, Wynn-Williams writes, Sandberg had a fierce temper and would take it out on whoever was nearby, even if they hadn’t caused the issue that had upset her. After experiencing this several times, Wynn-Williams and others began keeping problems quiet to avoid drawing Sandberg’s ire and jeopardizing their careers.

Sandberg’s Hypocrisy

Beyond how she treated her employees, Sandberg failed to live up to the ideals of women’s empowerment that she supposedly stood for. This became clear to Wynn-Williams after she discovered the “Feminist Fight Club” (FFC), a secret online Facebook group for women at the company who’d experienced harassment or wanted to support others who had. 

Harassment was a systemic issue at Facebook, Wynn-Williams says. But Sandberg, who believed in individual responsibility, didn’t advocate for systemic solutions that would address the issue. Even when the FFC group broached the issue with Facebook’s leadership, Sandberg didn’t support them. Rather, she tacitly approved of her male counterparts who’d come under fire (like Joel Kaplan, who we’ll discuss in the next section) by remaining silent on this issue.

Wynn-Williams also alleges that Sandberg behaved inappropriately toward some of her female employees. In one instance, Sandberg pressured Wynn-Williams to join her in bed during a trip on a private jet. When Wynn-Williams refused and stood her ground, Sandberg had a young assistant join her instead. This wasn’t uncommon, Wynn-Williams says, and after her refusal, she lost Sandberg’s favor.

Sandberg’s Politicking

Wynn-Williams also writes that Sandberg was a ruthless political actor who played a large role in deflecting criticism from Facebook and managing its reputation. For instance, when Facebook was criticized for doing too little to combat misinformation and its political consequences in Myanmar (an issue we’ll discuss later), Sandberg downplayed Facebook’s responsibility for the problem and exaggerated how much they were doing to address it.

Kaplan: Biased and Badly Behaved

A few years into Wynn-Williams’ work at Facebook, Joel Kaplan became her manager. Wynn-Williams says Kaplan was a savvy and well-connected political operative who’d been an adviser to US President George W. Bush and maintained ties to the Republican Party. He wasn’t as publicly visible as Zuckerberg or Sandberg, but he had major influence over Facebook’s approach to politics, policy, and content moderation.

According to Wynn-Williams, Kaplan ensured that Facebook didn’t moderate conservative content, however extreme or hateful it got. She says this enabled the spread of misinformation and disinformation on the platform. She also alleges that Kaplan behaved inappropriately toward her on multiple occasions. 

Kaplan’s Conservative Bias

Wynn-Williams writes that Kaplan often pushed for internal decisions that would benefit right-wing content and personalities. For instance, when Facebook was facing scrutiny from conservative lawmakers because of its apparent liberal bias, Kaplan ensured that Facebook didn’t censor conservative pages that had begun spreading hate speech and disinformation. Wynn-Williams makes no judgment on his politics per se, but she does suggest that his influence—especially around the 2016 US election—was a big part of why Facebook did so little to combat political disinformation.

Kaplan’s Dismissiveness

According to Wynn-Williams, Kaplan also played a key role in Facebook’s morally fraught dealings with China and Myanmar (which we’ll cover in more detail later). In both cases, he ignored or dismissed red flags, such as the violent rhetoric spreading on Facebook in Myanmar and the ethical concerns raised about building surveillance tools for China. Wynn-Williams says that his actions revealed his beliefs: Business growth and profit mattered more to him than ethics or human rights.

Kaplan’s Harassment of Wynn-Williams

Beyond his power as a key decision-maker for Facebook’s political strategy, Kaplan allegedly used his position to sexually harass Wynn-Williams. She describes multiple occasions in which he treated her inappropriately, such as making suggestive comments about her body after childbirth and drunkenly grinding on her at a Facebook offsite event. These weren’t isolated incidents, she argues, but a consistent part of his character. 

When Wynn-Williams eventually filed a formal complaint against him, the company launched an investigation. However, instead of holding Kaplan accountable, Wynn-Williams says Facebook’s investigation team cleared him before she had a chance to provide all her evidence (such as inappropriate emails). Kaplan had powerful allies, including Elliot Schrage (Facebook’s VP of public policy), who allegedly used his influence to ensure Kaplan wasn’t held accountable. Instead, shortly after she filed her complaint, Schrage fired Wynn-Williams on what she says were trumped-up grounds—she was told she’d been underperforming.

Facebook’s Harmful Impact

In the previous section, we explained how Wynn-Williams characterizes Facebook’s leaders during her time at the company. Next, we’ll explore the key events that she says happened as a result of their recklessness. These include Facebook’s role in genocide and political instability in Myanmar, its role in the 2016 US presidential election, and its collaboration with the Chinese Communist Party.

Facebook Enabled Violence in Myanmar

Wynn-Williams writes that in the mid-2010s, Facebook neglected to address hateful political rhetoric spreading on the platform in Myanmar. She says that this negligence led directly to real-world consequences: a violent campaign against the country’s Muslim Rohingya minority that the UN later called genocide. For years, Wynn-Williams and others repeatedly warned key decision-makers, but these leaders chose not to act—not because they didn’t know, but because they didn’t care.

Facebook’s unique role in Myanmar is central to understanding how this happened. Myanmar, formerly Burma, skipped over desktop computing and went straight to smartphones. Further, Facebook was so ubiquitous that most people thought it was the internet, or at least the main way to go online. This happened because Facebook had struck deals with telecom providers to preload the app on mobile phones and exempt it from data charges. All in all, the platform had unprecedented influence over how people in Myanmar connected, communicated, and shared information.

In 2015, Myanmar was preparing to hold its first democratic election in decades. But the military junta, reluctant to give up power, used Facebook to stir up chaos in the country and delay the process. Wynn-Williams’s team found that covert groups employed by the junta spread anti-Rohingya hate speech and nationalist propaganda on Facebook. Her team documented numerous inauthentic accounts being used to stoke division, impersonate influencers, and coordinate harassment. Meanwhile, peace activists and other good-faith actors were silenced or suppressed on the platform.

What was at first just online hate speech led to mobs rioting and burning mosques, shops, and the homes of Rohingya Muslims, according to Wynn-Williams. Despite this, Facebook’s leadership chose not to act. Further, they had only a single Burmese-speaking contractor on staff at the time—one person tasked with moderating Facebook in a country with tens of millions of users. Wynn-Williams managed to get a second Burmese speaker hired, but this was too little, too late. 

In 2017, after Facebook’s leadership had ignored and dismissed the problems in Myanmar for years, the country’s junta launched a series of attacks against the Rohingya. Over 10,000 people were killed, thousands of women and girls were raped, villages were burned, and more than 700,000 Rohingya fled the country. The United Nations later concluded that Facebook had failed to moderate the hate speech and misinformation that led to this violence.

Facebook’s Role in the 2016 US Presidential Election

Wynn-Williams also recounts how Facebook contributed to Donald Trump’s victory in the 2016 US presidential election. She writes that Trump’s team used Facebook’s powerful advertising tools to spread inflammatory and often counterfactual messaging on the platform—an approach which she views as unethical. Facebook leadership knew about this: They were selling campaign services (like help using their ad tools) to multiple candidates, including Trump and other Republicans. But they only realized they’d helped Trump win after the fact, according to Wynn-Williams, and by then it was too late.

How did Facebook’s advertising tools contribute to Trump’s 2016 win, according to Wynn-Williams? Facebook ads allowed Trump’s campaign to target voters based on precise demographic data: age, gender, race, location (down to the ZIP code), political affiliation, income level, hobbies, and even emotional states. For example, Wynn-Williams describes a tactic in which Trump’s team targeted middle-aged Black men in cities like Philadelphia, who were likely to vote Democratic but had reservations about Hillary Clinton. The Trump team then sent these voters clips of her 1996 “superpredators” comment, widely seen as racist, to exploit those doubts and discourage voter turnout. 

This worked so well because Facebook’s algorithm was optimized for engagement, and outrage strongly engages people. The more people Trump’s team could anger or rile up, the more people they could get to vote (or not to vote). Wynn-Williams says that Facebook leadership didn’t take this issue seriously because they saw a Trump victory as impossible. But in the meantime, he was a great customer for Facebook—the business profited massively from his campaign.

Wynn-Williams contends that Facebook made no meaningful effort to curb the spread of Trump’s political disinformation. Zuckerberg and others downplayed the problem, despite internal warnings from employees and external pressure from researchers and journalists concerned about political manipulation. For his part, Kaplan made sure Facebook took no action that might be seen as anti-conservative, even when content crossed into hate speech or blatant falsehood.

After Trump’s victory, Facebook came under increasing scrutiny. But Zuckerberg needed to be convinced that Facebook had even played a role—Wynn-Williams says he was dismissive of that idea until Schrage (Facebook’s public policy VP and Wynn-Williams’ superior) explained to him in detail what had happened. Even then, Facebook’s leadership did little to address the platform’s friendliness to misleading or counterfactual content.

Facebook Collaborated with the Chinese Communist Party

Lastly, Wynn-Williams alleges that Facebook willingly cooperated with the demands of the Chinese Communist Party (CCP) in hopes of being allowed to operate in China. She says leadership was unbothered by the CCP’s authoritarianism, and she alleges that Zuckerberg lied to the US Congress about Facebook’s involvement with them. 

According to Wynn-Williams, Facebook spent multiple years working with CCP officials in an attempt to gain access to the largest market left for it to reach. Zuckerberg championed this initiative, outlining via internal communications in 2014 his intent and a three-year plan. To him, it was the most important move Facebook could make to keep growing (which was his main goal).

To operate in China, Facebook would have to abide by CCP regulations on media companies. These regulations would give the CCP the ability to censor speech on the platform, access user data and local servers, and broadly surveil citizens.

Through 2015 and 2016, Facebook worked with Chinese officials to develop technologies that would maintain what the CCP called “safe and secure social order” on the platform. Wynn-Williams, who wasn’t part of the China team until later on, recounts reading internal reports detailing Facebook’s engineering of special tools for moderation and censorship. They included mechanisms for keyword blocking, a viral post flagger, and a master switch that could wipe out viral content during politically sensitive times (like the anniversary of the Tiananmen Square incident).

Facebook also agreed to store Chinese users’ data in China—something it had refused to do for other governments like Russia or Brazil. According to Wynn-Williams, Facebook leaders knew that the CCP would access the servers and gather data. They also acknowledged that the Facebook employees building the censorship tools could be implicated in political violence if the CCP used them for ill.

In the end, Facebook didn’t succeed in gaining access to China. All the same, Wynn-Williams says, the episode demonstrated clearly that Facebook’s leaders were willing to ignore the ethical dilemma of working with an authoritarian regime if it meant they’d profit. In addition, Zuckerberg allegedly lied about it to the US Congress: When asked whether the platform would comply with CCP regulations, Zuckerberg said that “no decisions had been made.” By then, four years had passed since Facebook began working to enter China, during which it had done all of the above.

Averting a More Reckless Future

So far, we’ve laid out Wynn-Williams’ allegations that Facebook leaders were reckless, unaccountable, and ethically compromised in character and conduct, and that they caused major harm in the world. Next, we’ll look at her main takeaway.

Put simply, Wynn-Williams argues that Meta’s recklessness continues to go unchecked—that Zuckerberg and company haven’t changed at all. She adds that this could be disastrous for the AI race between the US and China, in which both countries want to develop superior AI for military and economic use.

In 2024, Meta’s leadership chose to open-source their AI models, or make them publicly available to license and build upon. In doing this, Wynn-Williams says, they’ve enabled Chinese tech firms (like DeepSeek) to compete with formerly dominant Western AI companies (like OpenAI) and potentially gain an edge.

Why is this recklessness a problem? Because if China wins the AI race, Wynn-Williams writes, the CCP will dominate the next generation of tech and use it for authoritarian ends rather than democratic ones. AI would allow them to go far beyond the tools Facebook was building for them and create tech for censorship, surveillance, and control with unprecedented power and precision.

Finally, Wynn-Williams argues that if we want to ensure that the CCP doesn’t win the AI race, we need responsible leaders and stronger regulations that can keep people like Zuckerberg, Sandberg, and Kaplan from doing more harm.

Exercise: Reflect on Leadership and Responsibility

Wynn-Williams raises questions about leadership accountability and ethical responsibility in the tech industry. This exercise will help you explore these themes and consider their broader implications.

  • Think of a tech product or service you use regularly. What potential negative consequences might arise from its misuse or lack of oversight?
  • If you worked for the company that made that product, how would you want leadership to respond to worries about potential misuses or harm the product could cause?
  • Facebook’s leaders allegedly prioritized growth and profit over ethical concerns. In your professional life, have you ever experienced or witnessed tension between business goals and ethical responsibilities? If so, how did you or others handle it?
  • Wynn-Williams suggests that extreme wealth and power can disconnect leaders from the real-world impact of their decisions. What specific accountability measures or structures do you think could help prevent this kind of disconnection in large tech companies?
Careless People by Sarah Wynn-Williams—Book Overview

Hannah Aster

